I'm using Spark 0.8.0 and Shark 0.8.0. After creating a cached table in memory with Shark, the Spark UI reports the storage level as 'Disk Memory Deserialized 1x Replicated'. I was under the impression that Memory Only is the default storage level in Spark. Did that change in 0.8.0? And how can I change the storage level in Spark?
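For context, this is how I would set the storage level on a plain RDD in Spark 0.8 (Scala sketch; `sc` is the SparkContext and the HDFS path is just a placeholder). What I can't figure out is how to do the equivalent for a table cached through Shark:

```scala
import org.apache.spark.storage.StorageLevel

// Illustrative RDD; any input source would do here
val rdd = sc.textFile("hdfs://host:port/path/to/data")

// Explicitly request memory-only storage instead of relying on the default
rdd.persist(StorageLevel.MEMORY_ONLY)
```

My assumption was that `rdd.cache()` is shorthand for `persist(StorageLevel.MEMORY_ONLY)`, which is why the 'Disk Memory' level shown for the Shark table surprised me.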