Default Storage Level in Spark


Mskh
Hi,

I'm using Spark 0.8.0 and Shark 0.8.0. After creating a cached table in memory with Shark, the Spark UI shows the storage level as 'Disk Memory Deserialized 1x Replicated'. I was under the impression that memory-only is the default storage level in Spark. Did that change in 0.8.0? And how can I change the storage level in Spark?
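For comparison, this is roughly what I would expect to do at the RDD level in the Scala API to get a memory-only level (a minimal sketch, assuming an existing SparkContext `sc` and a hypothetical input path; package names as of Spark 0.8.0):

```scala
import org.apache.spark.storage.StorageLevel

// Assume `sc` is an already-constructed org.apache.spark.SparkContext.
// The path below is purely illustrative.
val lines = sc.textFile("hdfs://namenode/data/example.txt")

// Request deserialized in-memory storage with no disk spill and no
// replication, instead of the memory-and-disk level reported in the UI.
lines.persist(StorageLevel.MEMORY_ONLY)
```

But I don't see where Shark exposes this choice for a cached table, which is why I'm asking.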

Thanks
Majd