read snappy compressed files in spark


read snappy compressed files in spark

Ricky
I want to be able to read Snappy-compressed files in Spark. I can do a
val df = spark.read.textFile("hdfs:// path")
and it passes that test in spark-shell, but beyond that, when I do a df.show(10, false) or something, it shows me binary data mixed with real text. How do I read the decompressed file in Spark? I can build a DataFrame reader if someone guides or nudges me in the right direction ...
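One thing worth noting: Spark delegates decompression to Hadoop's codec machinery, which picks a codec by file extension. Files ending in .snappy should be decompressed transparently by Hadoop's SnappyCodec; files compressed with Snappy but stored without that extension will be read as raw bytes, which matches the "binary mixed with text" symptom. A minimal sketch, assuming the files carry the .snappy extension and the path shown is hypothetical:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("read-snappy")
  .getOrCreate()

// textFile resolves the compression codec from the file extension,
// so a *.snappy suffix triggers Hadoop's SnappyCodec transparently.
// Path is a placeholder; substitute your actual HDFS location.
val df = spark.read.textFile("hdfs:///data/input/*.snappy")
df.show(10, false)
```

If the files lack the extension, renaming them (or reading through a custom Hadoop InputFormat that forces the codec) is usually simpler than writing a new DataFrame reader.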





Re: read snappy compressed files in spark

yujhe.li
What's your Spark version?
Have you added the Hadoop native library to your path? For example,
"spark.executor.extraJavaOptions -Djava.library.path=/hadoop-native/" in
spark-defaults.conf.
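A sketch of the relevant spark-defaults.conf entries, assuming the native libraries live under /hadoop-native/ (the path is a placeholder). Setting the driver option as well matters when testing in spark-shell, since the driver also reads splits locally:

```
spark.executor.extraJavaOptions  -Djava.library.path=/hadoop-native/
spark.driver.extraJavaOptions    -Djava.library.path=/hadoop-native/
```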



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
