Re: Spark not working with Hadoop 4mc compression

2018-12-20 Thread Jiaan Geng
I think com.hadoop.compression.lzo.LzoCodec is not on Spark's classpath. Please put a suitable hadoop-lzo.jar into the directory $SPARK_HOME/jars/.
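
As a quick way to verify the jar is actually visible, the codec class named in the error can be resolved from spark-shell. This is only an illustrative sketch, assuming the jar was copied into $SPARK_HOME/jars/ (or passed with --jars) and the shell was restarted:

  // Scala, in spark-shell: throws ClassNotFoundException if
  // hadoop-lzo.jar is still missing from the classpath.
  Class.forName("com.hadoop.compression.lzo.LzoCodec")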

Spark not working with Hadoop 4mc compression

2018-12-19 Thread Abhijeet Kumar
Hello, I’m using 4mc compression (https://github.com/carlomedas/4mc) in my Hadoop setup, and reading a file from HDFS throws an error. I’m running a simple query: sc.textFile("/store.csv").getNumPartitions

Error: java.lang.RuntimeException: Error in configuring object at
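
For reference, this "Error in configuring object" usually means that a codec class listed in the Hadoop configuration could not be loaded. A small diagnostic sketch from spark-shell (Scala), assuming nothing beyond the setup described above:

  // Scala, in spark-shell: print the codec classes Hadoop is configured
  // with; every class listed here must also be on Spark's classpath.
  println(sc.hadoopConfiguration.get("io.compression.codecs"))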