Re: Error using spark.driver.userClassPathFirst=true

2015-09-03 Thread Akhil Das
It's messing up your classpath; a similar issue was discussed here previously:
https://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/spark-on-yarn-java-lang-UnsatisfiedLinkError-NativeCodeLoader/td-p/22724
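
The gist, as far as I recall: with userClassPathFirst=true the driver and
executors resolve classes from your jar before Spark's own, so if the shaded
jar bundles snappy-java you can end up with a second copy of the Snappy
classes whose native binding fails, which matches the UnsatisfiedLinkError
in your trace. As a quick check (my own sketch, not from that thread; the
class name is made up), you can print which jar the Snappy classes are
actually loaded from, e.g. by running these two lines at the top of the
job's main():

import java.net.URL;

public class SnappyOrigin {
    public static void main(String[] args) throws ClassNotFoundException {
        // Which jar do the Snappy classes come from at runtime?
        Class<?> snappy = Class.forName("org.xerial.snappy.Snappy");
        URL where = snappy.getProtectionDomain().getCodeSource().getLocation();
        // With userClassPathFirst=true this typically points at the shaded
        // user jar; without it, at the cluster's own snappy-java jar.
        System.out.println("snappy-java loaded from: " + where);
    }
}

If it points at your jar, excluding snappy-java from the shade step usually
avoids the error while keeping userClassPathFirst for the classes you need.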

Thanks
Best Regards

On Tue, Sep 1, 2015 at 4:58 PM, cgalan wrote:


Error using spark.driver.userClassPathFirst=true

2015-09-01 Thread cgalan
Hi,

When I submit a Spark job in "yarn-cluster" mode with the parameter
"spark.driver.userClassPathFirst" set to true, the job fails; without this
parameter it completes successfully. My environment is a cluster of CDH 5.4
nodes with Spark 1.3.0.

spark-submit invocation that fails:
spark-submit --class Main --master yarn-cluster \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  Main.jar

spark-submit invocation that succeeds:
spark-submit --class Main --master yarn-cluster Main.jar

Error with Snappy:
java.lang.UnsatisfiedLinkError: org.xerial.snappy.SnappyNative.maxCompressedLength(I)I
    at org.xerial.snappy.SnappyNative.maxCompressedLength(Native Method)
    at org.xerial.snappy.Snappy.maxCompressedLength(Snappy.java:316)
    at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
    at org.apache.spark.io.SnappyCompressionCodec.compressedOutputStream(CompressionCodec.scala:157)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$4.apply(TorrentBroadcast.scala:199)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$4.apply(TorrentBroadcast.scala:199)
    at scala.Option.map(Option.scala:145)
    at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:199)
    at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:101)
    at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:84)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
    at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1051)
    at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:761)
    at org.apache.spark.SparkContext.textFile(SparkContext.scala:589)

A minimal example that reproduces the error:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class Main {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("Prueba");
        // conf.setMaster("local"); // keep this commented out when launching on the cluster

        JavaSparkContext sc = new JavaSparkContext(conf);
        // textFile() broadcasts the Hadoop configuration, which is what
        // triggers the Snappy compression path in the stack trace above
        JavaRDD<String> asd = sc.textFile("text.txt");
        asd.count();

        sc.close();
    }
}

Does anyone have any suggestions? The reason I need
"spark.driver.userClassPathFirst=true" is that my project uses
commons-cli 1.3.1, while the Spark classpath carries an older version. I
built a shaded jar, but without "spark.driver.userClassPathFirst=true" I
still get dependency conflicts between my jar and the Spark classpath: at
runtime a class resolves against the older version from the classpath
instead of the 1.3.1 I bundle.
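
To illustrate the kind of call that breaks (a made-up example, the option
names are invented): Option.builder() and DefaultParser only exist since
commons-cli 1.3, so when the older copy on the Spark classpath wins, code
like this dies at runtime with NoSuchMethodError or NoClassDefFoundError
even though my jar bundles 1.3.1:

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.DefaultParser;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;

public class CliExample {
    public static void main(String[] args) throws Exception {
        Options opts = new Options();
        // Option.builder() was added in commons-cli 1.3; older versions
        // only have the OptionBuilder API
        opts.addOption(Option.builder("i").longOpt("input").hasArg()
                .desc("input path").build());
        // DefaultParser was also added in 1.3
        CommandLine cmd = new DefaultParser().parse(opts, args);
        System.out.println("input = " + cmd.getOptionValue("i"));
    }
}

(I have read that the maven-shade-plugin's relocation feature, which rewrites
the org.apache.commons.cli packages in the shaded jar under a private name,
can avoid the conflict without userClassPathFirst, but I have not tried it.)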


