Re: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError

2015-01-20 Thread Ted Yu
Please also see this thread:
http://search-hadoop.com/m/JW1q5De7pU1

On Tue, Jan 20, 2015 at 3:58 PM, Sean Owen so...@cloudera.com wrote:

 Guava is shaded in Spark 1.2+. It looks like you are mixing versions
 of Spark then, with some that still refer to unshaded Guava. Make sure
 you are not packaging Spark with your app and that you don't have
 other versions lying around.





Re: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError

2015-01-20 Thread Sean Owen
Guava is shaded in Spark 1.2+. It looks like you are mixing versions
of Spark then, with some that still refer to unshaded Guava. Make sure
you are not packaging Spark with your app and that you don't have
other versions lying around.
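
For example, in an sbt build this might look like the following (a minimal
sketch, not part of the original message; the spark-core/spark-mllib
coordinates and the 1.2.0 version are assumptions about the poster's setup):

    // build.sbt -- minimal sketch for a Spark 1.2 / Scala 2.10 project.
    // "provided" keeps Spark's jars out of the packaged application, so
    // only the cluster's (shaded) copy of Spark is on the runtime classpath.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"  % "1.2.0" % "provided",
      "org.apache.spark" %% "spark-mllib" % "1.2.0" % "provided"
    )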

On Tue, Jan 20, 2015 at 11:55 PM, Shailesh Birari sbirar...@gmail.com wrote:
 Hello,

 I recently upgraded my setup from Spark 1.1 to Spark 1.2.
 My existing applications are working fine on an Ubuntu cluster.
 But when I try to execute a Spark MLlib application from Eclipse (on a
 Windows node), it fails with a java.lang.NoClassDefFoundError:
 com/google/common/base/Preconditions exception.

 Note:
    1. With Spark 1.1 this was working fine.
    2. The Spark 1.2 jar files are linked in the Eclipse project.
    3. I checked the jar -tf output and found that com.google.common.base
       is not present.




Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError

2015-01-20 Thread Shailesh Birari
Hello,

I recently upgraded my setup from Spark 1.1 to Spark 1.2.
My existing applications are working fine on an Ubuntu cluster.
But when I try to execute a Spark MLlib application from Eclipse (on a
Windows node), it fails with a java.lang.NoClassDefFoundError:
com/google/common/base/Preconditions exception.

Note:
   1. With Spark 1.1 this was working fine.
   2. The Spark 1.2 jar files are linked in the Eclipse project.
   3. I checked the jar -tf output and found that com.google.common.base is
      not present.

-------------------------------------------------------------------
Exception log:

Exception in thread "main" java.lang.NoClassDefFoundError:
com/google/common/base/Preconditions
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:94)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:77)
    at org.apache.spark.network.netty.NettyBlockTransferService.<init>(NettyBlockTransferService.scala:62)
    at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:194)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:340)
    at org.apache.spark.examples.mllib.TallSkinnySVD$.main(TallSkinnySVD.scala:74)
    at org.apache.spark.examples.mllib.TallSkinnySVD.main(TallSkinnySVD.scala)
Caused by: java.lang.ClassNotFoundException:
com.google.common.base.Preconditions
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 7 more

-------------------------------------------------------------------

jar -tf output:


consb2@CONSB2A
/cygdrive/c/SB/spark-1.2.0-bin-hadoop2.4/spark-1.2.0-bin-hadoop2.4/lib
$ jar -tf spark-assembly-1.2.0-hadoop2.4.0.jar | grep Preconditions
org/spark-project/guava/common/base/Preconditions.class
org/spark-project/guava/common/math/MathPreconditions.class
com/clearspring/analytics/util/Preconditions.class
parquet/Preconditions.class
com/google/inject/internal/util/$Preconditions.class

-------------------------------------------------------------------

Please help me in resolving this.

Thanks,
  Shailesh







Re: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError

2015-01-20 Thread Aaron Davidson
Spark's network-common package depends on Guava as a "provided" dependency
in order to avoid conflicting with other libraries (e.g., Hadoop) that
depend on specific versions. com/google/common/base/Preconditions has been
present in Guava since version 2, so this is likely a "dependency not
found" rather than a "wrong version of dependency" issue.

To resolve this, please depend on some version of Guava (14.0.1 is
guaranteed to work, as should any other version from the past few years).
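
In an sbt build, for instance, that could be declared as follows (a minimal
sketch, not part of the original message; per the advice above, any
reasonably recent Guava version should work equally well):

    // build.sbt -- minimal sketch: depend on Guava explicitly, since
    // Spark 1.2 marks it "provided" and ships only shaded copies of it.
    libraryDependencies += "com.google.guava" % "guava" % "14.0.1"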

On Tue, Jan 20, 2015 at 6:16 PM, Shailesh Birari sbirar...@gmail.com
wrote:

 Hi Frank,

 It's a normal Eclipse project where I added the Scala and Spark libraries
 as user libraries.
 Though I am not attaching any Hadoop libraries, my application code does
 have the following line:

   System.setProperty("hadoop.home.dir", "C:\\SB\\HadoopWin")

 This Hadoop home dir contains winutils.exe only.

 I don't think that it's an issue.

 Please suggest.

 Thanks,
   Shailesh



Re: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError

2015-01-20 Thread Shailesh Birari
Thanks Aaron.

Adding the Guava jar resolves the issue.

Shailesh

On Wed, Jan 21, 2015 at 3:26 PM, Aaron Davidson ilike...@gmail.com wrote:

 Spark's network-common package depends on Guava as a "provided" dependency
 in order to avoid conflicting with other libraries (e.g., Hadoop) that
 depend on specific versions. com/google/common/base/Preconditions has been
 present in Guava since version 2, so this is likely a "dependency not
 found" rather than a "wrong version of dependency" issue.

 To resolve this, please depend on some version of Guava (14.0.1 is
 guaranteed to work, as should any other version from the past few years).


Re: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError

2015-01-20 Thread Shailesh Birari
Hi Frank,

It's a normal Eclipse project where I added the Scala and Spark libraries
as user libraries.
Though I am not attaching any Hadoop libraries, my application code does
have the following line:

  System.setProperty("hadoop.home.dir", "C:\\SB\\HadoopWin")

This Hadoop home dir contains winutils.exe only.

I don't think that it's an issue.

Please suggest.

Thanks,
  Shailesh


On Wed, Jan 21, 2015 at 2:19 PM, Frank Austin Nothaft fnoth...@berkeley.edu
 wrote:

 Shailesh,

 To add, are you packaging Hadoop in your app? Hadoop will pull in Guava.
 Not sure if you are using Maven (or something else) to build, but if you
 can pull up your build's dependency tree, you will likely find
 com.google.guava being brought in by one of your dependencies.

 Regards,

 Frank Austin Nothaft
 fnoth...@berkeley.edu
 fnoth...@eecs.berkeley.edu
 202-340-0466


Re: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError

2015-01-20 Thread Shailesh Birari
Hello,

I double-checked the libraries. I am linking only with Spark 1.2.
Along with the Spark 1.2 jars I have the Scala 2.10 jars and JRE 7 jars
linked, and nothing else.

Thanks,
   Shailesh

On Wed, Jan 21, 2015 at 12:58 PM, Sean Owen so...@cloudera.com wrote:

 Guava is shaded in Spark 1.2+. It looks like you are mixing versions
 of Spark then, with some that still refer to unshaded Guava. Make sure
 you are not packaging Spark with your app and that you don't have
 other versions lying around.




Re: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError

2015-01-20 Thread Frank Austin Nothaft
Shailesh,

To add, are you packaging Hadoop in your app? Hadoop will pull in Guava. Not
sure if you are using Maven (or something else) to build, but if you can pull
up your build's dependency tree, you will likely find com.google.guava being
brought in by one of your dependencies.
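
A complementary runtime check (a minimal Scala sketch, not part of the
original message) prints which jar, if any, supplies the class on the
application's classpath:

    // Throws ClassNotFoundException if no Guava is on the classpath at all.
    val src = Class.forName("com.google.common.base.Preconditions")
      .getProtectionDomain.getCodeSource
    // getCodeSource is null for classes loaded from the bootstrap classpath.
    println(if (src == null) "bootstrap classpath" else src.getLocation)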

Regards,

Frank Austin Nothaft
fnoth...@berkeley.edu
fnoth...@eecs.berkeley.edu
202-340-0466

On Jan 20, 2015, at 5:13 PM, Shailesh Birari sbirar...@gmail.com wrote:

 Hello,

 I double-checked the libraries. I am linking only with Spark 1.2.
 Along with the Spark 1.2 jars I have the Scala 2.10 jars and JRE 7 jars
 linked, and nothing else.

 Thanks,
    Shailesh
 



RE: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError

2015-01-20 Thread Bob Tiernay
If using Maven, one can simply use whatever version they prefer and, at build
time, shade the artifact using something like:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <relocations>
              <relocation>
                <pattern>com.google.common</pattern>
                <shadedPattern>org.shaded.google.common</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>




Of course this won't help during development if there are conflicts.

From: ilike...@gmail.com
Date: Tue, 20 Jan 2015 18:26:32 -0800
Subject: Re: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError
To: sbirar...@gmail.com
CC: fnoth...@berkeley.edu; so...@cloudera.com; user@spark.apache.org

Spark's network-common package depends on Guava as a "provided" dependency in
order to avoid conflicting with other libraries (e.g., Hadoop) that depend on
specific versions. com/google/common/base/Preconditions has been present in
Guava since version 2, so this is likely a "dependency not found" rather than
a "wrong version of dependency" issue.

To resolve this, please depend on some version of Guava (14.0.1 is guaranteed
to work, as should any other version from the past few years).