Thanks Patrick and Cheng for the suggestions.

The issue was that a Hadoop common jar had been added to the classpath.
After I removed the Hadoop common jar from both the master and the slave,
the error went away. This was caused by a local change, so there is no
impact on the 1.2 release.
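
In case anyone else hits this, here is the check that would have saved me
some time: look for stray Hadoop jars sitting next to the Spark build. This
is only a sketch for a Windows layout like mine (run it from the Spark home
directory; the paths are illustrative):

REM Any hadoop-common jar outside the Spark assembly is a candidate culprit.
dir /s /b *.jar | findstr /i "hadoop-common"
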
-----Original Message-----
From: Patrick Wendell [mailto:pwend...@gmail.com] 
Sent: Wednesday, November 26, 2014 8:17 AM
To: Judy Nash
Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Just to double-check: I looked at our own assembly jar, and I confirmed that
our Hadoop Configuration class does use the correctly shaded version of
Guava. My best guess here is that somehow a separate Hadoop library is
ending up on the classpath, possibly because Spark put it there.

> jar xf spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar
> cd org/apache/hadoop/conf/
> javap -v Configuration | grep Precond

Warning: Binary file Configuration contains org.apache.hadoop.conf.Configuration

   #497 = Utf8               org/spark-project/guava/common/base/Preconditions
   #498 = Class              #497         // "org/spark-project/guava/common/base/Preconditions"
   #502 = Methodref          #498.#501    // "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

        12: invokestatic  #502                // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
        50: invokestatic  #502                // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

On Wed, Nov 26, 2014 at 11:08 AM, Patrick Wendell <pwend...@gmail.com> wrote:
> Hi Judy,
>
> Are you somehow modifying Spark's classpath to include jars from
> Hadoop and Hive that you have running on the machine? The issue seems
> to be that you are somehow including a version of Hadoop that
> references the original Guava package. The Hadoop that is bundled in
> the Spark jars should not do this.
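>
> If a jar like that is on the machine, something along these lines should
> find it. This is just a sketch: it assumes the JDK's jar tool is on PATH
> and that the extra jars live under lib\ (and %f becomes %%f inside a .cmd
> script):
>
> REM Report every jar that still contains the unshaded Guava class.
> for %f in (lib\*.jar) do @(jar tf "%f" | findstr /c:"com/google/common/base/Preconditions.class" >nul && echo %f)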
>
> - Patrick
>
> On Wed, Nov 26, 2014 at 1:45 AM, Judy Nash 
> <judyn...@exchange.microsoft.com> wrote:
>> Looks like a config issue. I ran the SparkPi job and it still fails with
>> the same Guava error.
>>
>> Command ran:
>>
>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class 
>> org.apache.spark.examples.SparkPi --master spark://headnodehost:7077 
>> --executor-memory 1G --num-executors 1 
>> .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100
>>
>>
>>
>> I used the same build steps on Spark 1.1 and had no issue.
>>
>>
>>
>> From: Denny Lee [mailto:denny.g....@gmail.com]
>> Sent: Tuesday, November 25, 2014 5:47 PM
>> To: Judy Nash; Cheng Lian; u...@spark.incubator.apache.org
>>
>>
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> To determine whether this is a Windows issue or a configuration issue,
>> can you just try to call spark-class.cmd with SparkSubmit, without
>> actually referencing the Hadoop or Thrift server classes?
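>>
>> For example, something along these lines (illustrative; the examples jar
>> name depends on your build):
>>
>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class org.apache.spark.examples.SparkPi --master local .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 10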
>>
>>
>>
>>
>>
>> On Tue Nov 25 2014 at 5:42:09 PM Judy Nash 
>> <judyn...@exchange.microsoft.com>
>> wrote:
>>
>> I traced the code and used the following command:
>>
>> spark-class.cmd org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal --hiveconf hive.server2.thrift.port=10000
>>
>>
>>
>> The issue ended up being much more fundamental, however. Spark doesn't
>> work at all in the configuration below. When I open spark-shell, it fails
>> with the same ClassNotFound error.
>>
>> Now I wonder whether this is a Windows-only issue or a problem with the
>> Hive/Hadoop configuration.
>>
>>
>>
>> From: Cheng Lian [mailto:lian.cs....@gmail.com]
>> Sent: Tuesday, November 25, 2014 1:50 AM
>>
>>
>> To: Judy Nash; u...@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> Oh, so you're using Windows. What command are you using to start the
>> Thrift server, then?
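>>
>> It may also help to see the exact java command (and classpath) Spark ends
>> up running. The bash spark-class honors the SPARK_PRINT_LAUNCH_COMMAND
>> environment variable, and I believe spark-class2.cmd does the same, so a
>> sketch like this should print it:
>>
>> REM Print the launch command, then look for unexpected Hadoop jars in the -cp value.
>> set SPARK_PRINT_LAUNCH_COMMAND=1
>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal --hiveconf hive.server2.thrift.port=10000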
>>
>> On 11/25/14 4:25 PM, Judy Nash wrote:
>>
>> Made progress but still blocked.
>>
>> After recompiling the code in cmd instead of PowerShell, I can now see
>> all 5 classes you mentioned.
>>
>> However, I am still seeing the same error as before. Anything else I
>> can check?
>>
>>
>>
>> From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
>> Sent: Monday, November 24, 2014 11:50 PM
>> To: Cheng Lian; u...@spark.incubator.apache.org
>> Subject: RE: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> This is what I got from jar tf:
>>
>> org/spark-project/guava/common/base/Preconditions.class
>>
>> org/spark-project/guava/common/math/MathPreconditions.class
>>
>> com/clearspring/analytics/util/Preconditions.class
>>
>> parquet/Preconditions.class
>>
>>
>>
>> I seem to have the class that was reported missing, but I am missing this file:
>>
>> com/google/inject/internal/util/$Preconditions.class
>>
>>
>>
>> Any suggestions on how to fix this?
>>
>> I very much appreciate the help, as I am very new to Spark and open
>> source technologies.
>>
>>
>>
>> From: Cheng Lian [mailto:lian.cs....@gmail.com]
>> Sent: Monday, November 24, 2014 8:24 PM
>> To: Judy Nash; u...@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> Hmm, I tried exactly the same commit and build command locally, but
>> couldn't reproduce this.
>>
>> Usually this kind of error is caused by classpath misconfiguration.
>> Could you please run the following to ensure the corresponding Guava
>> classes are included in the assembly jar you built?
>>
>> jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar | grep Preconditions
>>
>> On my machine I got these lines (the first line is the one reported 
>> as missing in your case):
>>
>> org/spark-project/guava/common/base/Preconditions.class
>>
>> org/spark-project/guava/common/math/MathPreconditions.class
>>
>> com/clearspring/analytics/util/Preconditions.class
>>
>> parquet/Preconditions.class
>>
>> com/google/inject/internal/util/$Preconditions.class
>>
>> On 11/25/14 6:25 AM, Judy Nash wrote:
>>
>> Thank you Cheng for responding.
>>
>>
>> Here is the commit SHA1 on the 1.2 branch I saw this failure in:
>>
>> commit 6f70e0295572e3037660004797040e026e440dbd
>>
>> Author: zsxwing <zsxw...@gmail.com>
>>
>> Date:   Fri Nov 21 00:42:43 2014 -0800
>>
>>
>>
>>     [SPARK-4472][Shell] Print "Spark context available as sc." only 
>> when SparkContext is created...
>>
>>
>>
>>     ... successfully
>>
>>
>>
>>     It's weird that printing "Spark context available as sc" when 
>> creating SparkContext unsuccessfully.
>>
>>
>>
>> Let me know if you need anything else.
>>
>>
>>
>> From: Cheng Lian [mailto:lian.cs....@gmail.com]
>> Sent: Friday, November 21, 2014 8:02 PM
>> To: Judy Nash; u...@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> Hi Judy, could you please provide the commit SHA1 of the version 
>> you're using? Thanks!
>>
>> On 11/22/14 11:05 AM, Judy Nash wrote:
>>
>> Hi,
>>
>>
>>
>> The Thrift server is failing to start for me on the latest Spark 1.2 branch.
>>
>>
>>
>> I get the error below when I start the Thrift server.
>>
>> Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
>>         at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)....
>>
>>
>>
>> Here is my setup:
>>
>> 1)      Latest Spark 1.2 branch build
>>
>> 2)      Used build command (see the sanity check after this list):
>>
>> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
>>
>> 3)      Added hive-site.xml to \conf
>>
>> 4)      Versions on the box: Hive 0.13, Hadoop 2.4
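>>
>> One sanity check on this setup (a sketch; the assembly path depends on
>> whether you run from the build tree or a packaged distribution): confirm
>> the assembly really bundles both Hadoop and the shaded Guava.
>>
>> REM Both classes should appear in the assembly jar's listing.
>> jar tf assembly\target\scala-2.10\spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar | findstr /c:"org/apache/hadoop/conf/Configuration.class" /c:"org/spark-project/guava/common/base/Preconditions.class"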
>>
>>
>>
>> Is this a real bug or am I doing something wrong?
>>
>>
>>
>> -----------------------------------
>>
>> Full stack trace:
>>
>> Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
>>         at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
>>         at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
>>         at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
>>         at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
>>         at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
>>         at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
>>         at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
>>         at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
>>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
>>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
>>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
>>         at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
>>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:230)
>>         at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:38)
>>         at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
>>         at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>
>>
>>
>>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
