Exception initializing JavaSparkContext

2015-09-21 Thread ekraffmiller
Hi,
I’m trying to run a simple test program to access Spark through Java. I’m
using JDK 1.8 and Spark 1.5. I’m getting an Exception from the
JavaSparkContext constructor. My initialization code matches all the sample
code I’ve found online, so I’m not sure what I’m doing wrong.

Here is my code:

SparkConf conf = new SparkConf().setAppName("Simple Application");
conf.setMaster("local");
conf.setAppName("my app");
JavaSparkContext sc = new JavaSparkContext(conf);

The stack trace of the Exception:

java.lang.ExceptionInInitializerError: null
    at java.lang.Class.getField(Class.java:1690)
    at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:220)
    at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
    at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
    at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:189)
    at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
    at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
    at org.apache.spark.storage.DiskBlockManager.addShutdownHook(DiskBlockManager.scala:147)
    at org.apache.spark.storage.DiskBlockManager.<init>(DiskBlockManager.scala:54)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:75)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:173)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:345)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:276)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:441)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at edu.harvard.iq.text.core.spark.SparkControllerTest.testMongoRDD(SparkControllerTest.java:63)

Thanks,
Ellen



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Exception-initializing-JavaSparkContext-tp24755.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: Exception initializing JavaSparkContext

2015-09-21 Thread Marcelo Vanzin
What Spark package are you using? In particular, which hadoop version?
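
One quick way to see which Hadoop is actually on the test classpath, independent of which Spark package was downloaded, is to print it from hadoop-common's VersionInfo class (a minimal sketch, assuming hadoop-common is already a resolved dependency):

import org.apache.hadoop.util.VersionInfo;

public class HadoopVersionCheck {
    public static void main(String[] args) {
        // Reports the Hadoop version resolved on the classpath, which may
        // differ from the version the Spark package was built against.
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        System.out.println("Built from: " + VersionInfo.getUrl());
    }
}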

On Mon, Sep 21, 2015 at 9:14 AM, ekraffmiller
<ellen.kraffmil...@gmail.com> wrote:
> [...]



-- 
Marcelo




Re: Exception initializing JavaSparkContext

2015-09-21 Thread Ellen Kraffmiller
I found the problem - the pom.xml I was using also contained an old
dependency on a Mahout library, which was pulling in the old hadoop-core.
Removing that dependency fixed the problem.
Thank you!
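
For anyone hitting the same thing, a sketch of what the fix can look like in pom.xml if removing the Mahout dependency outright isn't an option: exclude the hadoop-1 jar it drags in. The mahout-core coordinates and version below are only illustrative, not taken from this thread.

<!-- Illustrative only: adjust the Mahout artifact/version to whatever the build really uses. -->
<dependency>
    <groupId>org.apache.mahout</groupId>
    <artifactId>mahout-core</artifactId>
    <version>0.9</version>
    <exclusions>
        <!-- Keep the Hadoop 1.x core jar off the classpath so Spark's Hadoop 2.x client is used. -->
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>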

On Mon, Sep 21, 2015 at 2:54 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> [...]


Re: Exception initializing JavaSparkContext

2015-09-21 Thread Ted Yu
bq. hadoop-core-0.20.204.0

How come the above got into play? It is from hadoop-1.
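
To trace where an unexpected artifact like that comes from, the dependency plugin's tree goal can filter on it (a sketch; run it from the module that declares spark-core):

mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-core

The output shows which declared dependency pulls it in transitively, which is what pointed at the Mahout entry in this case.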

On Mon, Sep 21, 2015 at 11:34 AM, Ellen Kraffmiller <
ellen.kraffmil...@gmail.com> wrote:

> I am including the Spark core dependency in my maven pom.xml:
>
> <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-core_2.10</artifactId>
>     <version>1.5.0</version>
> </dependency>
>
> This is bringing in these hadoop versions:
> hadoop-annotations-2.2.0
> hadoop-auth-2.2.0
> hadoop-client-2.2.0
> hadoop-common-2.2.0
> hadoop-core-0.20.204.0
> hadoop-hdfs-2.2.0
> followed by mapreduce and yarn dependencies... let me know if you need the
> full list.
> Thanks,
> Ellen
>
>
> On Mon, Sep 21, 2015 at 1:48 PM, Marcelo Vanzin <van...@cloudera.com>
> wrote:
>
>> What Spark package are you using? In particular, which hadoop version?
>>
>> On Mon, Sep 21, 2015 at 9:14 AM, ekraffmiller
>> <ellen.kraffmil...@gmail.com> wrote:
>> > [...]
>>
>>
>>
>> --
>> Marcelo
>>
>
>