[
https://issues.apache.org/jira/browse/SPARK-6539?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
davep updated SPARK-6539:
-------------------------
Description:
The following snippet from the client driver throws a NullPointerException
when trying to create a SparkContext:
{code:java}
SparkConf conf = new SparkConf().setAppName(appName).setMaster("yarn-client");
JavaSparkContext sc = new JavaSparkContext(conf);
{code}
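For completeness, a self-contained version of the failing driver might look like the following; the class name and app name are placeholders I've added, not from the original report, and running it assumes HADOOP_CONF_DIR/YARN_CONF_DIR point at the cluster configuration, as yarn-client mode requires:
{code:java}
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Minimal driver reproducing the failure (class/app names are placeholders).
public class SparkContextNpeRepro {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("npe-repro")
                .setMaster("yarn-client");
        // The NPE is thrown from inside this constructor call.
        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.stop();
    }
}
{code}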
The exception is thrown when trying to create the block manager source here:
https://github.com/apache/spark/blob/branch-1.3/core/src/main/scala/org/apache/spark/SparkContext.scala#L544
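When reproducing, wrapping the constructor makes it easy to capture the full trace against that frame. This is a trivial sketch of mine, not part of the original report; it reuses the conf from the snippet above:
{code:java}
// Expect the top stack frames to point at the block manager source
// creation in SparkContext (branch-1.3, around line 544).
try {
    JavaSparkContext sc = new JavaSparkContext(conf);
    sc.stop();
} catch (NullPointerException e) {
    e.printStackTrace();  // capture the full trace for the report
    throw e;
}
{code}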
Compiling the client driver with JDK 7 or JDK 8 makes no difference; I see the
same exception either way. Changing the JDK on the Hadoop cluster nodes to
JDK 7 resolves the issue.
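Since the fix was changing the cluster-side JDK, a quick way to see which JVMs are actually in play is to log the driver's version and, on a setup where the context does come up (e.g. after the JDK 7 switch), the executors' versions. This is a diagnostic sketch of mine, not something from the report; it uses an anonymous Function so it also compiles under JDK 7:
{code:java}
import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

// Diagnostic sketch: print the driver's JVM version, then collect the
// distinct JVM versions reported by the executors.
public class JvmVersionCheck {
    public static void main(String[] args) {
        System.out.println("Driver JVM: " + System.getProperty("java.version"));

        SparkConf conf = new SparkConf()
                .setAppName("jvm-version-check")
                .setMaster("yarn-client");
        JavaSparkContext sc = new JavaSparkContext(conf);

        List<String> executorJvms = sc
                .parallelize(Arrays.asList(1, 2, 3, 4), 4)
                .map(new Function<Integer, String>() {
                    @Override
                    public String call(Integer i) {
                        // Runs on an executor node.
                        return System.getProperty("java.version");
                    }
                })
                .distinct()
                .collect();
        System.out.println("Executor JVMs: " + executorJvms);

        sc.stop();
    }
}
{code}
If swapping the JDK cluster-wide is impractical, the per-container settings spark.executorEnv.JAVA_HOME and spark.yarn.appMasterEnv.JAVA_HOME might be worth trying, though I have not verified that they avoid this NPE.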
was:
The following snippet from the client driver throws a NullPointerException
when trying to create a SparkContext:
{code:java}
SparkConf conf = new SparkConf().setAppName(appName).setMaster("yarn-client");
JavaSparkContext sc = new JavaSparkContext(conf);
{code}
The exception is thrown when trying to create the block manager source here:
https://github.com/apache/spark/blob/branch-1.3/core/src/main/scala/org/apache/spark/SparkContext.scala#L544
Compiling the client driver with JDK 7 or JDK 8 makes no difference; I see the
same exception either way. Changing the JDK on the Hadoop cluster resolves the
issue.
> SparkContext throws NullPointerException
> ----------------------------------------
>
> Key: SPARK-6539
> URL: https://issues.apache.org/jira/browse/SPARK-6539
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.3.0
> Environment: Spark built using the following:
> {code}
> ./make-distribution.sh --tgz -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0
> -DskipTests
> {code}
> Targeting a Pivotal HD 2.1 (Hadoop 2.2) cluster with nodes running JDK 8u40
> Reporter: davep
>
> The following snippet from the client driver throws a NullPointerException
> when trying to create a SparkContext:
> {code:java}
> SparkConf conf = new SparkConf().setAppName(appName).setMaster("yarn-client");
> JavaSparkContext sc = new JavaSparkContext(conf);
> {code}
> The exception is thrown when trying to create the block manager source here:
> https://github.com/apache/spark/blob/branch-1.3/core/src/main/scala/org/apache/spark/SparkContext.scala#L544
> Compiling the client driver with JDK 7 or JDK 8 makes no difference; I see
> the same exception either way. Changing the JDK on the Hadoop cluster nodes
> to JDK 7 resolves the issue.