[jira] [Assigned] (SPARK-7489) Spark shell crashes when compiled with Scala 2.11 and SPARK_PREPEND_CLASSES=true

2015-05-08 Thread Apache Spark (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-7489?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-7489:
---

Assignee: (was: Apache Spark)

 Spark shell crashes when compiled with Scala 2.11 and SPARK_PREPEND_CLASSES=true
 

 Key: SPARK-7489
 URL: https://issues.apache.org/jira/browse/SPARK-7489
 Project: Spark
  Issue Type: Bug
  Components: Spark Shell
Reporter: Vinod KC

 Steps followed:
 export SPARK_PREPEND_CLASSES=true
 dev/change-version-to-2.11.sh
 sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean assembly
 bin/spark-shell
 
 15/05/08 22:31:35 INFO Main: Created spark context..
 Spark context available as sc.
 java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
   at java.lang.Class.getDeclaredConstructors0(Native Method)
   at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
   at java.lang.Class.getConstructor0(Class.java:3075)
   at java.lang.Class.getConstructor(Class.java:1825)
   at org.apache.spark.repl.Main$.createSQLContext(Main.scala:86)
   ... 45 elided
 Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
   ... 50 more
 <console>:11: error: not found: value sqlContext
 import sqlContext.implicits._
        ^
 <console>:11: error: not found: value sqlContext
 import sqlContext.sql
        ^
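 A quick way to confirm the root cause from the still-running shell is to probe the classpath directly. This diagnostic is my own suggestion rather than part of the original report; it only checks class visibility:

   // Run in the Spark REPL: check whether Hive's HiveConf is visible to the
   // REPL's classloader. Creating the HiveContext needs it, which is what the
   // NoClassDefFoundError above points at.
   try {
     Class.forName("org.apache.hadoop.hive.conf.HiveConf")
     println("HiveConf found on the classpath")
   } catch {
     case _: ClassNotFoundException => println("HiveConf is NOT on the classpath")
   }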
 There is a similar resolved JIRA issue, SPARK-7470, with a PR
 (https://github.com/apache/spark/pull/5997) that fixed the same problem,
 but only for Scala 2.10.
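 For reference, the Scala 2.10 fix guarded context creation with a reflective fallback. Below is a minimal sketch of that pattern, assuming Spark 1.x APIs; it is illustrative, not the actual patch:

   import org.apache.spark.SparkContext
   import org.apache.spark.sql.SQLContext

   // Try to instantiate HiveContext reflectively; fall back to a plain
   // SQLContext when the Hive classes are missing from the classpath.
   def createSQLContext(sc: SparkContext): SQLContext =
     try {
       Class.forName("org.apache.spark.sql.hive.HiveContext")
         .getConstructor(classOf[SparkContext])
         .newInstance(sc)
         .asInstanceOf[SQLContext]
     } catch {
       // NoClassDefFoundError is an Error, not an Exception, so catch it too.
       case _: ClassNotFoundException | _: NoClassDefFoundError =>
         new SQLContext(sc)
     }

 Applying the same guard in the Scala 2.11 repl.Main would presumably let the shell degrade to a plain SQLContext instead of crashing.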



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-7489) Spark shell crashes when compiled with Scala 2.11 and SPARK_PREPEND_CLASSES=true

2015-05-08 Thread Apache Spark (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-7489?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-7489:
---

Assignee: Apache Spark
