[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2016-09-01 Thread tone (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15455154#comment-15455154
 ] 

tone commented on SPARK-8368:
-

The comment was inserted automatically. The PR is NOT related to the 
issue [SPARK-8368]. 
Please ignore the comment here.
Sorry for the inconvenience. 
Thanks!

> ClassNotFoundException in closure for map 
> --
>
> Key: SPARK-8368
> URL: https://issues.apache.org/jira/browse/SPARK-8368
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 1.4.0
> Environment: Centos 6.5, java 1.7.0_67, scala 2.10.4. Build the 
> project on Windows 7 and run in a spark standalone cluster(or local) mode on 
> Centos 6.X. 
>Reporter: CHEN Zhiwei
>Assignee: Yin Huai
>Priority: Blocker
> Fix For: 1.4.1, 1.5.0
>
>
> After upgrading the cluster from Spark 1.3.0 to 1.4.0 (rc4), I encountered the 
> following exception:
> ==begin exception
> {quote}
> Exception in thread "main" java.lang.ClassNotFoundException: 
> com.yhd.ycache.magic.Model$$anonfun$9$$anonfun$10
>   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>   at java.lang.Class.forName0(Native Method)
>   at java.lang.Class.forName(Class.java:278)
>   at 
> org.apache.spark.util.InnerClosureFinder$$anon$4.visitMethodInsn(ClosureCleaner.scala:455)
>   at 
> com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown
>  Source)
>   at 
> com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown
>  Source)
>   at 
> org.apache.spark.util.ClosureCleaner$.getInnerClosureClasses(ClosureCleaner.scala:101)
>   at 
> org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:197)
>   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:132)
>   at org.apache.spark.SparkContext.clean(SparkContext.scala:1891)
>   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:294)
>   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:293)
>   at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
>   at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
>   at org.apache.spark.rdd.RDD.map(RDD.scala:293)
>   at org.apache.spark.sql.DataFrame.map(DataFrame.scala:1210)
>   at com.yhd.ycache.magic.Model$.main(SSExample.scala:239)
>   at com.yhd.ycache.magic.Model.main(SSExample.scala)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:606)
>   at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>   at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {quote}
> ===end exception===
> I simplified the code that causes this issue to the following:
> ==begin code==
> {noformat}
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql.hive.HiveContext
> import org.apache.spark.mllib.linalg.Vectors
> import org.apache.spark.mllib.regression.LabeledPoint
>
> object Model extends Serializable {
>   def main(args: Array[String]) {
> val Array(sql) = args
> val sparkConf = new SparkConf().setAppName("Mode Example")
> val sc = new SparkContext(sparkConf)
> val hive = new HiveContext(sc)
> //get data by hive sql
> val rows = hive.sql(sql)
> val data = rows.map(r => { 
>   val arr = r.toSeq.toArray
>   val label = 1.0
>   def fmap = ( input: Any ) => 1.0
>   val feature = arr.map(_=>1.0)
>   LabeledPoint(label, Vectors.dense(feature))
> })
> data.count()
>   }
> }
> {noformat}
> =end code===
> This code runs fine in spark-shell, but fails when it is submitted to a 
> Spark cluster (standalone or local mode). I tried the same code on Spark 
> 1.3.0 (local mode), and no exception is encountered.





[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2016-08-31 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15451869#comment-15451869
 ] 

Apache Spark commented on SPARK-8368:
-

User 'tone-zhang' has created a pull request for this issue:
https://github.com/apache/spark/pull/14894


[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-22 Thread Don Drake (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14596001#comment-14596001
 ] 

Don Drake commented on SPARK-8368:
--

I've verified through a nightly build that this resolves my issue (SPARK-8365). 
 Thanks!


[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-22 Thread Yin Huai (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14596016#comment-14596016
 ] 

Yin Huai commented on SPARK-8368:
-

[~dondrake] Great! Thanks for checking it.




[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-18 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14592965#comment-14592965
 ] 

Apache Spark commented on SPARK-8368:
-

User 'yhuai' has created a pull request for this issue:
https://github.com/apache/spark/pull/6895




[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-18 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14592753#comment-14592753
 ] 

Apache Spark commented on SPARK-8368:
-

User 'yhuai' has created a pull request for this issue:
https://github.com/apache/spark/pull/6891




[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-17 Thread Yin Huai (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14590731#comment-14590731
 ] 

Yin Huai commented on SPARK-8368:
-

Right now, it looks like this is a problem caused by Spark SQL's isolated class 
loader.
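
For context on that failure mode: the stack trace in the report fails inside {{ClosureCleaner}}'s {{InnerClosureFinder}}, at a {{Class.forName}} call. The working theory (confirmed in the follow-up comments on this issue) is that this lookup goes through the current thread's context class loader, so a loader that cannot see the application jar makes the closure class unresolvable even though it is on the application classpath. Below is a minimal, self-contained sketch of that mechanism; it is an illustration only, not Spark code, and the no-URL loader is just a stand-in for an isolated loader.

{noformat}
import java.net.{URL, URLClassLoader}

// Illustration only: simulate what happens when the thread context class
// loader is replaced by an "isolated" loader that cannot see the app jar.
object ContextClassLoaderSketch {
  def main(args: Array[String]): Unit = {
    val appLoader = Thread.currentThread().getContextClassLoader

    // Stand-in for an isolated loader: no URLs, bootstrap-only parent.
    val isolated = new URLClassLoader(Array.empty[URL], null)
    Thread.currentThread().setContextClassLoader(isolated)

    try {
      // Mirrors the lookup done while cleaning a closure: resolve a class by
      // name through the *context* class loader of the current thread.
      Class.forName("com.yhd.ycache.magic.Model",
        false, Thread.currentThread().getContextClassLoader)
    } catch {
      case e: ClassNotFoundException =>
        println(s"Lookup fails under the isolated loader: $e")
    } finally {
      Thread.currentThread().setContextClassLoader(appLoader)
    }
  }
}
{noformat}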




[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-17 Thread Yin Huai (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14590636#comment-14590636
 ] 

Yin Huai commented on SPARK-8368:
-

I have reproduced it. I am investigating it now.




[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-17 Thread Yin Huai (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14591154#comment-14591154
 ] 

Yin Huai commented on SPARK-8368:
-

[~zwChan] I have found the cause. Will have a fix soon. 




[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-17 Thread CHEN Zhiwei (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14591129#comment-14591129
 ] 

CHEN Zhiwei commented on SPARK-8368:


Great! 
I am not familiar with the class loader, and hope for good news from you. If 
there is anything I can help with, please tell me without hesitation.


[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-17 Thread Yin Huai (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14591171#comment-14591171
 ] 

Yin Huai commented on SPARK-8368:
-

The cause of this problem is that 
{{SessionState.setCurrentSessionState(executionHive.state)}}, called during the 
construction of {{HiveContext}}, overrides the context class loader of the 
current thread with the class loader associated with {{executionHive.state}}. 
We need to correctly set the class loader of {{executionHive.state}}.
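
To make the described fix direction concrete, here is a minimal sketch of the generic save-and-restore pattern around any call that is known to replace the thread's context class loader. This is an illustration of the idea only, not the actual patch (see the pull requests linked in this thread for the real change):

{noformat}
// Illustration only: run a block that may swap the thread context class
// loader (e.g. code that ends up calling SessionState.setCurrentSessionState)
// and restore the caller's loader afterwards, so later Class.forName calls
// made during closure cleaning can still see user classes.
object ClassLoaderGuard {
  def withRestoredContextClassLoader[T](body: => T): T = {
    val original = Thread.currentThread().getContextClassLoader
    try {
      body
    } finally {
      Thread.currentThread().setContextClassLoader(original)
    }
  }
}
{noformat}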



[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-17 Thread Yin Huai (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14590072#comment-14590072
 ] 

Yin Huai commented on SPARK-8368:
-

Can you add {{--verbose}} and post the extra information, like {{Main class:}} 
and {{Classpath elements:}}?

[~andrewor14] Have we changed anything related to spark-submit in 1.4.0?
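
For reference, that just means prepending the flag to the submit command the reporter posted elsewhere in this thread; an illustrative invocation (jar name and argument taken from that comment) would be:

{noformat}
spark-submit --verbose --class com.yhd.ycache.magic.Model \
  --jars ./SSExample-0.0.1-SNAPSHOT-jar-with-dependencies.jar \
  --master local \
  ./SSExample-0.0.1-SNAPSHOT-jar-with-dependencies.jar "sql statement"
{noformat}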


[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-16 Thread CHEN Zhiwei (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14587686#comment-14587686
 ] 

CHEN Zhiwei commented on SPARK-8368:


I have tried both local and standalone mode, with both the with-dependencies and 
without-dependencies jar files, using the following command:

spark-submit --class com.yhd.ycache.magic.Model --jars 
./SSExample-0.0.1-SNAPSHOT-jar-with-dependencies.jar --master local 
./SSExample-0.0.1-SNAPSHOT-jar-with-dependencies.jar "sql statement"

BTW, the same code and submit command run on Spark 1.3.0 without any problem.

 ClassNotFoundException in closure for map 
 --

 Key: SPARK-8368
 URL: https://issues.apache.org/jira/browse/SPARK-8368
 Project: Spark
  Issue Type: Bug
  Components: Spark Core
Affects Versions: 1.4.0
 Environment: Centos 6.5, java 1.7.0_67, scala 2.10.4. Build the 
 project on Windows 7 and run in a spark standalone cluster(or local) mode on 
 Centos 6.X. 
Reporter: CHEN Zhiwei

 After upgraded the cluster from spark 1.3.0 to 1.4.0(rc4), I encountered the 
 following exception:
 ==begin exception
 {quote}
 Exception in thread main java.lang.ClassNotFoundException: 
 com.yhd.ycache.magic.Model$$anonfun$9$$anonfun$10
   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
   at java.security.AccessController.doPrivileged(Native Method)
   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
   at java.lang.Class.forName0(Native Method)
   at java.lang.Class.forName(Class.java:278)
   at 
 org.apache.spark.util.InnerClosureFinder$$anon$4.visitMethodInsn(ClosureCleaner.scala:455)
   at 
 com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown
  Source)
   at 
 com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown
  Source)
   at 
 org.apache.spark.util.ClosureCleaner$.getInnerClosureClasses(ClosureCleaner.scala:101)
   at 
 org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:197)
   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:132)
   at org.apache.spark.SparkContext.clean(SparkContext.scala:1891)
   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:294)
   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:293)
   at 
 org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
   at 
 org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
   at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
   at org.apache.spark.rdd.RDD.map(RDD.scala:293)
   at org.apache.spark.sql.DataFrame.map(DataFrame.scala:1210)
   at com.yhd.ycache.magic.Model$.main(SSExample.scala:239)
   at com.yhd.ycache.magic.Model.main(SSExample.scala)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at 
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
   at 
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   at java.lang.reflect.Method.invoke(Method.java:606)
   at 
 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
   at 
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 {quote}
 ===end exception===
 I simplified the code that causes this issue, as follows:
 ==begin code==
 {noformat}
object Model extends Serializable {
  def main(args: Array[String]) {
    val Array(sql) = args
    val sparkConf = new SparkConf().setAppName("Mode Example")
    val sc = new SparkContext(sparkConf)
    val hive = new HiveContext(sc)
    // get data by hive sql
    val rows = hive.sql(sql)
    val data = rows.map(r => {
      val arr = r.toSeq.toArray
      val label = 1.0
      def fmap = (input: Any) => 1.0
      val feature = arr.map(_ => 1.0)
      LabeledPoint(label, Vectors.dense(feature))
    })
    data.count()
  }
}
 {noformat}
 =end code===
 This code runs fine in spark-shell, but fails with the above exception when submitted to a Spark cluster (standalone or local mode). I tried the same code on Spark 1.3.0 (local mode), and no exception was encountered.




[jira] [Commented] (SPARK-8368) ClassNotFoundException in closure for map

2015-06-15 Thread Yin Huai (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-8368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14587459#comment-14587459
 ] 

Yin Huai commented on SPARK-8368:
-

@CHEN Zhiwei How was the application submitted?

 ClassNotFoundException in closure for map 
 --

 Key: SPARK-8368
 URL: https://issues.apache.org/jira/browse/SPARK-8368
 Project: Spark
  Issue Type: Bug
  Components: Spark Core
Affects Versions: 1.4.0
 Environment: Centos 6.5, java 1.7.0_67, scala 2.10.4. Build the 
 project on Windows 7 and run in a spark standalone cluster(or local) mode on 
 Centos 6.X. 
Reporter: CHEN Zhiwei

 After upgrading the cluster from Spark 1.3.0 to 1.4.0 (rc4), I encountered the following exception:
 ==begin exception
 {quote}
 Exception in thread "main" java.lang.ClassNotFoundException: 
 com.yhd.ycache.magic.Model$$anonfun$9$$anonfun$10
   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
   at java.security.AccessController.doPrivileged(Native Method)
   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
   at java.lang.Class.forName0(Native Method)
   at java.lang.Class.forName(Class.java:278)
   at 
 org.apache.spark.util.InnerClosureFinder$$anon$4.visitMethodInsn(ClosureCleaner.scala:455)
   at 
 com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown
  Source)
   at 
 com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown
  Source)
   at 
 org.apache.spark.util.ClosureCleaner$.getInnerClosureClasses(ClosureCleaner.scala:101)
   at 
 org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:197)
   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:132)
   at org.apache.spark.SparkContext.clean(SparkContext.scala:1891)
   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:294)
   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:293)
   at 
 org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
   at 
 org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
   at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
   at org.apache.spark.rdd.RDD.map(RDD.scala:293)
   at org.apache.spark.sql.DataFrame.map(DataFrame.scala:1210)
   at com.yhd.ycache.magic.Model$.main(SSExample.scala:239)
   at com.yhd.ycache.magic.Model.main(SSExample.scala)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at 
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
   at 
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   at java.lang.reflect.Method.invoke(Method.java:606)
   at 
 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
   at 
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 {quote}
 ===end exception===
 I simplified the code that causes this issue, as follows:
 ==begin code==
 {noformat}
object Model extends Serializable {
  def main(args: Array[String]) {
    val Array(sql) = args
    val sparkConf = new SparkConf().setAppName("Mode Example")
    val sc = new SparkContext(sparkConf)
    val hive = new HiveContext(sc)
    // get data by hive sql
    val rows = hive.sql(sql)
    val data = rows.map(r => {
      val arr = r.toSeq.toArray
      val label = 1.0
      def fmap = (input: Any) => 1.0
      val feature = arr.map(_ => 1.0)
      LabeledPoint(label, Vectors.dense(feature))
    })
    data.count()
  }
}
 {noformat}
 =end code===
 This code runs fine in spark-shell, but fails with the above exception when submitted to a Spark cluster (standalone or local mode). I tried the same code on Spark 1.3.0 (local mode), and no exception was encountered.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org