[ https://issues.apache.org/jira/browse/SPARK-11798?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15084423#comment-15084423 ]
Jeff Zhang commented on SPARK-11798:
------------------------------------

Yes, I am sure I enabled that flag. The weird thing is that I made another clone on the same machine and could find the datanucleus jars using the same command. It might be some environment issue.

> Datanucleus jars is missing under lib_managed/jars
> --------------------------------------------------
>
>                 Key: SPARK-11798
>                 URL: https://issues.apache.org/jira/browse/SPARK-11798
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, SQL
>            Reporter: Jeff Zhang
>
> I noticed the comments in https://github.com/apache/spark/pull/9575 saying that Datanucleus-related jars will still be copied to lib_managed/jars, but I don't see any jars under lib_managed/jars. The weird thing is that I can see the jars on another machine, but not on my laptop, even after deleting the whole Spark project and starting from scratch. Could it be related to the environment? I tried adding the following code to SparkBuild.scala to track down the issue; it shows that the jars list is empty.
> {code}
> deployDatanucleusJars := {
>   val jars: Seq[File] = (fullClasspath in assembly).value.map(_.data)
>     .filter(_.getPath.contains("org.datanucleus"))
>   // this is what I added
>   println("*********************************************")
>   println("fullClasspath:" + fullClasspath)
>   println("assembly:" + assembly)
>   println("jars:" + jars.map(_.getAbsolutePath()).mkString(","))
>   //
> {code}

--
This message was sent by Atlassian JIRA (v6.3.4#6332)
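For context, the quoted snippet is truncated after the added println calls. A minimal sketch of what a complete task of this shape might look like in sbt 0.13-style syntax is below. The task name `deployDatanucleusJars` and the classpath filter come from the quoted snippet; the copy logic, the `lib_managed/jars` destination, and the empty-list warning are assumptions for illustration, not Spark's actual build code.

{code}
// Hypothetical sbt task: filter the assembly classpath for datanucleus jars
// (ivy-resolved jars live under paths containing the organization name,
// "org.datanucleus") and copy them into lib_managed/jars.
lazy val deployDatanucleusJars =
  taskKey[Unit]("Copy datanucleus jars to lib_managed/jars")

deployDatanucleusJars := {
  val jars: Seq[File] = (fullClasspath in assembly).value.map(_.data)
    .filter(_.getPath.contains("org.datanucleus"))
  if (jars.isEmpty)
    // This is the symptom reported above: an empty filtered list.
    streams.value.log.warn("No datanucleus jars found on the assembly classpath")
  val dest = baseDirectory.value / "lib_managed" / "jars"
  IO.createDirectory(dest)
  jars.foreach { jar =>
    streams.value.log.info(s"Copying $jar to $dest")
    IO.copyFile(jar, dest / jar.getName)
  }
}
{code}

If the filtered list is empty, the immediate question is whether the datanucleus jars appear on `(fullClasspath in assembly).value` at all, or appear under a resolution path that does not contain "org.datanucleus" (for example, a local cache layout that differs between the two machines), which would match the environment-dependent behavior described above.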