[ https://issues.apache.org/jira/browse/SPARK-17523?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-17523:
------------------------------------

    Assignee: Apache Spark

> Cannot get Spark build info from spark-core package built on Windows
> --------------------------------------------------------------------
>
>                 Key: SPARK-17523
>                 URL: https://issues.apache.org/jira/browse/SPARK-17523
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, Spark Core
>    Affects Versions: 2.0.0
>            Reporter: Yun Tang
>            Assignee: Apache Spark
>              Labels: windows
>             Fix For: 2.0.1
>
>
> Currently, building Spark generates a 'spark-version-info.properties' file 
> that is merged into spark-core_2.11-*.jar. However, the script 
> 'build/spark-build-info', which generates this file, can only be executed in 
> a bash environment, so a Windows build never produces it.
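> For reference, the generated file is a plain Java properties file. A sketch 
> of its contents, with illustrative values (the exact values come from the 
> build version and git metadata queried by 'build/spark-build-info'):
> {code}
> version=2.0.0
> user=builder
> revision=<git commit sha>
> branch=master
> date=2016-09-13T00:00:00Z
> url=https://github.com/apache/spark
> {code}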
> Without this file, errors like the one below occur when submitting a Spark 
> application, breaking the whole submission phase right at the start.
> {code:java}
> ERROR ApplicationMaster: Uncaught exception: 
> org.apache.spark.SparkException: Exception thrown in awaitResult: 
>       at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:194)
>       at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:394)
>       at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:247)
>       at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:759)
>       at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
>       at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>       at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
>       at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:757)
>       at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
> Caused by: java.util.concurrent.ExecutionException: Boxed Error
>       at scala.concurrent.impl.Promise$.resolver(Promise.scala:55)
>       at scala.concurrent.impl.Promise$.scala$concurrent$impl$Promise$$resolveTry(Promise.scala:47)
>       at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:244)
>       at scala.concurrent.Promise$class.tryFailure(Promise.scala:112)
>       at scala.concurrent.impl.Promise$DefaultPromise.tryFailure(Promise.scala:153)
>       at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:648)
> Caused by: java.lang.ExceptionInInitializerError
>       at org.apache.spark.package$.<init>(package.scala:91)
>       at org.apache.spark.package$.<clinit>(package.scala)
>       at org.apache.spark.SparkContext$$anonfun$3.apply(SparkContext.scala:187)
>       at org.apache.spark.SparkContext$$anonfun$3.apply(SparkContext.scala:187)
>       at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
>       at org.apache.spark.SparkContext.logInfo(SparkContext.scala:76)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:187)
>       at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2287)
>       at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:822)
>       at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:814)
>       at scala.Option.getOrElse(Option.scala:121)
>       at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:814)
>       at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
>       at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:630)
> Caused by: org.apache.spark.SparkException: Error while locating file spark-version-info.properties
>       at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:75)
>       at org.apache.spark.package$SparkBuildInfo$.<init>(package.scala:61)
>       at org.apache.spark.package$SparkBuildInfo$.<clinit>(package.scala)
>       ... 19 more
> Caused by: java.lang.NullPointerException
>       at java.util.Properties$LineReader.readLine(Properties.java:434)
>       at java.util.Properties.load0(Properties.java:353)
>       at java.util.Properties.load(Properties.java:341)
>       at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:64)
>       ... 21 more
> {code}
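> The bottom of the trace shows the failure mechanics: the classpath lookup 
> for the missing resource returns null, and java.util.Properties.load is 
> then invoked on that null stream. A minimal standalone reproduction (an 
> illustrative sketch, not the actual package.scala code):
> {code:scala}
> import java.util.Properties
> 
> object MissingVersionInfoRepro {
>   def main(args: Array[String]): Unit = {
>     // getResourceAsStream returns null when the resource is absent from the
>     // classpath, e.g. when spark-version-info.properties was never generated.
>     val stream = Thread.currentThread().getContextClassLoader
>       .getResourceAsStream("spark-version-info.properties")
>     val props = new Properties()
>     // Properties.load(null) throws the NullPointerException seen in
>     // Properties$LineReader.readLine at the bottom of the trace above.
>     props.load(stream)
>   }
> }
> {code}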
> We need to provide a way to generate 'spark-version-info.properties' in a 
> Windows environment as well; a sketch of one possible approach follows.
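> One possible direction (a minimal sketch, not the actual fix; the object 
> name and the way git metadata is obtained are assumptions): generate the 
> file from a small JVM program that the build invokes on every platform, 
> removing the bash dependency entirely.
> {code:scala}
> import java.io.{File, FileOutputStream}
> import java.util.Properties
> 
> // Hypothetical cross-platform replacement for build/spark-build-info,
> // invoked as: GenerateBuildInfo <outputDir> <sparkVersion>
> object GenerateBuildInfo {
>   def main(args: Array[String]): Unit = {
>     val Array(outputDir, sparkVersion) = args
>     val props = new Properties()
>     // Mirror the fields the bash script writes; revision/branch/url would
>     // normally be queried from git, here they come from the environment.
>     props.setProperty("version", sparkVersion)
>     props.setProperty("user", System.getProperty("user.name"))
>     props.setProperty("revision", sys.env.getOrElse("GIT_REVISION", "unknown"))
>     props.setProperty("branch", sys.env.getOrElse("GIT_BRANCH", "unknown"))
>     props.setProperty("date", java.time.Instant.now().toString)
>     props.setProperty("url", sys.env.getOrElse("GIT_URL", "unknown"))
>     val outFile = new File(outputDir, "spark-version-info.properties")
>     val out = new FileOutputStream(outFile)
>     try props.store(out, "Spark build info") finally out.close()
>   }
> }
> {code}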



