[ https://issues.apache.org/jira/browse/SPARK-1920?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Thomas Graves resolved SPARK-1920.
----------------------------------
    Resolution: Duplicate

> Spark JAR compiled with Java 7 leads to PySpark not working in YARN
> -------------------------------------------------------------------
>
>                 Key: SPARK-1920
>                 URL: https://issues.apache.org/jira/browse/SPARK-1920
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, YARN
>    Affects Versions: 1.0.0
>            Reporter: Tathagata Das
>            Priority: Blocker
>
> The current (Spark 1.0) implementation of PySpark on YARN requires Python to be
> able to read the Spark assembly JAR. However, a Spark assembly JAR compiled with
> Java 7 can sometimes not be readable by Python. This is because JARs created by
> Java 7 with more than 2^16 files are encoded in Zip64, which Python cannot read.
> [SPARK-1911|https://issues.apache.org/jira/browse/SPARK-1911] warns users
> against using Java 7 when creating a Spark distribution.
> One way to fix this is to put pyspark in a separate, smaller JAR than the rest
> of Spark so that it is readable by Python.
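For reference, a minimal diagnostic sketch in Python can show whether a given assembly JAR is readable by the interpreter. The JAR path below is hypothetical, and the use of zipimport reflects the assumption that PySpark loads the pyspark package from a JAR placed on sys.path; that mechanism is not spelled out in the ticket itself.

    # Minimal sketch: probe whether Python can read a Spark assembly JAR.
    import zipfile
    import zipimport

    ASSEMBLY_JAR = "/path/to/spark-assembly-1.0.0-hadoop2.2.0.jar"  # hypothetical path

    # zipfile reports how many entries the archive holds; more than 2^16 - 1
    # entries forces the Zip64 format when the JAR is built with Java 7.
    with zipfile.ZipFile(ASSEMBLY_JAR) as jar:
        entries = len(jar.namelist())
        print("entries in JAR: %d (Zip64 likely if > %d)" % (entries, 2 ** 16 - 1))

    # zipimport is what the interpreter uses to import modules from an archive
    # on sys.path; if it rejects the JAR, importing pyspark from this JAR fails.
    try:
        zipimport.zipimporter(ASSEMBLY_JAR)
        print("zipimport can open the JAR; pyspark should be importable from it")
    except zipimport.ZipImportError as exc:
        print("zipimport cannot read the JAR: %s" % exc)

If the entry count is above 2^16 - 1 and the zipimport probe fails, splitting pyspark into its own smaller JAR, as proposed above, would keep that archive out of Zip64 territory.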