[ 
https://issues.apache.org/jira/browse/SPARK-984?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Josh Rosen updated SPARK-984:
-----------------------------
    Assignee:     (was: Josh Rosen)

> SPARK_TOOLS_JAR not set if multiple tools jars exists
> -----------------------------------------------------
>
>                 Key: SPARK-984
>                 URL: https://issues.apache.org/jira/browse/SPARK-984
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 0.8.1, 0.9.0
>            Reporter: Aaron Davidson
>            Priority: Minor
>
> If you have multiple tools assembly jars (e.g., because you previously 
> assembled against both 0.8.1 and 0.9.0), spark-class throws this error:
> {noformat}./spark-class: line 115: [: 
> /home/aaron/spark/tools/target/scala-2.9.3/spark-tools-assembly-0.8.1-incubating-SNAPSHOT.jar:
>  binary operator expected{noformat}
> This is caused by a flaw in the bash script:
> {noformat}if [ -e 
> "$TOOLS_DIR"/target/scala-$SCALA_VERSION/*assembly*[0-9Tg].jar ]; 
> then{noformat}
> When the glob matches more than one file, it expands to multiple words 
> inside the [ ... ] test, which bash rejects with "binary operator expected".
> The error is non-fatal, but it is a nuisance and presumably breaks whatever 
> SPARK_TOOLS_JAR is used for.
> We currently error out if multiple Spark assemblies are found, so we could do 
> something similar for tools assemblies. The only issue is that the user would 
> then always have to work through both errors (clean the assembly/ jars, then 
> the tools/ jars), even though the tools/ jar does not appear to be needed for 
> normal operation. A second possibility is to infer the correct tools jar from 
> the single available assembly jar, but this is slightly complicated by the 
> code path taken when $FWDIR/RELEASE exists.
> Since I'm not 100% sure what SPARK_TOOLS_JAR is even for, I'm assigning this 
> to Josh, who wrote the code initially.
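The failure mode and one possible guard can be sketched in bash. This is a
sketch only, not the actual spark-class code: the demo directory, file names,
and variable names are invented for illustration, and the multiple-jar branch
mirrors the existing multiple-Spark-assembly error rather than any committed
fix.

```shell
#!/usr/bin/env bash
# Set up a throwaway directory with two jars matching the tools glob,
# reproducing the "multiple tools assemblies" situation from the report.
demo_dir=$(mktemp -d)
mkdir -p "$demo_dir/target/scala-2.9.3"
touch "$demo_dir/target/scala-2.9.3/spark-tools-assembly-0.8.1.jar" \
      "$demo_dir/target/scala-2.9.3/spark-tools-assembly-0.9.0.jar"
cd "$demo_dir"

# Instead of `[ -e glob ]` (which breaks when the glob expands to two
# words), collect the matches into an array and branch on the count.
jars=(target/scala-2.9.3/*assembly*[0-9Tg].jar)
num_jars=${#jars[@]}

if [ "$num_jars" -gt 1 ]; then
  # Multiple matches: report them instead of producing a bash syntax error.
  echo "Found multiple tools assembly jars:" >&2
  printf '  %s\n' "${jars[@]}" >&2
elif [ -e "${jars[0]}" ]; then
  # Exactly one real match. The -e check matters because with no matches
  # the unexpanded glob pattern itself is left in the array.
  SPARK_TOOLS_JAR="${jars[0]}"
fi
```

With the two demo jars present, the script takes the multiple-match branch
and leaves SPARK_TOOLS_JAR unset, rather than emitting the
"binary operator expected" error.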



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
