[ https://issues.apache.org/jira/browse/SPARK-4831?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14242778#comment-14242778 ]

Sean Owen commented on SPARK-4831:
----------------------------------

Hm, so I made a quick test: I put a class {{Foo.class}} inside {{Foo.jar}} and 
then ran {{java -cp :otherstuff.jar Foo}}. It does not find the class, which 
suggests to me that the JVM does not interpret that empty entry as meaning 
"also pick up the JARs in the local directory".

It doesn't work even if I put "." on the classpath explicitly. That makes 
sense: a directory entry on the classpath only supplies loose class files, and 
the working directory in your case contains JARs, not classes.

However, it does find the class if I leave {{Foo.class}} in the working 
directory, *if* I have an empty entry in the classpath. Is it perhaps finding 
an exploded directory of classes? Otherwise, I can't reproduce this directly 
in Java, I suppose.
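For what it's worth, the string-level cause described in the report below can be reproduced in a plain POSIX shell. This is a sketch, not the actual script; the variable names are taken from the report, and the values are placeholders:

```shell
#!/bin/sh
# When SPARK_CLASSPATH is empty, the naive colon-join produces a leading
# colon, i.e. an empty first classpath entry. The JVM resolves an empty
# entry as the current working directory, treated as a directory of loose
# class files (it does NOT scan the JARs sitting in that directory).
SPARK_CLASSPATH=""
SPARK_SUBMIT_CLASSPATH="otherstuff.jar"
CLASSPATH="$SPARK_CLASSPATH:$SPARK_SUBMIT_CLASSPATH"
echo "resulting classpath: '$CLASSPATH'"
# java -cp "$CLASSPATH" Foo  # would see ./Foo.class, but not a class inside ./Foo.jar
```

So with SPARK_CLASSPATH unset, a Foo.class left in the working directory is picked up via that empty entry, which matches the observation above.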

> Current directory always on classpath with spark-submit
> -------------------------------------------------------
>
>                 Key: SPARK-4831
>                 URL: https://issues.apache.org/jira/browse/SPARK-4831
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.1.1, 1.2.0
>            Reporter: Daniel Darabos
>            Priority: Minor
>
> We had a situation where we were launching an application with spark-submit, 
> and a file (play.plugins) was on the classpath twice, causing problems 
> (trying to register plugins twice). Upon investigating how it got on the 
> classpath twice, we found that it was present in one of our jars, and also in 
> the current working directory. But the one in the current working directory 
> should not be on the classpath. We never asked spark-submit to put the 
> current directory on the classpath.
> I think this is caused by a line in 
> [compute-classpath.sh|https://github.com/apache/spark/blob/v1.2.0-rc2/bin/compute-classpath.sh#L28]:
> {code}
> CLASSPATH="$SPARK_CLASSPATH:$SPARK_SUBMIT_CLASSPATH"
> {code}
> Now if SPARK_CLASSPATH is empty, an empty entry is added to the classpath, 
> and the JVM treats an empty classpath entry as the current working directory.
> We tried setting SPARK_CLASSPATH to a bogus value, but that is [not 
> allowed|https://github.com/apache/spark/blob/v1.2.0-rc2/core/src/main/scala/org/apache/spark/SparkConf.scala#L312].
> What is the right solution? Only add SPARK_CLASSPATH when it is non-empty? I 
> can send a pull request for that, I think. Thanks!
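The non-empty guard suggested in the report could look something like the following. This is a sketch with placeholder values, not the actual patch to compute-classpath.sh:

```shell
#!/bin/sh
# Sketch of the proposed fix: only prepend SPARK_CLASSPATH when it is
# non-empty, so an unset value never leaves an empty (current-directory)
# entry in the joined classpath.
SPARK_CLASSPATH=""                        # e.g. left unset by the user
SPARK_SUBMIT_CLASSPATH="app.jar:lib.jar"  # placeholder value
CLASSPATH="$SPARK_SUBMIT_CLASSPATH"
if [ -n "$SPARK_CLASSPATH" ]; then
  CLASSPATH="$SPARK_CLASSPATH:$CLASSPATH"
fi
echo "$CLASSPATH"
```

With SPARK_CLASSPATH empty the result is just "app.jar:lib.jar", with no leading colon and therefore no implicit current-directory entry.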



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)