[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-13 Thread djvulee
Github user djvulee commented on the issue:

https://github.com/apache/spark/pull/16210
  
Yes, this PR does not handle the problem well. I will close it and update the JIRA.





[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-13 Thread srowen
Github user srowen commented on the issue:

https://github.com/apache/spark/pull/16210
  
OK. This PR doesn't move the handling / get rid of SPARK_SUBMIT_OPTS. 
Should this then be closed and the JIRA updated?





[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-12 Thread vanzin
Github user vanzin commented on the issue:

https://github.com/apache/spark/pull/16210
  
The bug was filed as an improvement, and I think this is fine as an 
improvement. (By this I mean the suggestion of moving the handling of that 
option to Java code.)

That way we can get rid of `SPARK_SUBMIT_OPTS`, which is undocumented and used 
only by the Spark shell scripts. Users should be using 
`spark.driver.extraJavaOptions` and friends.





[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-12 Thread djvulee
Github user djvulee commented on the issue:

https://github.com/apache/spark/pull/16210
  
@srowen Sorry for the late reply, and thanks for trying to reproduce it!

As I mentioned in my last reply, this is not an environment problem but a 
misunderstanding of SPARK_SUBMIT_OPTS on our side, or a deployment problem. It 
works for everyone else because few people use the ```SPARK_SUBMIT_OPTS``` 
option, and they do not put ```SPARK_SUBMIT_OPTS``` in the spark-env.sh file.

It may be better to separate ```-Dscala.usejavacp=true``` from 
```SPARK_SUBMIT_OPTS``` in the spark-shell script to avoid the misunderstanding.
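
To make the failure mode concrete, here is a minimal sketch of what is being 
described, assuming conf/spark-env.sh is sourced after bin/spark-shell has 
already set the variable (which is how the launch scripts are ordered) and 
assuming it assigns rather than appends; the `-Xmx4g` value is only a placeholder:

```
# bin/spark-shell appends the REPL flag before invoking spark-submit:
SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Dscala.usejavacp=true"

# If conf/spark-env.sh (sourced later in the launch chain) assigns the
# variable instead of appending to it, the REPL flag is silently dropped:
SPARK_SUBMIT_OPTS="-Xmx4g"                        # loses -Dscala.usejavacp=true

# Appending in spark-env.sh preserves it:
SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Xmx4g"     # keeps the REPL flag
```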







[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-10 Thread srowen
Github user srowen commented on the issue:

https://github.com/apache/spark/pull/16210
  
I'm still not clear how to reproduce this. I can't repro with Java 8, 
Debian, on 2.x. Is this really necessary, or is it somehow band-aiding some 
other deployment problem? That is, why does this work for anyone at all?





[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-08 Thread djvulee
Github user djvulee commented on the issue:

https://github.com/apache/spark/pull/16210
  
I found the reason: because we pass some SPARK_SUBMIT_OPTS defined by 
ourselves, it seems that Spark only parses the options we defined and ignores 
the ```-Dscala.usejavacp=true```.

Since we want users to be able to use `SPARK_SUBMIT_OPTS`, the best way is to 
separate ```-Dscala.usejavacp=true``` from SPARK_SUBMIT_OPTS; moving it into 
SparkSubmitCommandBuilder may be a good idea, as suggested by @vanzin.
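
A very rough sketch of the script-level "separation", purely for illustration; 
the variable name `SPARK_REPL_OPTS` is hypothetical and does not exist in 
Spark, and the launcher would still have to append it to the java command itself:

```
# bin/spark-shell (hypothetical): keep the REPL flag in its own variable
# instead of mixing it into the user-facing SPARK_SUBMIT_OPTS.
SPARK_REPL_OPTS="-Dscala.usejavacp=true"   # hypothetical variable, illustration only
export SPARK_REPL_OPTS

# A user-defined SPARK_SUBMIT_OPTS (e.g. in conf/spark-env.sh) then cannot
# overwrite the REPL flag, because the two are never merged in the scripts.
```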









[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-08 Thread djvulee
Github user djvulee commented on the issue:

https://github.com/apache/spark/pull/16210
  
@jodersky Yes. I tried different ways; here are the results:

```
SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Dscala.usejavacp=true -usejavacp"
```

and

```
SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Dscala.usejavacp=true -Dusejavacp"
```

will output
```
Exception in thread "main" java.lang.AssertionError: assertion failed: null
at scala.Predef$.assert(Predef.scala:179)
at 
org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:247)
at 
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
at 
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at 
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at 
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at 
org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
```

```
SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -usejavacp"
```
will output:
```
Unrecognized option: -usejavacp
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
```







[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-08 Thread jodersky
Github user jodersky commented on the issue:

https://github.com/apache/spark/pull/16210
  
+1 to implementing it in the command builder. However, I'm not sure whether the 
command-line option `-usejavacp` or the property `scala.usejavacp` is the 
suggested way to add the Java class path to the REPL. I skimmed through the 
Scala compiler's code and it seems the two do slightly different things. I'll 
check this out in further detail.
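
For reference, the two mechanisms reach the REPL through different channels; a 
quick illustration using a plain Scala 2.x installation (outside of Spark), 
which also shows why passing `-usejavacp` straight to `java` fails with 
"Unrecognized option" as reported above:

```
# System property: handed to the JVM, read by the REPL/runner at startup.
scala -Dscala.usejavacp=true

# Runner/REPL flag: parsed by the scala launcher itself, never by the JVM.
scala -usejavacp

# Passing the runner flag directly to the JVM is what produces the error above:
java -usejavacp    # => Unrecognized option: -usejavacp
```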





[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-08 Thread vanzin
Github user vanzin commented on the issue:

https://github.com/apache/spark/pull/16210
  
I wonder if it wouldn't be better to avoid this by moving the logic to add 
this parameter to `SparkSubmitCommandBuilder`. As in, if the class is the 
REPL's main class, add that option to the JVM.

(That also avoids the duplicate logic in Unix and Windows scripts.)
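
As a rough illustration of the intended effect (not the builder code itself), 
the launcher would always emit the flag for the shell, regardless of what 
SPARK_SUBMIT_OPTS contains; assuming a simplified command line, something like:

```
# Hypothetical final java command line produced by the launcher when the
# main class is the REPL's: the flag is appended by SparkSubmitCommandBuilder
# itself, so a user-defined SPARK_SUBMIT_OPTS can no longer drop it.
java $SPARK_SUBMIT_OPTS -Dscala.usejavacp=true -cp "$LAUNCH_CLASSPATH" \
  org.apache.spark.deploy.SparkSubmit \
  --class org.apache.spark.repl.Main --name "Spark shell" "$@"
```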





[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-08 Thread jodersky
Github user jodersky commented on the issue:

https://github.com/apache/spark/pull/16210
  
The command-line parameter `-Dscala.usejavacp` is passed directly to the JVM 
and should not be OS-dependent. I'm wondering whether the issue arises from the 
way the parameter is defined in the spark-shell launcher scripts. Debian Etch 
is very old and probably has an older version of bash.





[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-07 Thread djvulee
Github user djvulee commented on the issue:

https://github.com/apache/spark/pull/16210
  
@rxin Our JDK is jdk1.8.0_91, we did not install Scala, and the OS is 
Debian 4.6.4.





[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-07 Thread srowen
Github user srowen commented on the issue:

https://github.com/apache/spark/pull/16210
  
Paging @jodersky 
I'm not sure about this ... isn't `-usejavacp` a legacy option? I don't 
know of any other reports of this so it is possibly specific to your env.





[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-07 Thread rxin
Github user rxin commented on the issue:

https://github.com/apache/spark/pull/16210
  
What are the environments?






[GitHub] spark issue #16210: [Core][SPARK-18778]Fix the scala classpath under some en...

2016-12-07 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/16210
  
Can one of the admins verify this patch?

