[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-06-08 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/20809
  
Can one of the admins verify this patch?





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-05-21 Thread vanzin
Github user vanzin commented on the issue:

https://github.com/apache/spark/pull/20809
  
I don't understand your reply. The testing variables should only be set during 
*Spark* unit tests. You shouldn't be setting them in your tests, because you're 
not testing Spark.

If you are, you should fix your testing infrastructure so it doesn't do that.
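
(As a hypothetical illustration of that fix, not something from this thread: if a 
test harness forks processes that end up in the Spark launcher, it can scrub 
Spark's internal variables from the child environment first. The command line 
below is made up; only the variable names come from the launcher code quoted 
later in this thread.)

```
// Hypothetical sketch: keep Spark's internal dev/test variables out of
// any process that goes through the Spark launcher.
ProcessBuilder pb = new ProcessBuilder("bin/spark-submit", "--class", "com.example.App", "app.jar");
Map<String, String> env = pb.environment();
env.remove("SPARK_TESTING");          // meaningful only for Spark's own unit tests
env.remove("SPARK_PREPEND_CLASSES");  // meaningful only for Spark development builds
Process p = pb.start();
```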





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-05-20 Thread gczsjdy
Github user gczsjdy commented on the issue:

https://github.com/apache/spark/pull/20809
  
@vanzin Sorry for the late reply. According to the call stack, the call comes 
from the first place that calls `getScalaVersion`; `isTesting` is true, so we go 
into that path.
This happens on Travis.





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-05-13 Thread gczsjdy
Github user gczsjdy commented on the issue:

https://github.com/apache/spark/pull/20809
  
@vanzin Sorry, but I will update it next week. Thanks.





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-05-11 Thread vanzin
Github user vanzin commented on the issue:

https://github.com/apache/spark/pull/20809
  
Do you plan to update this PR? Otherwise it should be closed.





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-03-19 Thread vanzin
Github user vanzin commented on the issue:

https://github.com/apache/spark/pull/20809
  
> This is on Travis, and no SPARK_HOME is set.

That sounds a little odd. If that is true, then your proposed code wouldn't 
work either, since it requires SPARK_HOME to be known.

In any case, there are two calls to `getScalaVersion()`.

The first is:
```
boolean prependClasses = !isEmpty(getenv("SPARK_PREPEND_CLASSES"));
boolean isTesting = "1".equals(getenv("SPARK_TESTING"));
if (prependClasses || isTesting) {
  String scala = getScalaVersion();
```

And your code shouldn't be triggering that, since both env variables are 
for Spark development and other applications shouldn't be using them.

The second call comes a little later:
```
String jarsDir = findJarsDir(getSparkHome(), getScalaVersion(),
    !isTesting && !isTestingSql);
```

Here `getScalaVersion()` is only needed when running Spark from a git 
clone, not from the distribution package. So the right thing would be to move 
`getScalaVersion()` to `CommandBuilderUtils`, and call it from `findJarsDir` 
only if needed.
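
Something like this, roughly (an untested sketch of the suggestion, not an 
actual patch; it assumes `getScalaVersion` is moved into `CommandBuilderUtils` 
and reworked to take the Spark home explicitly):

```
// In CommandBuilderUtils (sketch):
static String findJarsDir(String sparkHome, boolean failIfNotFound) {
  // A distribution package ships its jars under $SPARK_HOME/jars.
  File libdir = new File(sparkHome, "jars");
  if (!libdir.isDirectory()) {
    // Only a git clone falls through to here, so only now do we need to
    // detect the Scala version from the build output directories.
    libdir = new File(sparkHome,
      String.format("assembly/target/scala-%s/jars", getScalaVersion(sparkHome)));
    if (!libdir.isDirectory()) {
      checkState(!failIfNotFound, "Library directory '%s' does not exist.",
        libdir.getAbsolutePath());
      return null;
    }
  }
  return libdir.getAbsolutePath();
}
```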





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-03-19 Thread gczsjdy
Github user gczsjdy commented on the issue:

https://github.com/apache/spark/pull/20809
  
@vanzin Thanks. : )
I am testing [OAP](https://github.com/Intel-bigdata/OAP) with a pre-built Spark 
on `LocalClusterMode`.
This is on Travis, and no SPARK_HOME is set.
The `mvn test` command produces this error:

```
23:49:56.997 ERROR org.apache.spark.deploy.worker.ExecutorRunner: Error running executor
java.lang.IllegalStateException: Cannot find any build directories.
  at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
  at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:241)
  at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:147)
  at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:118)
  at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
  at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:47)
  at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:63)
  at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:51)
  at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:145)
  at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:73)
```
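
(For context: the failing check is in `getScalaVersion`. When 
`SPARK_SCALA_VERSION` is not set, it probes the per-Scala-version build 
directories that only exist in a git clone, and a pre-built Spark without a 
proper SPARK_HOME has none of them. A paraphrased, approximate sketch of that 
logic in the Spark 2.3-era launcher:)

```
// Paraphrased sketch of AbstractCommandBuilder.getScalaVersion(), not verbatim:
String getScalaVersion() {
  String scala = getenv("SPARK_SCALA_VERSION");
  if (scala != null) {
    return scala;  // explicit override, no probing needed
  }
  // Probe build directories that only a git clone has; a pre-built
  // package has neither, hence "Cannot find any build directories."
  String sparkHome = getSparkHome();
  File scala212 = new File(sparkHome, "launcher/target/scala-2.12");
  File scala211 = new File(sparkHome, "launcher/target/scala-2.11");
  if (scala212.isDirectory()) {
    return "2.12";
  }
  checkState(scala211.isDirectory(), "Cannot find any build directories.");
  return "2.11";
}
```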





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-03-16 Thread vanzin
Github user vanzin commented on the issue:

https://github.com/apache/spark/pull/20809
  
Can you provide more information in the bug report? E.g., a sample application 
and a sample error.

I don't think this is the correct change, but without your use case I'm not 
sure what the right change would be.





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-03-16 Thread gczsjdy
Github user gczsjdy commented on the issue:

https://github.com/apache/spark/pull/20809
  
@viirya Yes, but that only works for people who are going to dig into the Spark 
code, and it requires manual effort. Isn't it better if we determine this 
automatically?





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-03-16 Thread viirya
Github user viirya commented on the issue:

https://github.com/apache/spark/pull/20809
  
For this case, shouldn't we just set `SPARK_SCALA_VERSION`?





[GitHub] spark issue #20809: [SPARK-23667][CORE] Better scala version check

2018-03-15 Thread gczsjdy
Github user gczsjdy commented on the issue:

https://github.com/apache/spark/pull/20809
  
cc @cloud-fan @viirya 

