Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/20809
Can one of the admins verify this patch?
---
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/20809
I don't understand your reply. The testing stuff should only be true during
*Spark* unit tests. You shouldn't be setting that in your tests because you're
not testing Spark.
If you are, you
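For context, here is a minimal sketch of the kind of test-mode gate vanzin is describing: a flag that is only flipped on inside Spark's own test harness, which downstream projects should never set. The names `SPARK_TESTING` and `spark.testing` follow Spark's convention, but treat the exact lookup below as an assumption for illustration, not the launcher's actual code.

```java
// Hypothetical sketch of a "running under Spark's own tests" check.
// Downstream projects (like the one in this thread) should leave both
// the env var and the system property unset.
public class TestModeCheck {
  static boolean isTesting(java.util.Map<String, String> env) {
    // Convention: set only by Spark's own build/test scripts.
    return env.containsKey("SPARK_TESTING")
        || System.getProperty("spark.testing") != null;
  }

  public static void main(String[] args) {
    System.out.println(isTesting(System.getenv()));
  }
}
```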
Github user gczsjdy commented on the issue:
https://github.com/apache/spark/pull/20809
@vanzin Sorry for the late reply. According to the call stack, this is the
first place that calls `getScalaVersion`; `isTest` is true, so we go down that
path.
This happens on Travis.
---
Github user gczsjdy commented on the issue:
https://github.com/apache/spark/pull/20809
@vanzin Sorry, I will update it next week. Thanks.
---
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/20809
Do you plan to update this PR? Otherwise it should be closed.
---
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/20809
> This is on travis and no SPARK_HOME is set.
That sounds a little odd. If that is true, then your proposed code wouldn't
work either, since it requires SPARK_HOME to be known.
In
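vanzin's point — that the proposed code still depends on SPARK_HOME — can be illustrated with a simplified sketch of the directory-based lookup being discussed. The `assembly/target/scala-<version>` layout reflects Spark's standard build tree, but this is a hypothetical reconstruction for illustration, not the actual launcher source.

```java
import java.io.File;

// Hypothetical sketch: infer the Scala version from the build layout
// under SPARK_HOME. If SPARK_HOME is unset (as reported on Travis in
// this thread), there is simply nothing to inspect.
public class ScalaVersionGuess {
  static String guessScalaVersion(String sparkHome) {
    if (sparkHome == null) {
      return null; // no SPARK_HOME, no directory to look at
    }
    for (String v : new String[] {"2.12", "2.11", "2.10"}) {
      File dir = new File(sparkHome, "assembly/target/scala-" + v);
      if (dir.isDirectory()) {
        return v;
      }
    }
    return null;
  }

  public static void main(String[] args) {
    System.out.println(guessScalaVersion(System.getenv("SPARK_HOME")));
  }
}
```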
Github user gczsjdy commented on the issue:
https://github.com/apache/spark/pull/20809
@vanzin Thanks. : )
I am testing [OAP](https://github.com/Intel-bigdata/OAP) against a
pre-built Spark in `LocalClusterMode`.
This is on Travis and no SPARK_HOME is set.
The `mvn test`
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/20809
Can you provide more information in the bug report, e.g. a sample
application and the error you see?
I don't think this is the correct change, but without your use case I'm not
sure what the
Github user gczsjdy commented on the issue:
https://github.com/apache/spark/pull/20809
@viirya Yes, but that only works for people who are investigating the Spark
code, and it requires manual effort. Isn't it better if we get this
automatically?
---
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/20809
For the case, shouldn't we just set `SPARK_SCALA_VERSION`?
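viirya's suggestion works because an explicit `SPARK_SCALA_VERSION` takes priority, so an environment without a usable SPARK_HOME (e.g. a CI box running against pre-built Spark) can simply export it. A simplified sketch of that resolution order, assuming the env-first precedence described in this thread (the `"2.11"` fallback is an illustration, not Spark's actual default logic):

```java
// Hypothetical sketch: an explicit SPARK_SCALA_VERSION wins; only when
// it is absent would the launcher have to fall back to inspecting
// SPARK_HOME, which is exactly what fails when SPARK_HOME is unset.
public class ScalaVersionResolver {
  static String resolve(java.util.Map<String, String> env) {
    String explicit = env.get("SPARK_SCALA_VERSION");
    if (explicit != null && !explicit.isEmpty()) {
      return explicit; // manual override, no SPARK_HOME needed
    }
    return "2.11"; // placeholder fallback for illustration only
  }

  public static void main(String[] args) {
    System.out.println(resolve(System.getenv()));
  }
}
```

On a CI machine this amounts to exporting `SPARK_SCALA_VERSION` before running the test suite, which sidesteps the SPARK_HOME lookup entirely.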
---
Github user gczsjdy commented on the issue:
https://github.com/apache/spark/pull/20809
cc @cloud-fan @viirya
---