Github user hvanhovell commented on a diff in the pull request:
https://github.com/apache/spark/pull/16168#discussion_r91065113
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala ---
@@ -126,6 +146,55 @@ private[hive] class
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16099
Merged build finished. Test PASSed.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16099
**[Test build #69727 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69727/consoleFull)**
for PR 16099 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16099
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/69727/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16170
Merged build finished. Test PASSed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16170
**[Test build #69726 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69726/consoleFull)**
for PR 16170 at commit
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/16170
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16170
**[Test build #69726 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69726/consoleFull)**
for PR 16170 at commit
Github user hvanhovell commented on the issue:
https://github.com/apache/spark/pull/16170
cc @cloud-fan @viirya
---
GitHub user hvanhovell opened a pull request:
https://github.com/apache/spark/pull/16170
[SPARK-18634][SQL][TRIVIAL] Touch-up Generate
## What changes were proposed in this pull request?
I jumped the gun on merging https://github.com/apache/spark/pull/16120, and
missed a tiny
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/13909
**[Test build #69728 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69728/consoleFull)**
for PR 13909 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16169
Merged build finished. Test PASSed.
---
GitHub user zhengruifeng opened a pull request:
https://github.com/apache/spark/pull/16171
[SPARK-18739][ML][PYSPARK] Models in pyspark.classification support
setXXXCol methods
## What changes were proposed in this pull request?
1. add `setFeaturesCol` and `setPredictionCol` in
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16168
**[Test build #69730 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69730/consoleFull)**
for PR 16168 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16170
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/69726/
Test PASSed.
---
Github user hvanhovell commented on the issue:
https://github.com/apache/spark/pull/16170
Thanks for the review. Merging this one.
---
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/16170
@hvanhovell Thanks for helping fix what I missed.
---
Github user sethah commented on a diff in the pull request:
https://github.com/apache/spark/pull/16037#discussion_r91061168
--- Diff:
mllib/src/main/scala/org/apache/spark/mllib/optimization/LBFGS.scala ---
@@ -241,16 +241,27 @@ object LBFGS extends Logging {
val bcW =
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16169
**[Test build #69725 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69725/consoleFull)**
for PR 16169 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16169
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/69725/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16168
Merged build finished. Test FAILed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16138
**[Test build #69737 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69737/consoleFull)**
for PR 16138 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16092
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/69735/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16174
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/69736/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16138
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/69739/
Test FAILed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16138
Merged build finished. Test FAILed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16138
**[Test build #69739 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69739/consoleFull)**
for PR 16138 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16175
**[Test build #3469 has
started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3469/consoleFull)**
for PR 16175 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16138
**[Test build #69747 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69747/consoleFull)**
for PR 16138 at commit
Github user anabranch commented on the issue:
https://github.com/apache/spark/pull/16138
Now that my outputs are correct (in format), there's a new problem. The
types are *still* wrong.
```
scala> /// DETAILS
scala> // Schema
scala>
```
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/16092#discussion_r91180762
--- Diff: dev/sparktestsupport/modules.py ---
@@ -469,7 +469,7 @@ def __hash__(self):
name="yarn",
dependencies=[],
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/16171
Hm, isn't this the third or fourth PR adding these setters to ML classes?
As I said on the last one, we do not want to keep opening different issues to
address quite logically related concerns. I
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16149
**[Test build #3470 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3470/consoleFull)**
for PR 16149 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16176
**[Test build #69742 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69742/consoleFull)**
for PR 16176 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16176
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/69742/
Test PASSed.
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/16082
Merged to master, and 2.1 as it's relevant to packaging of the release
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16176
Merged build finished. Test PASSed.
---
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16149#discussion_r91168760
--- Diff:
mllib/src/main/scala/org/apache/spark/ml/regression/GeneralizedLinearRegression.scala
---
@@ -479,7 +479,12 @@ object GeneralizedLinearRegression
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16149
**[Test build #3470 has
started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3470/consoleFull)**
for PR 16149 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16128
**[Test build #69740 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69740/consoleFull)**
for PR 16128 at commit
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16137#discussion_r91169287
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -670,10 +677,14 @@ class SparkContext(config: SparkConf) extends Logging
{
*
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16137#discussion_r91169111
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -64,11 +64,11 @@ import org.apache.spark.util._
* Main entry point for Spark
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16137#discussion_r91170321
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -953,24 +977,24 @@ class SparkContext(config: SparkConf) extends Logging
{
}
GitHub user aray opened a pull request:
https://github.com/apache/spark/pull/16177
[SPARK-17760][SQL] AnalysisException with dataframe pivot when groupBy
column is not attribute
## What changes were proposed in this pull request?
Fixes AnalysisException for pivot queries
Github user JoshRosen commented on the issue:
https://github.com/apache/spark/pull/15923
This looks good overall, but one nit: it looks like we don't have any test
coverage for the case where `detectCorrupt` is false. We should probably add a
test to make sure that the feature flag
Github user falaki commented on a diff in the pull request:
https://github.com/apache/spark/pull/16154#discussion_r91169543
--- Diff: core/src/main/scala/org/apache/spark/api/r/RBackendHandler.scala
---
@@ -143,12 +142,8 @@ private[r] class RBackendHandler(server: RBackend)
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16177
**[Test build #69743 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69743/consoleFull)**
for PR 16177 at commit
Github user anabranch commented on a diff in the pull request:
https://github.com/apache/spark/pull/16138#discussion_r91172169
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
---
@@ -1047,6 +1047,53 @@ case class
Github user pwoody commented on the issue:
https://github.com/apache/spark/pull/15614
@JoshRosen unfortunately the other PR got closed. Thoughts on this
independently?
---
Github user anabranch commented on a diff in the pull request:
https://github.com/apache/spark/pull/16138#discussion_r91173386
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
---
@@ -1047,6 +1047,53 @@ case class
Github user falaki commented on a diff in the pull request:
https://github.com/apache/spark/pull/16154#discussion_r91173424
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -272,18 +282,22 @@ private[spark] object SerDe {
}
}
-
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16161
Merged build finished. Test PASSed.
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/16173
You certainly have a point there, but I also don't know that there's
actually an intent to do the work to truly support and maintain an SPI for
broadcasts.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/16109#discussion_r91176023
--- Diff:
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/CachedKafkaConsumer.scala
---
@@ -190,10 +194,31 @@ private[kafka010] case
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/16167
---
Github user anabranch commented on the issue:
https://github.com/apache/spark/pull/16138
And I found the error: I shouldn't be overriding the `DataType`.
---
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/16168
@jiangxb1987 - please take a look at my comment on the jira ticket:
https://issues.apache.org/jira/browse/SPARK-18209
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16128
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/69740/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16128
Merged build finished. Test PASSed.
---
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16137#discussion_r91170210
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -901,6 +926,7 @@ class SparkContext(config: SparkConf) extends Logging {
*
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16137#discussion_r91169186
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -358,9 +361,11 @@ class SparkContext(config: SparkConf) extends Logging {
}
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/16128
LGTM. Thanks. Merging to master and 2.1.
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/16128
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/15736
It does seem like nice cleanup in any event. I am not sure why the first
commit was faster as this seems like a 'superset' of optimization. We can't use
that one in any event. If you want to update
Github user falaki commented on a diff in the pull request:
https://github.com/apache/spark/pull/16154#discussion_r91171383
--- Diff: core/src/main/scala/org/apache/spark/api/r/JVMObjectTracker.scala
---
@@ -0,0 +1,65 @@
+/*
+ * Licensed to the Apache Software Foundation
Github user falaki commented on a diff in the pull request:
https://github.com/apache/spark/pull/16154#discussion_r91170861
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -272,18 +282,22 @@ private[spark] object SerDe {
}
}
-
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/16103
Merged to master
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16068
**[Test build #69744 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69744/consoleFull)**
for PR 16068 at commit
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/16103
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16161
**[Test build #69741 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69741/consoleFull)**
for PR 16161 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/16161
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/69741/
Test PASSed.
---
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/16165
ok to test
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16165
**[Test build #69745 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69745/consoleFull)**
for PR 16165 at commit
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/16167
Thanks! Merging to master.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16109
**[Test build #69746 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69746/consoleFull)**
for PR 16109 at commit
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/16178
Fix deadlock when SparkContext.stop is called in Utils.tryOrStopSparkContext
## What changes were proposed in this pull request?
When `SparkContext.stop` is called in
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/16178
cc @rxin
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16178
**[Test build #69750 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69750/consoleFull)**
for PR 16178 at commit
Github user weiqingy commented on the issue:
https://github.com/apache/spark/pull/16176
@gatorsmile Could you please review this PR? Thanks.
---
Github user marmbrus commented on a diff in the pull request:
https://github.com/apache/spark/pull/16178#discussion_r91190778
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1760,25 +1760,24 @@ class SparkContext(config: SparkConf) extends
Logging {
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/16178#discussion_r91191951
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1760,25 +1760,24 @@ class SparkContext(config: SparkConf) extends
Logging {
Github user aray commented on the issue:
https://github.com/apache/spark/pull/16161
Right now it's not supported to have the following:
```
case class Foo(a: Map[Int, Int])
```
(using the scala Predef version of Map)
The
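To make the point about `Map` spellings concrete, here is a minimal, self-contained sketch (hypothetical, not from the thread; the `MapAliasDemo` name is invented for illustration). It shows that `scala.Predef.Map` is only a type alias for `scala.collection.immutable.Map`, which is why implicit encoder resolution has to treat the two spellings as the same type.

```scala
// Hypothetical illustration: Predef.Map and immutable.Map are one type.
object MapAliasDemo {
  // A value typed with the Predef alias...
  val viaPredef: Map[Int, Int] = Map(1 -> 2)

  // ...is assignable as-is to the fully qualified immutable Map type,
  // with no conversion: the alias and the target are the same class.
  val viaImmutable: scala.collection.immutable.Map[Int, Int] = viaPredef

  def main(args: Array[String]): Unit =
    println(viaImmutable(1)) // looks up key 1 in the shared map
}
```

Because the alias erases to the same runtime class, any implicit `Encoder[Map[K, V]]` scoped to one spelling necessarily covers the other.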
Github user hvanhovell closed the pull request at:
https://github.com/apache/spark/pull/16174
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/16178#discussion_r91193208
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1760,25 +1760,24 @@ class SparkContext(config: SparkConf) extends
Logging {
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16178
**[Test build #69751 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69751/consoleFull)**
for PR 16178 at commit
Github user tnachen commented on the issue:
https://github.com/apache/spark/pull/15684
LGTM, @srowen can you help on this?
---
Github user tnachen commented on the issue:
https://github.com/apache/spark/pull/14936
@philipphoffmann Sorry for the long delay, one last ask. Can you add a
simple unit test to verify it works?
---
Github user gatorsmile commented on the issue:
https://github.com/apache/spark/pull/16176
cc @cloud-fan
---
Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/16176#discussion_r91194841
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLImplicits.scala
---
@@ -74,6 +74,9 @@ abstract class SQLImplicits {
/** @since 1.6.0 */
Github user tnachen commented on a diff in the pull request:
https://github.com/apache/spark/pull/12933#discussion_r91195201
--- Diff:
mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala
---
@@ -50,6 +50,44 @@ trait MesosSchedulerUtils extends
Github user tnachen commented on the issue:
https://github.com/apache/spark/pull/12933
@srowen can you help review this? Besides my minor comment, overall it looks
fine to me.
---
Github user tnachen commented on the issue:
https://github.com/apache/spark/pull/13077
@devaraj-kavali let us know if you can still update this, otherwise I'll
close this as it's no longer being updated.
---
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/16014#discussion_r91196396
--- Diff: dev/create-release/release-build.sh ---
@@ -221,14 +235,13 @@ if [[ "$1" == "package" ]]; then
# We increment the Zinc port each
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16175
**[Test build #3469 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3469/consoleFull)**
for PR 16175 at commit
Github user gatorsmile commented on the issue:
https://github.com/apache/spark/pull/16175
cc @cloud-fan
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/16068
**[Test build #69744 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/69744/consoleFull)**
for PR 16068 at commit