Github user tdas commented on the issue:
https://github.com/apache/spark/pull/14041
Since the build passed, merging this to master and 2.0.
---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well.
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/14041
**[Test build #3162 has finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3162/consoleFull)** for PR 14041 at commit
Github user tdas commented on the issue:
https://github.com/apache/spark/pull/14041
yeah. @zsxwing is looking into it.
Github user koeninger commented on the issue:
https://github.com/apache/spark/pull/14041
From this + looking at jenkins, it seems like master is broken
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/14041
**[Test build #3162 has started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3162/consoleFull)** for PR 14041 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/14041
Merged build finished. Test FAILed.
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/14041
**[Test build #61775 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/61775/consoleFull)** for PR 14041 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/14041
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/61775/
Test FAILed.
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/14041
**[Test build #61775 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/61775/consoleFull)** for PR 14041 at commit
Github user koeninger commented on the issue:
https://github.com/apache/spark/pull/14041
That build error looks Hive-related... I can try merging latest master.
@tdas yes, the reason unidoc is failing is that it's throwing all
dependencies from all subprojects into one
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/14041
**[Test build #3161 has finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3161/consoleFull)** for PR 14041 at commit
Github user tdas commented on the issue:
https://github.com/apache/spark/pull/14041
In any case, we have to release rc2 soon and that cannot be done with a
broken unidoc. And between 0.8 and 0.10, 0.8 is the higher priority for having
docs because it is a stable API. So LGTM for this PR.
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/14041
**[Test build #3161 has started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3161/consoleFull)** for PR 14041 at commit
Github user tdas commented on the issue:
https://github.com/apache/spark/pull/14041
Does anyone know why unidoc fails? Is it because unidoc is combining
both kafka 0.8 and 0.10 in the compile path and therefore causing problems?
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/14041
Same error, yes.
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14041
I just noticed that our nightly docs build has been failing with an error
related to kafka (Example [1]). Will this PR fix this, or should we open a new
JIRA for it?
[1]
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/14041
I suppose publishing one version's docs is better than none. Is it possible
to publish 0.10 only?
Github user koeninger commented on the issue:
https://github.com/apache/spark/pull/14041
Keeping the 0.10 classes might work if we want to skip publishing 0.8, but
trying to skip publishing 0.10 did not work until I modified the classpath
for the unidoc task. 0.8 would error,
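(For context, a minimal sketch of the kind of unidoc-classpath change described above, assuming an sbt 0.13-era build with the sbt-unidoc plugin, as Spark's `project/SparkBuild.scala` uses; the jar-name filter below is illustrative, not the exact filter from this PR:)

```scala
// Sketch (sbt 0.13 syntax, sbt-unidoc plugin): filter the Kafka 0.10
// client jars out of the aggregated scaladoc classpath, so the 0.8 and
// 0.10 client classes never share one compile path. The jar-name test
// is an illustrative assumption.
import sbtunidoc.Plugin._
import sbtunidoc.Plugin.UnidocKeys._

unidocAllClasspaths in (ScalaUnidoc, unidoc) := {
  // unidocAllClasspaths is a Seq[Classpath], one entry per subproject
  (unidocAllClasspaths in (ScalaUnidoc, unidoc)).value.map { classpath =>
    classpath.filterNot(_.data.getName.contains("kafka-clients-0.10"))
  }
}
```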
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/14041
Merged build finished. Test FAILed.
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/14041
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/61708/
Test FAILed.
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/14041
**[Test build #61708 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/61708/consoleFull)** for PR 14041 at commit
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/14041
It doesn't work to only keep the 0.10 classes? I was hoping they'd be a
superset of 0.8 for these purposes. Can we skip generating javadoc for one of
them rather than modify the classpath, if
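(The "skip generating javadoc for one of them" alternative could be sketched with sbt-unidoc's project filter; `streamingKafka010` below is an assumed project reference, not necessarily the name in Spark's build:)

```scala
// Sketch (sbt-unidoc): drop one Kafka connector subproject from unidoc
// aggregation entirely, so its sources are never fed to scaladoc.
// `streamingKafka010` is a hypothetical project reference.
import sbtunidoc.Plugin._
import sbtunidoc.Plugin.UnidocKeys._

unidocProjectFilter in (ScalaUnidoc, unidoc) :=
  inAnyProject -- inProjects(streamingKafka010)
```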
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/14041
**[Test build #61708 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/61708/consoleFull)** for PR 14041 at commit