Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/100
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37136247
Merged build finished.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37136248
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/13083/
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37134276
Merged build triggered.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37134277
Merged build started.
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37133631
Okay I'm going to merge this into master. Did some local testing and things
seem to be working.
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37105043
@mateiz this works fine in Java 8 unit tests.
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37087149
@koertkuipers so I looked at chill and they don't use ASM except inside of
the ClosureCleaner (which they actually borrowed from Spark). Since we don't
use chill's closur
Github user koertkuipers commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37086194
ah got it, thanks. so asm 3.x will be on the classpath whether we like it or
not. and we remove all other asm dependencies here, except for a kryo version.
will ch
Github user sryza commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37085469
That only works when getting Hadoop dependencies by packaging everything in
an uber jar. On yarn, for example, the Hadoop jars and dependencies are pulled
in by pointing to
Github user koertkuipers commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37085331
i thought the old rule (organization = "asm") would be sufficient to keep
that out? i am clearly missing something, sorry...
---
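The mismatch under discussion comes from ASM changing its Maven coordinates: ASM 3.x was published under the `asm` groupId, while ASM 4 and later moved to `org.ow2.asm`, so an exclusion keyed on the old organization silently stops matching. A minimal sbt sketch of the two rules (the dependency coordinates here are illustrative, not Spark's actual build file):

```scala
// Illustrative sbt fragment: an exclusion keyed on the old "asm" groupId
// matches ASM 3.x only; ASM 4.x ships under "org.ow2.asm" and needs its
// own rule.
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.2.0" excludeAll(
  ExclusionRule(organization = "asm"),        // catches ASM 3.x and earlier
  ExclusionRule(organization = "org.ow2.asm") // catches ASM 4.x and later
)
```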
Github user sryza commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37085083
Newer Hadoop versions pull in ASM 3.1 (through Jersey)
---
Github user koertkuipers commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37085029
"This made all of the old rules not work with newer Hadoop
versions that pull in new asm versions."
curious, what new asm versions were pulled in by newer
Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37084570
Will this also work on Java 8?
---
Github user sryza commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37082518
This patch looks good to me. ASM still needs to be excluded from chill,
right?
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37082365
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/13055/
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37082364
Merged build finished.
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37079425
Come to think of it, we may want to stop excluding asm now since we don't
directly use it anymore (therefore there can be no conflicts w/ Spark).
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37079388
Merged build triggered.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/100#issuecomment-37079389
Merged build started.
---
GitHub user pwendell opened a pull request:
https://github.com/apache/spark/pull/100
SPARK-782 Clean up for ASM dependency.
This makes two changes.
1) Spark uses the shaded version of asm that is (conveniently) published
with Kryo.
2) Existing exclude rules around
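Change (1) can be sketched as follows: instead of depending on ASM directly, code imports the relocated ASM classes that ship inside Kryo's ReflectASM jar, so no unshaded `org.objectweb.asm` classes reach the classpath. The shaded package name below is an assumption for illustration and should be checked against the jar actually on the classpath:

```scala
// Sketch: using the ASM classes relocated inside reflectasm-shaded rather
// than a direct asm dependency. The shaded package prefix is an assumption
// about how ReflectASM relocates ASM; verify against the actual jar.
import com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.{ClassVisitor, MethodVisitor, Opcodes}

// Minimal visitor that records the names of methods seen in a class file.
class MethodNameCollector extends ClassVisitor(Opcodes.ASM4) {
  val names = scala.collection.mutable.ArrayBuffer.empty[String]

  override def visitMethod(access: Int, name: String, desc: String,
                           sig: String, exceptions: Array[String]): MethodVisitor = {
    names += name
    null // no need to descend into method bodies
  }
}
```

Because the shaded classes live under Kryo's own namespace, they cannot conflict with whatever ASM version Hadoop or other dependencies bring in, which is what makes the exclude-rule cleanup in (2) possible.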