Github user MattWhelan commented on the pull request:
https://github.com/apache/spark/pull/4165#issuecomment-74587061
@srowen I forgot about this one :) This should be covered by @vanzin's
https://github.com/apache/spark/pull/3233
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user MattWhelan closed the pull request at:
https://github.com/apache/spark/pull/4165
---
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/4165#issuecomment-74586743
@MattWhelan Given that you closed https://github.com/apache/spark/pull/4166
is this one still live too? And is SPARK-5358 still open, or already resolved?
---
Github user MattWhelan commented on the pull request:
https://github.com/apache/spark/pull/4166#issuecomment-74583994
Hey, sorry I was away for a bit.
After an extended discussion with Vanzin on his
https://github.com/apache/spark/pull/3233, I'm convinced that PR covers the
Github user MattWhelan closed the pull request at:
https://github.com/apache/spark/pull/4166
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/4166#issuecomment-73424928
If you bring this one up to date we may be able to slip it into 1.3. Just
let me know, thanks!
---
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/4166#discussion_r24301030
--- Diff:
core/src/main/scala/org/apache/spark/executor/ExecutorURLClassLoader.scala ---
@@ -19,56 +19,51 @@ package org.apache.spark.executor
imp
Github user stephenh commented on the pull request:
https://github.com/apache/spark/pull/4166#issuecomment-72877246
@MattWhelan I believe you that loadClass is preferable to findClass, but
can you articulate why? And maybe as a comment on this SO post so that future
people that copy/p
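
A minimal probe can make the answer concrete. This is illustrative code, not from the PR: it shows that with the JDK's default parent-first `loadClass`, an overridden `findClass` is only consulted after the parent loader has already failed, so overriding `findClass` alone can never change the delegation order.

```java
// Hypothetical demo (not Spark code): the default parent-first loadClass
// never even reaches an overridden findClass for classes the parent can
// resolve, which is why changing delegation requires overriding loadClass.
import java.net.URL;
import java.net.URLClassLoader;
import java.util.concurrent.atomic.AtomicInteger;

public class FindClassProbe extends URLClassLoader {
    final AtomicInteger findClassCalls = new AtomicInteger();

    public FindClassProbe() {
        // No URLs of our own; parent is the application classloader.
        super(new URL[0], FindClassProbe.class.getClassLoader());
    }

    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        // Record that delegation actually fell through to this loader.
        findClassCalls.incrementAndGet();
        return super.findClass(name);
    }

    public static void main(String[] args) throws Exception {
        FindClassProbe probe = new FindClassProbe();
        probe.loadClass("java.util.ArrayList"); // parent resolves it first
        // findClass was never reached: the inherited loadClass asked the
        // parent before giving this loader a chance.
        System.out.println("findClass calls: " + probe.findClassCalls.get());
    }
}
```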
Github user MattWhelan commented on the pull request:
https://github.com/apache/spark/pull/4166#issuecomment-71115860
BTW, I spent a few minutes pondering the deadlock scenario, and the
delegation changes you see here. I'm pretty sure we're safe with 1.6-style
coarse locking, because
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/4166#issuecomment-71115098
Can one of the admins verify this patch?
---
GitHub user MattWhelan opened a pull request:
https://github.com/apache/spark/pull/4166
SPARK-5358: Rework the classloader implementation.
The fundamental issue is that you can't change the delegation scheme
without overriding loadClass (rather than findClass). And, if you
o
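
The point about the delegation scheme can be sketched roughly as follows. This is an illustrative child-first loader, not the PR's actual code, and it assumes Java 7's `getClassLoadingLock`: the parent delegation lives inside `loadClass`, so that is the method a custom scheme has to override.

```java
// Hypothetical sketch (not Spark's implementation): a child-first loader
// must override loadClass, because the default loadClass delegates to the
// parent *before* findClass is ever called.
import java.net.URL;
import java.net.URLClassLoader;

public class ChildFirstLoader extends URLClassLoader {
    public ChildFirstLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {   // Java 7+ per-name lock
            Class<?> c = findLoadedClass(name);      // already defined here?
            if (c == null) {
                try {
                    c = findClass(name);             // try our own URLs first
                } catch (ClassNotFoundException e) {
                    return super.loadClass(name, resolve); // fall back to parent
                }
            }
            if (resolve) {
                resolveClass(c);
            }
            return c;
        }
    }
}
```

With no URLs of its own, such a loader still resolves JDK classes correctly by falling back to the parent.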
Github user MattWhelan commented on the pull request:
https://github.com/apache/spark/pull/4165#issuecomment-71107518
@vanzin
There needs to be some locking. The 1.6 version of loadClass is simply a
synchronized method (on 'this'). That's a viable alternative, and the way I'll
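
The two locking disciplines being compared can be sketched like this (illustrative class names, assuming Java 7's `registerAsParallelCapable` and `getClassLoadingLock`; the 1.6 alternative is noted in a comment):

```java
// Sketch of the locking alternatives under discussion. Java 6's
// ClassLoader.loadClass is simply a synchronized method (locking `this`);
// a Java 7 parallel-capable loader locks per class name instead.
import java.net.URL;
import java.net.URLClassLoader;

public class PerNameLockLoader extends URLClassLoader {
    static {
        // Opts in to per-class-name locking. It takes no parameters because
        // it registers the *caller's* class, discovered reflectively, which
        // is why it must run in a static initializer of this class.
        ClassLoader.registerAsParallelCapable();
    }

    public PerNameLockLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        // 1.6-style coarse locking would instead be: synchronized (this) { ... }
        synchronized (getClassLoadingLock(name)) {
            return super.loadClass(name, resolve);
        }
    }
}
```

Per-name locking lets two threads load unrelated classes concurrently, where the coarse `this` lock would serialize them.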
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/4165#issuecomment-71106397
@MattWhelan I see. Still, I'm not sure we need to call that method at all.
From the docs:
In environments in which the delegation model is not strictly
hierar
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/4165#issuecomment-71105956
@MattWhelan I see. Still, from reading the docs, it's not clear why you
have to call that method. What happens if you don't? Does the JVM do locking
for you (potentially s
Github user MattWhelan commented on the pull request:
https://github.com/apache/spark/pull/4165#issuecomment-71105198
@vanzin
The way the registration method works is also really weird. It takes no
params, you'll notice. In ClassLoader, it calls a Reflection native method to
ac
Github user MattWhelan commented on a diff in the pull request:
https://github.com/apache/spark/pull/4165#discussion_r23411517
--- Diff:
core/src/main/java/org/apache/spark/classloader/GreedyUrlClassLoader.java ---
@@ -0,0 +1,61 @@
+package org.apache.spark.classloader;
---
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/4165#discussion_r23408475
--- Diff:
core/src/main/java/org/apache/spark/classloader/GreedyUrlClassLoader.java ---
@@ -0,0 +1,61 @@
+package org.apache.spark.classloader;
--- End
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/4165#issuecomment-71093445
The 1.7 dependency will probably be shot down pretty quickly. But one way
you could work around it in Scala is to have a factory method to create
instances of this classlo
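
The factory-method workaround suggested above might look like the following. This is a hypothetical sketch (shown in Java for consistency with the loader code; the class names are stand-ins, not Spark's real ones): the 1.7-only implementation is resolved by name at runtime, so it is never classloaded on a 1.6 JVM.

```java
// Hypothetical sketch of the factory workaround: pick the loader
// implementation by name at runtime, so a class compiled against
// 1.7-only APIs is never loaded on a 1.6 JVM.
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderFactory {
    // Stand-in implementations; in the real workaround the first would use
    // registerAsParallelCapable/getClassLoadingLock and be compiled for 1.7.
    public static class ParallelStyleLoader extends URLClassLoader {
        public ParallelStyleLoader(URL[] urls, ClassLoader parent) { super(urls, parent); }
    }

    public static class CoarseStyleLoader extends URLClassLoader {
        public CoarseStyleLoader(URL[] urls, ClassLoader parent) { super(urls, parent); }
    }

    public static URLClassLoader create(URL[] urls, ClassLoader parent) throws Exception {
        String impl = isJava7OrLater()
            ? "LoaderFactory$ParallelStyleLoader"
            : "LoaderFactory$CoarseStyleLoader";
        // Class.forName defers loading until we know the JVM supports it.
        return (URLClassLoader) Class.forName(impl)
            .getConstructor(URL[].class, ClassLoader.class)
            .newInstance(urls, parent);
    }

    static boolean isJava7OrLater() {
        // "1.6"/"1.7" on old JVMs, "9"/"11"/... on modern ones.
        return Double.parseDouble(System.getProperty("java.specification.version")) >= 1.7;
    }
}
```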
Github user nchammas commented on the pull request:
https://github.com/apache/spark/pull/4165#issuecomment-71092175
I'll leave it to a maintainer to confirm, but I believe that Java 6 support
is an official commitment of the Spark project. Dropping that support would be
a major decisi
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/4165#issuecomment-71089809
Can one of the admins verify this patch?
---
GitHub user MattWhelan opened a pull request:
https://github.com/apache/spark/pull/4165
SPARK-5358: Rework the classloader implementation.
The fundamental issue is that you can't change the delegation scheme
without overriding loadClass (rather than findClass). And, if you
o