Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/322
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-40298968
Thanks for the fix - merged into master and 1.0
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-40285724
Merged build finished. All automated tests passed.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-40285726
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14078/
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-40284773
Merged build triggered.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-40284781
Merged build started.
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-40284736
Jenkins, retest this please.
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-40125951
Jenkins, retest this please.
I'm fine to have this as a workaround for 1.0. I'll file a JIRA describing
the broader issue and we can fix it for 1.1.
---
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39976847
@pwendell: Do you plan to pick this up for 1.0? Is there anything more I
need to do?
---
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39720333
So the current fix looks fine?
---
Github user ueshin commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39700076
@pwendell Yes, the bootstrap class loader knows only the core Java APIs; the
Spark classes (specified by the `-cp` java command argument) are loaded by the
system class loader.
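The distinction described here can be sketched with a small standalone example (illustrative code, not from the PR): passing `null` as the loader argument to `Class.forName` delegates to the bootstrap class loader, which cannot see classes on the application classpath.

```scala
// Sketch of the bootstrap-vs-application loader distinction: Class.forName
// with a `null` loader delegates to the bootstrap class loader, which knows
// only core Java classes; classpath classes need a real loader.
object BootstrapLoaderDemo {
  def visibleTo(name: String, loader: ClassLoader): Boolean =
    try { Class.forName(name, false, loader); true }
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    // Core Java classes resolve against the bootstrap loader (null)...
    println(visibleTo("java.lang.String", null))                          // true
    // ...but classpath classes such as scala.Option do not:
    println(visibleTo("scala.Option", null))                              // false
    // The loader that actually loaded scala-library sees them fine:
    println(visibleTo("scala.Option", classOf[Option[_]].getClassLoader)) // true
  }
}
```

This is the same failure mode as the `ClassNotFoundException` for `JavaSerializer` reported below: the lookup happens against a loader that has no Spark classes on it.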
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39699792
Ah I see, @ueshin you are right. It's the bootstrap class loader and it
won't have any spark definitions. I was mixing this up with the system class
loader.
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39699481
Okay I think the broader issue is that
@ueshin you said before that "`Class.forName()` method with `null` for 3rd
argument tries to load class from bootstrap class loader"
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39689262
@pwendell: You are right. Actually
`sun.misc.Launcher$AppClassLoader@12360be0` is the classloader even in the
earlier code.
Looks like classes directly loaded
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39676705
Another way of putting my question. Currently there is a line:
```
Thread.currentThread.setContextClassLoader(getClass.getClassLoader)
```
What is the actua
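The question of which loader `getClass.getClassLoader` actually resolves to in a given spot can be probed with a small diagnostic (a hypothetical helper, not part of the patch):

```scala
// Diagnostic sketch: report the loader that defined an object's class and
// the current thread's context class loader. For a core class such as
// java.lang.String the defining loader is null, i.e. the bootstrap loader.
object LoaderProbe {
  def describe(obj: AnyRef): String = {
    val own = obj.getClass.getClassLoader                // loader that defined the class
    val ctx = Thread.currentThread.getContextClassLoader // per-thread context loader
    s"own=$own, context=$ctx"
  }

  def main(args: Array[String]): Unit = {
    println(describe(this))       // an application class: a non-null app loader
    println(describe("a string")) // a core class: own=null (bootstrap)
  }
}
```

Printing this from `SparkEnv` versus from the Mesos callback thread is one way to see the mismatch being debugged in this thread.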
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39676625
I guess what I don't understand about the current fix is: where is the
context class loader getting changed from the bootstrap class loader in the
first place? The current
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39676303
@pwendell: I tested your fix to SparkEnv.scala (after reverting my earlier
change). It does not work. SparkEnv's loader turns out to be
`sun.misc.Launcher$AppClassLoader`
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39674063
Hey @ueshin @manku-timma - as a simpler fix, would it be possible to just
change the way SparkEnv captures the class loader? I think it was probably just
an oversight whe
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39655348
I see that PR 334 made the Java 6 change, so I reverted mine.
---
Github user ueshin commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39643866
LGTM on the class loader issue.
But I'm not sure whether the last fix, i.e. the Java 7 API, may be used or not.
---
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39641219
Let me know if there is any other change I need to make. I have tested
after merging from master and things look fine. This is good to be merged from
my end.
---
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39627690
Oops. Added that line.
I am facing this error in the current git tree
```
[error] /home/vagrant/spark2/sql/core/src/main/scala/org/apache/spark/sq
```
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/322#discussion_r11318644
--- Diff:
core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala ---
@@ -50,13 +50,18 @@ private[spark] class MesosExecutorBackend
Github user ueshin commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39560562
@manku-timma, I just wondered whether we should restore the class loader in
the `registered` method, like:
```scala
val cl = Thread.currentThread.getContextClassLoader
```
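The save/restore pattern being sketched could look roughly like the following (a hypothetical helper; the names are illustrative, not Spark's actual code):

```scala
// Hypothetical helper showing the save/restore pattern: switch the context
// class loader for the duration of a callback and always put the original
// back, even if the body throws.
object RestoreLoader {
  def withLoaderOf[T](anchor: Class[_])(body: => T): T = {
    val thread = Thread.currentThread
    val saved = thread.getContextClassLoader          // remember the caller's loader
    thread.setContextClassLoader(anchor.getClassLoader)
    try body
    finally thread.setContextClassLoader(saved)       // restore unconditionally
  }
}
```

A callback like `registered` could then run as `RestoreLoader.withLoaderOf(getClass) { /* create the Executor */ }`, so other callbacks on the same thread still see the loader they started with.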
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39552826
@ueshin, your one line change works for me.
---
Github user ueshin commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39551604
BTW, we might have to restore the context class loader of
`MesosExecutorBackend` to the bootstrap class loader.
We should restore it if there are 2 or more threads to handle
`
Github user ueshin commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39550942
@manku-timma Yes, the standalone backend
`org.apache.spark.executor.CoarseGrainedExecutorBackend` and the local-mode
backend `org.apache.spark.scheduler.local.LocalActor` h
Github user ueshin commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39546729
Put the following line at the beginning of `registered` method (at [line
53](https://github.com/manku-timma/spark-1/blob/89109d7bf2ce35e58a252bd7144438fc720ee362/core/src/ma
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39544602
From my limited understanding, Executor is used by the Mesos backend and
the standalone backend. If the Mesos backend has its own class loader, will
that be enough? Does t
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39543590
java.lang.ClassNotFoundException: org/apache/spark/serializer/JavaSerializer
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39543328
@ueshin - ah cool. Do you mind giving a code snippet of what that would
look like? Then @manku-timma can see if it fixes the problem. Probably is a
better solution...
---
Github user ueshin commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39543185
Hi,
I think `MesosExecutorBackend` should have its own context class loader, and
it should be set to the class loader of the `MesosExecutorBackend` class
before it creates `Exec
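A minimal sketch of that idea, with illustrative names (the real Mesos callback takes driver and executor-info parameters, omitted here):

```scala
// Hypothetical sketch: before the backend constructs its Executor, point the
// thread's context class loader at the loader that loaded the backend class
// itself, so subsequent lookups see the application classpath.
class BackendSketch {
  def registered(): Unit = {
    Thread.currentThread.setContextClassLoader(getClass.getClassLoader)
    // ... construct the Executor here; code that resolves classes through
    // the context loader now finds the application's classes ...
  }
}
```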
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39543041
Which exact failure is it? Could you post the stack trace?
---
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39541428
I can confirm that the third line is needed. Without that line I see the
same failure as earlier.
---
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39535193
I am testing out if removing the last line is ok.
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39533417
@ueshin removed it in SPARK-1210:
https://github.com/apache/spark/pull/15
I proposed a more conservative change in this comment:
https://github.com/apache/
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39533342
This patch also adds back a line that we earlier removed:
```
Thread.currentThread.setContextClassLoader(replClassLoader)
```
Does your fix require th
Github user manku-timma commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39530349
After looking at the code a bit more, I see that the code to
setContextClassLoader does not use SecurityManager AFAIS. createClassLoader is
creating a File object. add
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/322#issuecomment-39526912
Can one of the admins verify this patch?
---
GitHub user manku-timma opened a pull request:
https://github.com/apache/spark/pull/322
[SPARK-1403] Move the class loader creation back to where it was in 0.9.0
[SPARK-1403] I investigated why Spark 0.9.0 loads fine on Mesos while Spark
1.0.0 fails. What I found was that in SparkEn