Github user dlyubimov commented on the issue:
https://github.com/apache/zeppelin/pull/928
The only thing I can think of is that somehow two different versions of
Spark, or of one of its dependencies, ended up loaded into two different
classloaders. I think Spark uses a dedicated classloader even for tasks.
I have definitely seen this before, and although I don't recall the exact
context of what was wrong, it eventually came down to that: several
versions of the same class were present as transitive dependencies of
something.
This is a very nasty situation to trace, I remember that too.
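One quick way to confirm whether that is happening (a minimal sketch, not something from this thread; the resource path `org/apache/spark/sql/Row.class` is only an illustrative placeholder for whatever class the error implicates) is to ask the interpreter's classloader how many copies of the suspect class it can see, and which jar the loaded copy actually came from:

```scala
// Minimal diagnostic sketch, assuming it runs in a Zeppelin Spark paragraph.
// Substitute the class implicated by the error actually being seen.
import scala.collection.JavaConverters._

val resource = "org/apache/spark/sql/Row.class" // hypothetical example class
val loader = Thread.currentThread.getContextClassLoader

// More than one URL printed here means two different jars ship the same
// class, i.e. duplicate transitive dependencies on the classpath.
loader.getResources(resource).asScala.foreach(println)

// This shows which jar the copy that actually got loaded came from.
val className = resource.stripSuffix(".class").replace('/', '.')
println(Class.forName(className).getProtectionDomain.getCodeSource)
```

Since the update below says the failure only shows up in cluster mode, the same check could be wrapped in `sc.parallelize(1 to n).map { ... }.collect()` to see what the executors' classloaders resolve, which may differ from the driver in local mode.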
On Thu, Jun 16, 2016 at 4:36 PM, Trevor Grant <[email protected]>
wrote:
> UPDATE:
>
> Sorry for the quick one-two punch, but the above error only occurs in
> Spark cluster mode, not in Spark local mode, leading me to believe the
> jars aren't getting loaded.
>