Re: Interpreter dependency not loading?

2016-03-09 Thread Chris Miller
Yeah, that isn't very clear. I'll improve it.


--
Chris Miller



Re: Interpreter dependency not loading?

2016-03-09 Thread mina lee
Glad to hear that it works!
Actually, this is documented at
https://zeppelin.incubator.apache.org/docs/0.5.5-incubating/interpreter/spark.html
in the dependency management section, but it is apparently hard for new
users to find. So feel free to improve it.




Re: Interpreter dependency not loading?

2016-03-09 Thread Chris Miller
Oh, I see. Yeah, that's not documented... no wonder it's confusing. I'll
open a PR with some improvements to the documentation for this case when I
have a moment.

Changing spark-defaults.conf as you suggested indeed worked. Thanks!


--
Chris Miller



Re: Interpreter dependency not loading?

2016-03-08 Thread vincent gromakowski
Hi,
Clearly, dependency management should be clarified: it's not obvious which
method overrides which, and when you have conflicts, the order of the libs
in the classpath matters...


Re: Interpreter dependency not loading?

2016-03-08 Thread mina lee
Hi Chris,

there are several ways to load dependencies in Zeppelin 0.5.5.
Using %dep is one of them.
If you want to do it by setting the spark.jars.packages property, the proper
way is to edit SPARK_HOME/conf/spark-defaults.conf
and add the line below. (I assume you set SPARK_HOME in
ZEPPELIN_HOME/conf/zeppelin-env.sh.)

spark.jars.packages   org.apache.avro:avro:1.8.0,org.joda:joda-convert:1.8.1
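
For example, the line can be appended from a shell like this (a sketch; the demo falls back to a scratch directory when SPARK_HOME is unset, but in practice it should point at your real Spark install):

```shell
# Append the packages line to Spark's defaults file.
# SPARK_HOME is assumed to be exported (e.g. via conf/zeppelin-env.sh);
# this demo falls back to a scratch directory so it can run anywhere.
SPARK_HOME="${SPARK_HOME:-$(mktemp -d)}"
mkdir -p "$SPARK_HOME/conf"
echo 'spark.jars.packages   org.apache.avro:avro:1.8.0,org.joda:joda-convert:1.8.1' \
  >> "$SPARK_HOME/conf/spark-defaults.conf"
# Verify the property is in place
grep 'spark.jars.packages' "$SPARK_HOME/conf/spark-defaults.conf"
```

After editing the file, restart the Zeppelin Spark interpreter so the setting is picked up; the same coordinate string also works with spark-shell --packages on the command line.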

The reason you can import the avro dependency is that the Spark assembly
already includes Avro, not because you added it in the Zeppelin
interpreter setting.

You can also add dependencies via the GUI with the latest master
branch (0.6.0-incubating-SNAPSHOT), which is experimental at the moment.
Please let me know if this answers your question.

Regards,
Mina

On Wed, Mar 9, 2016 at 1:41 AM Chris Miller  wrote:

> Hi,
>
> I have a strange situation going on. I'm running Zeppelin 0.5.5 and Spark
> 1.6.0 (on Amazon EMR). I added this property to the interpreter settings
> (and restarted it):
>
> spark.jars.packages: org.apache.avro:avro:1.8.0,org.joda:joda-convert:1.8.1
>
> The avro dependency loads fine and I'm able to import and use it. However,
> if I try to import something from the joda-convert package (such as
> org.joda.convert.FromString), I get "error: object convert is
> not a member of package org.joda".
>
> If I run the spark-shell from the CLI and include the same string above in
> the --packages parameter, I'm able to import joda-convert just fine. Also,
> if I restart the interpreter and manually import the dependency with
> z.load(), it also works fine:
>
> %dep
> z.load("org.joda:joda-convert:1.8.1")
>
> So, what's going on here?
>
> --
> Chris Miller
>