I could also reproduce it with Spark 2.0.0, but not with Spark 1.6.1.
If you want to use Zeppelin with Spark 2.0, one alternative you can try is
adding the dependencies through the interpreter menu in the GUI [1].

[1] http://zeppelin.apache.org/docs/0.6.1/manual/dependencymanagement.html
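
Another workaround that may work (a sketch, reusing the same avro
artifact you pass to spark-submit) is loading it with the %dep
interpreter, in a paragraph run before the Spark interpreter starts:

  %dep
  z.reset()
  z.load("com.databricks:spark-avro_2.11:3.0.0")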

On Wed, Aug 17, 2016 at 1:46 AM Jeff Zhang <zjf...@gmail.com> wrote:

> I can reproduce it in 0.6.1 and the master branch; please file a ticket
> for that.
>
> On Wed, Aug 17, 2016 at 4:09 AM, Michael Sells <mjse...@gmail.com> wrote:
>
>> I was testing out 0.6.1 with Spark 2.0 and discovered that the way we
>> load dependencies doesn't seem to be working with the new update.
>>
>> We pass new dependencies in via the SPARK_SUBMIT_OPTIONS environment
>> variable, passing the following flag:
>> --packages com.databricks:spark-avro_2.11:3.0.0
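>>
>> (We set that roughly like this, e.g. in conf/zeppelin-env.sh:
>> export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-avro_2.11:3.0.0")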
>>
>> Now when I try to import it with:
>> import com.databricks.spark.avro._
>>
>> I get:
>> <console>:25: error: object databricks is not a member of package com
>> import com.databricks.spark.avro._
>>
>> I checked the logs and there is no error retrieving the package, so it
>> seems to be something with the classpath.
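>>
>> To double-check, something like this in a Spark paragraph should show
>> which jars spark-submit actually registered (assuming the resolved
>> packages end up in spark.jars):
>>
>> println(sc.getConf.get("spark.jars", "(not set)"))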
>>
>> This works in 0.6.0. Any idea if something changed or if we're doing
>> something wrong? I tried this with a few internal packages as well and it
>> doesn't work with those either.
>>
>> Thanks,
>> Mike
>>
>
>
> --
> Best Regards
>
> Jeff Zhang
>
