I can reproduce it in 0.6.1 and the master branch; please file a ticket for that.

On Wed, Aug 17, 2016 at 4:09 AM, Michael Sells <mjse...@gmail.com> wrote:

> I was testing out 0.6.1 with Spark 2.0 and discovered that the way we
> load dependencies doesn't seem to be working with the new update.
>
> We pass new dependencies in via the SPARK_SUBMIT_OPTIONS environment
> variable with the following flag:
> --packages com.databricks:spark-avro_2.11:3.0.0
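>
> For reference, that variable is set roughly like this (a sketch; the
> conf/zeppelin-env.sh location reflects our setup, yours may differ):
>
>   # conf/zeppelin-env.sh -- picked up by Zeppelin when it invokes spark-submit
>   export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-avro_2.11:3.0.0"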
>
> Now when I try to import it with:
> import com.databricks.spark.avro._
>
> I get:
> <console>:25: error: object databricks is not a member of package com
> import com.databricks.spark.avro._
>
> I checked the logs and there is no error retrieving the package. So it
> seems to be something with the classpath.
>
> This works in 0.6.0. Any idea if something changed or if we're doing
> something wrong? I tried this with a few internal packages as well and it
> doesn't work with those either.
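>
> For what it's worth, once the package is on the classpath, the usage we
> expect is just the standard spark-avro read (a sketch; the file path is
> a placeholder):
>
>   // uses the implicit avro() reader added by the spark-avro package object
>   import com.databricks.spark.avro._
>   val df = spark.read.avro("/tmp/episodes.avro")  // placeholder path
>   df.printSchema()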
>
> Thanks,
> Mike


-- 
Best Regards

Jeff Zhang
