Hi,
Regarding 1: packages are resolved locally. That means that when you
specify a package, spark-submit resolves its dependencies and
downloads the jars on the local machine before shipping* them to the
cluster. So, although I have no a priori knowledge of Dataproc
clusters, specifying packages there should work the same way.
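
For example, something like this should work (a sketch, not tested on
Dataproc; the exact GraphFrames coordinate depends on your Spark
version, so check spark-packages.org for the right one):

  spark-submit \
    --packages graphframes:graphframes:0.1.0-spark1.6 \
    your_script.py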

Unfortunately I can't help with 2.

--Jakob

*Shipping in this case means making the jars available to the cluster
nodes via the network.
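
If you want to check what was actually resolved, the downloaded jars
end up in the local Ivy cache (~/.ivy2/jars by default) on the machine
where you ran spark-submit:

  ls ~/.ivy2/jars | grep graphframes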

On Thu, Mar 17, 2016 at 5:36 PM, Ajinkya Kale <kaleajin...@gmail.com> wrote:
> Hi all,
>
> I had a couple of questions:
> 1. Is there documentation on how to add graphframes, or any other package
> for that matter, on the Google Dataproc managed Spark clusters?
>
> 2. Is there a way to add a package to an existing pyspark context through a
> Jupyter notebook?
>
> --aj

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
