Use --py-files to ship your Python dependencies alongside the application.

See
https://spark.apache.org/docs/latest/submitting-applications.html#bundling-your-applications-dependencies
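As a rough sketch (the paths, the archive name, and the `--master yarn` value below are placeholders; adjust them for your Cloudera setup), you would bundle the dependency as a .zip or .egg and pass it to spark-submit or pyspark:

```shell
# Bundle the dependency (e.g. the Spark Deep Learning package) into a zip/egg,
# then pass it via --py-files so it is shipped to the driver and executors.
# Paths below are hypothetical -- substitute your actual files.
spark-submit \
  --master yarn \
  --py-files /path/to/sparkdl.zip \
  my_job.py

# The same flag works for an interactive pyspark shell:
pyspark --py-files /path/to/sparkdl.zip
```

Alternatively, after the context is up you can call SparkContext.addPyFile("/path/to/sparkdl.zip") from your driver code to the same effect.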

I hope that helps.

On Tue, 28 Jan 2020, 9:46 am Tharindu Mathew, <tharindu.mat...@gmail.com>
wrote:

> Hi,
>
> Newbie to pyspark/spark here.
>
> I'm trying to submit a job to pyspark with a dependency, Spark DL in this
> case. While my local environment has it, pyspark does not see it. How do
> I start pyspark correctly so that it picks up this dependency?
>
> Using Spark 2.3.0 in a Cloudera setup.
>
> --
> Regards,
> Tharindu Mathew
> http://tharindumathew.com
>
