Hi,

Newbie to pyspark/spark here.

I'm trying to submit a PySpark job that depends on an external package (Spark
Deep Learning, in this case). The package is installed in my local
environment, but PySpark does not see it. How do I start PySpark correctly so
that it picks up this dependency?

Using Spark 2.3.0 on a Cloudera setup.
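For context, this is roughly what I understand the usual options to be (the
package coordinate below is only an example; the exact version string for
Spark 2.3 would need to be checked on spark-packages.org):

```shell
# Option 1: let Spark resolve the package from the Spark Packages /
# Maven repositories at launch time (coordinate is an assumption):
pyspark --packages databricks:spark-deep-learning:1.1.0-spark2.3-s_2.11

# Option 2: ship local Python modules (zip/egg/py files) with the job
# so they are available on the executors ("deps.zip" and "my_job.py"
# are placeholder names):
spark-submit --py-files deps.zip my_job.py
```

Is one of these the right approach, or is something else needed on a
Cloudera-managed cluster?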

-- 
Regards,
Tharindu Mathew
http://tharindumathew.com
