On Wed, Dec 3, 2014 at 8:17 PM, chocjy <jiyanyan...@gmail.com> wrote:
> Hi,
>
> I am using Spark version 1.1.0 on an EC2 cluster. After I
> submitted the job, it returned an error saying that a Python module could not
> be loaded due to missing files. I am using the same command that used to
> work for submitting jobs on a private cluster, and all the source
> files are located in the current working directory. I also tried adding the
> current path to $PYTHONPATH, but it didn't help.
>
> In the outputs, there is nothing coming after the line
> "spark.submit.pyFiles=". The command I used is
> spark-submit --executor-memory 7G --driver-memory 8G l2_exp.py --py-files
> a.py,b.py,c.py

--py-files should be placed before your script l2_exp.py; otherwise it
and the files listed after it are treated as arguments of l2_exp.py.
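
For example, the same command with the flags moved ahead of the script
should look something like this:

spark-submit --executor-memory 7G --driver-memory 8G --py-files a.py,b.py,c.py l2_exp.py

With that ordering, spark.submit.pyFiles should be populated and the
modules in a.py, b.py, and c.py should be importable on the executors.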

> Anyone knows how to fix this?
>
>
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/cannot-submit-python-files-on-EC2-cluster-tp20320.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
