I figured it out. I had to use pyspark.files.SparkFiles to get the
local paths of the files shipped to Spark with --files.
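
For reference, here is roughly what ended up working for me (a minimal
sketch; the app name and the way the file is read are just illustrative):

    from pyspark import SparkContext
    from pyspark.files import SparkFiles

    sc = SparkContext(appName="spark_solution_basic")

    # Files passed with --files (or added via sc.addFile()) are copied to a
    # temp directory on each node; SparkFiles.get() returns the local path
    # of such a file given its name.
    config_path = SparkFiles.get("config.properties")

    with open(config_path) as config_file:
        properties = config_file.read()

So instead of opening 'config.properties' by its original path, you open
the path that SparkFiles.get() hands back.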


On Mon, Nov 17, 2014 at 1:26 PM, Sean Owen <so...@cloudera.com> wrote:

> You are changing these paths and filenames to match your own actual
> scripts and file locations right?
> On Nov 17, 2014 4:59 AM, "Samarth Mailinglist" <
> mailinglistsama...@gmail.com> wrote:
>
>> I am trying to run a job written in python with the following command:
>>
>> bin/spark-submit --master spark://localhost:7077 
>> /path/spark_solution_basic.py --py-files /path/*.py --files 
>> /path/config.properties
>>
>> I always get an exception that config.properties is not found:
>>
>> INFO - IOError: [Errno 2] No such file or directory: 'config.properties'
>>
>> Why isn't this working?
>>
>>
>
