> 155848-0016",
> "success" : true
> }
> ./test3.sh: line 15: --num-decimals=1000: command not found
> ./test3.sh: line 16: --second-argument=Arg2: command not found
>
> From: Marcelo Vanzin
> Sent: Tuesday, February 28, 2017 12:17:49 PM
> To: Joe Olson
> Cc: user@spark.apache.org
> Subject: Re: spark-submit question

From: Marcelo Vanzin
Sent: Tuesday, February 28, 2017 12:17:49 PM
To: Joe Olson
Cc: user@spark.apache.org
Subject: Re: spark-submit question

Everything after the jar path is passed to the main class as
parameters. So if it's not working, you're probably doing something
wrong in your code (which you haven't posted).
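
For illustration, a submit command of this shape passes the trailing
switches straight through to the application's main method, where
JCommander can pick them up (the class and jar names below are
placeholders, not from this thread; the switches are the ones quoted
above):

    spark-submit \
      --master spark://localhost:7077 \
      --class com.example.Main \
      app.jar \
      --num-decimals=1000 --second-argument=Arg2

Here spark-submit consumes --master and --class itself, and the
application receives ["--num-decimals=1000", "--second-argument=Arg2"]
as its argument array.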
On Tue, Feb 28, 2017 at 7:05 AM, Joe Olson wrote:
> For spark-submit, I know I can submit application-level command line
> parameters to my .jar. However, can I prefix them with switches? My
> command line params are processed in my applications using JCommander.
> I've tried several variations with no success; an example of what I am
> trying is below.
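
One hedged guess about the "command not found" errors quoted at the top
of this thread: bash reports "./test3.sh: line 15: --num-decimals=1000:
command not found" when a multi-line command is missing its trailing
backslashes, so each switch on its own line runs as a separate command.
A hypothetical test3.sh with the continuations in place (everything but
the two switches is a placeholder):

    #!/bin/bash
    # Without a trailing "\", bash treats the next line as a new command,
    # which yields exactly the "--num-decimals=1000: command not found"
    # error seen above.
    spark-submit \
      --master spark://localhost:7077 \
      --class com.example.Main \
      app.jar \
      --num-decimals=1000 \
      --second-argument=Arg2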
I figured it out. I had to use pyspark.files.SparkFiles to get the
locations of files loaded into Spark.
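
A minimal sketch of that pattern (the file name comes from the command
quoted below; the SparkContext setup and app name are assumed):

    from pyspark import SparkContext
    from pyspark.files import SparkFiles

    sc = SparkContext(appName="ConfigExample")

    # Files shipped with --files land in a per-node scratch directory;
    # SparkFiles.get() resolves the local path where Spark placed them.
    config_path = SparkFiles.get("config.properties")
    with open(config_path) as f:
        config = f.read()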
On Mon, Nov 17, 2014 at 1:26 PM, Sean Owen wrote:
> You are changing these paths and filenames to match your own actual
> scripts and file locations, right?

On Nov 17, 2014 4:59 AM, "Samarth Mailinglist" wrote:
> I am trying to run a job written in Python with the following command:
>
> bin/spark-submit --master spark://localhost:7077
> /path/spark_solution_basic.py --py-files /path/*.py --files
> /path/config.properties
>
> I always get an exception that config.properties is not found:
>
> INFO - IOError: [Errno 2] No such file or directory
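
Two hedged observations about that command, offered as guesses rather
than a confirmed diagnosis: spark-submit stops reading its own options
at the first positional argument (the .py file), so --py-files and
--files placed after the script are handed to the script as ordinary
arguments instead of being processed by Spark; and --py-files takes a
comma-separated list, so a shell glob like /path/*.py expands into
separate space-separated arguments. A reordered sketch (the dependency
file names are placeholders):

    bin/spark-submit --master spark://localhost:7077 \
      --py-files /path/dep_one.py,/path/dep_two.py \
      --files /path/config.properties \
      /path/spark_solution_basic.py

With the file shipped this way, the script can resolve it via
SparkFiles.get("config.properties"), as in the reply above.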