Re: spark-submit question

2017-02-28 Thread Marcelo Vanzin
You're either running a really old version of Spark where there might
have been issues in that code, or you're actually missing some
backslashes in the command you pasted in your message.
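For comparison, here is the quoted command with a continuation on every
line up to the final argument (same paths and values as in the message
below). Note the backslashes after the jar path:

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --deploy-mode cluster \
  --supervise \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  --num-decimals=1000 \
  --second-argument=Arg2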

On Tue, Feb 28, 2017 at 2:05 PM, Joe Olson <jo4...@outlook.com> wrote:
>> Everything after the jar path is passed to the main class as parameters.
>
> I don't think that is accurate if your application arguments contain double
> dashes. I've tried several permutations, with and without backslashes and
> newlines.
>
> Just thought I'd ask here before I have to re-configure and re-compile all
> my jars.
>
> ./bin/spark-submit \
>   --class org.apache.spark.examples.SparkPi \
>   --master spark://207.184.161.138:7077 \
>   --deploy-mode cluster \
>   --supervise \
>   --executor-memory 20G \
>   --total-executor-cores 100 \
>   /path/to/examples.jar
>   --num-decimals=1000
>   --second-argument=Arg2
>
> {
>   "action" : "CreateSubmissionResponse",
>   "serverSparkVersion" : "2.1.0",
>   "submissionId" : "driver-20170228155848-0016",
>   "success" : true
> }
> ./test3.sh: line 15: --num-decimals=1000: command not found
> ./test3.sh: line 16: --second-argument=Arg2: command not found
>
>
> ________
> From: Marcelo Vanzin <van...@cloudera.com>
> Sent: Tuesday, February 28, 2017 12:17:49 PM
> To: Joe Olson
> Cc: user@spark.apache.org
> Subject: Re: spark-submit question
>
> Everything after the jar path is passed to the main class as
> parameters. So if it's not working you're probably doing something
> wrong in your code (that you haven't posted).
>
> On Tue, Feb 28, 2017 at 7:05 AM, Joe Olson <jo4...@outlook.com> wrote:
>> For spark-submit, I know I can submit application level command line
>> parameters to my .jar.
>>
>>
>> However, can I prefix them with switches? My command line params are
>> processed in my applications using JCommander. I've tried several
>> variations
>> of the below with no success.
>>
>>
>> An example of what I am trying to do is below in the --num-decimals
>> argument.
>>
>>
>> ./bin/spark-submit \
>>   --class org.apache.spark.examples.SparkPi \
>>   --master spark://207.184.161.138:7077 \
>>   --deploy-mode cluster \
>>   --supervise \
>>   --executor-memory 20G \
>>   --total-executor-cores 100 \
>>   /path/to/examples.jar \
>>   --num-decimals=1000 \
>>   --second-argument=Arg2
>>
>>
>
>
>
> --
> Marcelo



-- 
Marcelo




Re: spark-submit question

2017-02-28 Thread Joe Olson
> Everything after the jar path is passed to the main class as parameters.

I don't think that is accurate if your application arguments contain double
dashes. I've tried several permutations, with and without backslashes and
newlines.

Just thought I'd ask here before I have to re-configure and re-compile all my 
jars.

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --deploy-mode cluster \
  --supervise \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar
  --num-decimals=1000
  --second-argument=Arg2

{
  "action" : "CreateSubmissionResponse",
  "serverSparkVersion" : "2.1.0",
  "submissionId" : "driver-20170228155848-0016",
  "success" : true
}
./test3.sh: line 15: --num-decimals=1000: command not found
./test3.sh: line 16: --second-argument=Arg2: command not found
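
The trace above is exactly what missing continuations produce: without a
trailing backslash the spark-submit command ends at /path/to/examples.jar
(hence the successful CreateSubmissionResponse), and the shell then runs
the next two lines as commands of their own. A minimal sketch of the same
failure mode, independent of Spark:

# No trailing backslash, so the command ends here...
echo submitted

# ...and this next line is executed as a separate command, failing
# with "--num-decimals=1000: command not found"
--num-decimals=1000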




From: Marcelo Vanzin <van...@cloudera.com>
Sent: Tuesday, February 28, 2017 12:17:49 PM
To: Joe Olson
Cc: user@spark.apache.org
Subject: Re: spark-submit question

Everything after the jar path is passed to the main class as
parameters. So if it's not working you're probably doing something
wrong in your code (that you haven't posted).

On Tue, Feb 28, 2017 at 7:05 AM, Joe Olson <jo4...@outlook.com> wrote:
> For spark-submit, I know I can submit application level command line
> parameters to my .jar.
>
>
> However, can I prefix them with switches? My command line params are
> processed in my applications using JCommander. I've tried several variations
> of the below with no success.
>
>
> An example of what I am trying to do is below in the --num-decimals
> argument.
>
>
> ./bin/spark-submit \
>   --class org.apache.spark.examples.SparkPi \
>   --master spark://207.184.161.138:7077 \
>   --deploy-mode cluster \
>   --supervise \
>   --executor-memory 20G \
>   --total-executor-cores 100 \
>   /path/to/examples.jar \
>   --num-decimals=1000 \
>   --second-argument=Arg2
>
>



--
Marcelo


Re: spark-submit question

2017-02-28 Thread Marcelo Vanzin
Everything after the jar path is passed to the main class as
parameters. So if it's not working you're probably doing something
wrong in your code (that you haven't posted).
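
A quick way to verify that, as a minimal sketch (object name
hypothetical): a driver class whose main does nothing but print what it
receives.

object ArgEcho {
  // Every token spark-submit sees after the application jar path
  // arrives here untouched, double dashes included.
  def main(args: Array[String]): Unit =
    args.zipWithIndex.foreach { case (a, i) => println(s"args($i) = $a") }
}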

On Tue, Feb 28, 2017 at 7:05 AM, Joe Olson <jo4...@outlook.com> wrote:
> For spark-submit, I know I can submit application level command line
> parameters to my .jar.
>
>
> However, can I prefix them with switches? My command line params are
> processed in my applications using JCommander. I've tried several variations
> of the below with no success.
>
>
> An example of what I am trying to do is below in the --num-decimals
> argument.
>
>
> ./bin/spark-submit \
>   --class org.apache.spark.examples.SparkPi \
>   --master spark://207.184.161.138:7077 \
>   --deploy-mode cluster \
>   --supervise \
>   --executor-memory 20G \
>   --total-executor-cores 100 \
>   /path/to/examples.jar \
>   --num-decimals=1000 \
>   --second-argument=Arg2
>
>



-- 
Marcelo




spark-submit question

2017-02-28 Thread Joe Olson
For spark-submit, I know I can submit application-level command-line
parameters to my .jar.


However, can I prefix them with switches? My command-line params are
processed in my applications using JCommander. I've tried several
variations of the below with no success.


An example of what I am trying to do is below, in the --num-decimals
argument.


./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --deploy-mode cluster \
  --supervise \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  --num-decimals=1000 \
  --second-argument=Arg2
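
Once the arguments do reach the main class, JCommander can bind the
--key=value form; the one catch is that JCommander's default separator
is whitespace, so the parameter object needs separators = "=". A minimal
sketch in Scala, with hypothetical names (AppArgs, Main):

import com.beust.jcommander.{JCommander, Parameter, Parameters}
import scala.annotation.meta.field

// "=" as the separator lets --num-decimals=1000 parse; the default
// would only accept "--num-decimals 1000".
@Parameters(separators = "=")
class AppArgs {
  // The @field meta-annotation puts @Parameter on the underlying Java
  // field, which is where JCommander looks for it.
  @(Parameter @field)(names = Array("--num-decimals"))
  var numDecimals: Int = 100

  @(Parameter @field)(names = Array("--second-argument"))
  var secondArgument: String = ""
}

object Main {
  def main(args: Array[String]): Unit = {
    val parsed = new AppArgs
    new JCommander(parsed).parse(args: _*)
    println(s"num-decimals=${parsed.numDecimals}, " +
      s"second-argument=${parsed.secondArgument}")
  }
}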



Re: spark-submit question

2014-11-17 Thread Samarth Mailinglist
I figured it out. I had to use pyspark.files.SparkFiles to get the
locations of files loaded into Spark.
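
For reference, a minimal sketch of that mechanism in the Scala API (the
Python side exposes the same thing as pyspark.SparkFiles), assuming the
file was shipped with --files config.properties:

import org.apache.spark.SparkFiles

// Files distributed with --files (SparkContext.addFile under the hood)
// land in a per-executor scratch directory; SparkFiles.get resolves
// their absolute local path.
val configPath = SparkFiles.get("config.properties")
val config = scala.io.Source.fromFile(configPath).mkString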


On Mon, Nov 17, 2014 at 1:26 PM, Sean Owen <so...@cloudera.com> wrote:

 You are changing these paths and filenames to match your own actual
 scripts and file locations right?
 On Nov 17, 2014 4:59 AM, Samarth Mailinglist
 <mailinglistsama...@gmail.com> wrote:

 I am trying to run a job written in python with the following command:

  bin/spark-submit --master spark://localhost:7077 \
    /path/spark_solution_basic.py --py-files /path/*.py --files \
    /path/config.properties

 I always get an exception that config.properties is not found:

 INFO - IOError: [Errno 2] No such file or directory: 'config.properties'

  Why isn't this working?




Re: spark-submit question

2014-11-16 Thread Sean Owen
You are changing these paths and filenames to match your own actual scripts
and file locations right?
On Nov 17, 2014 4:59 AM, Samarth Mailinglist <mailinglistsama...@gmail.com>
wrote:

 I am trying to run a job written in python with the following command:

 bin/spark-submit --master spark://localhost:7077 \
   /path/spark_solution_basic.py --py-files /path/*.py --files \
   /path/config.properties

 I always get an exception that config.properties is not found:

 INFO - IOError: [Errno 2] No such file or directory: 'config.properties'

 Why isn't this working?