My guess is that you need to check two things:

1) that sparkTest1.jar actually contains the tests.testFileReader class
2) that sparkTest1.jar is in the directory from which you are executing the
command. Then run this:

bin/spark-submit --verbose --class tests.testFileReader --master spark://192.168.194.128:7077 sparkTest1.jar
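
For point (1), a quick way to double-check is to list the jar's entries: a jar is just a zip archive. Here is a minimal sketch in Python (it builds a throwaway in-memory jar as a stand-in, since I don't have your sparkTest1.jar):

```python
import io
import zipfile

# A jar is just a zip archive, so Python's zipfile can inspect it.
# Build an in-memory stand-in for sparkTest1.jar for demonstration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("tests/testFileReader.class", b"\xca\xfe\xba\xbe")

# --class tests.testFileReader must resolve to exactly this entry:
with zipfile.ZipFile(buf) as jar:
    names = jar.namelist()

print("tests/testFileReader.class" in names)
```

On the command line, `jar tf sparkTest1.jar | grep testFileReader` does the same check against your real jar.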

Here is another working command, which runs the SparkPi example that ships
with Spark:

./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster lib/spark-examples*.jar 10
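
On the mixed-slash path in your earlier stack trace: Windows path handling generally accepts both separators, which you can see with Python's ntpath module. This is just a sketch against the path from your error message, plus the earlier point about doubled backslashes in string literals:

```python
import ntpath

# The path from the ExecutorRunner error, mixing both separator styles.
mixed = (r"C:\cygdrive\c\Users\jbeaudan\Spark\spark-1.3.1-bin-hadoop2.4"
         "/bin/compute-classpath.cmd")

# ntpath.normpath collapses everything to backslashes, Windows-style.
fixed = ntpath.normpath(mixed)
print(fixed)
print("/" in fixed)  # the separators themselves are consistent now

# Doubled backslashes, a raw string, and forward slashes all name
# the same file (illustrative path, not one from your machine):
a = "C:\\Users\\jbeaudan\\sparkTest1.jar"
b = r"C:\Users\jbeaudan\sparkTest1.jar"
c = "C:/Users/jbeaudan/sparkTest1.jar"
print(a == b, ntpath.normpath(c) == a)
```

So the slash mixing itself may not be fatal; what looks more suspicious to me is the C:\cygdrive\c\ prefix, since /cygdrive/c is a Cygwin-only virtual mount that Windows' CreateProcess cannot resolve, which would explain the "cannot find the file specified" error.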

Hope it helps.


On Fri, Jul 17, 2015 at 3:44 PM, Julien Beaudan <jbeau...@stottlerhenke.com>
wrote:

>  Oh, yeah of course. I'm running it from the command line (I haven't tried
> the SparkLauncher), using
>
> bin/spark-submit --class tests.testFileReader --master spark://192.168.194.128:7077 --verbose ./sparkTest1.jar
>
> All that the testFileReader class does is create an RDD from a few text
> files - just a sanity check to make sure that my setup works.
>
> Julien
>
>
> On 07/17/2015 03:35 PM, Elkhan Dadashov wrote:
>
> Are you running it from the command line (CLI) or through SparkLauncher?
>
>  If you can share the command (./bin/spark-submit ...) or the code
> snippet you are running, that could give some clue.
>
> On Fri, Jul 17, 2015 at 3:30 PM, Julien Beaudan <
> jbeau...@stottlerhenke.com> wrote:
>
>>  Hi Elkhan,
>>
>> I ran Spark with --verbose, but the output looked the same to me - what
>> should I be looking for? At the beginning, the system properties which are
>> set are:
>>
>> System properties:
>> SPARK_SUBMIT -> true
>> spark.app.name -> tests.testFileReader
>> spark.jars ->
>> file:/C:/Users/jbeaudan/Spark/spark-1.3.1-bin-hadoop2.4/sparkTest1.jar
>> spark.master -> spark://192.168.194.128:7077
>> Classpath elements:
>> file:/C:/Users/jbeaudan/Spark/spark-1.3.1-bin-hadoop2.4/sparkTest1.jar
>>
>> I'm not sure why, but the file paths here seem formatted correctly (it is
>> the same from the command terminal and from Cygwin), so the path must be
>> getting edited afterwards?
>>
>> Julien
>>
>>
>> On 07/17/2015 03:00 PM, Elkhan Dadashov wrote:
>>
>> Run Spark with the --verbose flag to see what it reads for that path.
>>
>>  I guess on Windows, if you are using backslashes, you need two of them
>> (\\), or you can just use forward slashes everywhere.
>>
>> On Fri, Jul 17, 2015 at 2:40 PM, Julien Beaudan <
>> jbeau...@stottlerhenke.com> wrote:
>>
>>> Hi,
>>>
>>> I'm running a stand-alone cluster on Windows 7, and when I try to run any
>>> worker on the machine, I get the following error:
>>>
>>> 15/07/17 14:14:43 ERROR ExecutorRunner: Error running executor
>>> java.io.IOException: Cannot run program
>>> "C:\cygdrive\c\Users\jbeaudan\Spark\spark-1.3.1-bin-hadoop2.4/bin/compute-classpath.cmd"
>>> (in directory "."): CreateProcess error=2, The system cannot find the file
>>> specified
>>>         at java.lang.ProcessBuilder.start(Unknown Source)
>>>         at org.apache.spark.util.Utils$.executeCommand(Utils.scala:1067)
>>>         at
>>> org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:1084)
>>>         at
>>> org.apache.spark.deploy.worker.CommandUtils$.buildJavaOpts(CommandUtils.scala:112)
>>>         at
>>> org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:61)
>>>         at
>>> org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:47)
>>>         at
>>> org.apache.spark.deploy.worker.ExecutorRunner.fetchAndRunExecutor(ExecutorRunner.scala:132)
>>>         at
>>> org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:68)
>>> Caused by: java.io.IOException: CreateProcess error=2, The system cannot
>>> find the file specified
>>>         at java.lang.ProcessImpl.create(Native Method)
>>>         at java.lang.ProcessImpl.<init>(Unknown Source)
>>>         at java.lang.ProcessImpl.start(Unknown Source)
>>>         ... 8 more
>>>
>>>
>>> I'm pretty sure the problem is that Spark is looking for the following
>>> path, which mixes forward slashes and backslashes:
>>>
>>>
>>> C:\cygdrive\c\Users\jbeaudan\Spark\spark-1.3.1-bin-hadoop2.4/bin/compute-classpath.cmd
>>>
>>> Is there any way to fix this?
>>>
>>> (I have also tried running this from a normal terminal, instead of from
>>> Cygwin, and I get the same issue, except this time the path is:
>>> C:\Users\jbeaudan\Spark\spark-1.3.1-bin-hadoop2.4\bin../bin/compute-classpath.cmd
>>> )
>>>
>>> Thank you!
>>>
>>> Julien
>>>
>>>
>>>
>>>
>>
>>
>>
>
>
>


-- 

Best regards,
Elkhan Dadashov
