Recycling an old thread on this:

http://search-hadoop.com/m/q3RTtHrxMj2abwOk2
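
For the Windows part of the question below: the launch scripts under sbin are .sh only, but the standalone master and worker can be started by hand with the spark-class.cmd that ships in bin. A rough sketch, assuming the default ports (7077 for the master, 8080 for its web UI):

    bin\spark-class.cmd org.apache.spark.deploy.master.Master
    bin\spark-class.cmd org.apache.spark.deploy.worker.Worker spark://localhost:7077

The worker's master URL may need the hostname the master prints at startup rather than localhost. Once both are running, http://localhost:8080 should show the master UI.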

On Fri, Feb 19, 2016 at 6:40 PM, Arko Provo Mukherjee <
arkoprovomukher...@gmail.com> wrote:

> Hi,
>
> Thanks for your response. Is there a similar link for Windows? I am
> not sure the .sh scripts would run on Windows.
>
> By default, start-all.sh doesn't work and I don't see anything at
> localhost:8080.
>
> I will do some more investigation and come back.
>
> Thanks again for all your help!
>
> Thanks & regards
> Arko
>
>
> On Fri, Feb 19, 2016 at 6:35 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> > Please see https://spark.apache.org/docs/latest/spark-standalone.html
> >
> > On Fri, Feb 19, 2016 at 6:27 PM, Arko Provo Mukherjee
> > <arkoprovomukher...@gmail.com> wrote:
> >>
> >> Hi,
> >>
> >> Thanks for your response, that really helped.
> >>
> >> However, I don't believe the job is being submitted. When I run Spark
> >> from the shell, I don't need to start it up explicitly. Do I need to
> >> start up Spark on my machine before running this program?
> >>
> >> I see the following in the SPARK_HOME\bin directory:
> >> Name
> >> ----
> >> beeline.cmd
> >> load-spark-env.cmd
> >> pyspark.cmd
> >> pyspark2.cmd
> >> run-example.cmd
> >> run-example2.cmd
> >> spark-class.cmd
> >> spark-class2.cmd
> >> spark-shell.cmd
> >> spark-shell2.cmd
> >> spark-submit.cmd
> >> spark-submit2.cmd
> >> sparkR.cmd
> >> sparkR2.cmd
> >>
> >> Do I need to run any one of them before submitting the job via the
> >> program?
> >>
> >> Thanks & regards
> >> Arko
> >>
> >> On Fri, Feb 19, 2016 at 6:01 PM, Holden Karau <hol...@pigscanfly.ca>
> >> wrote:
> >> > How are you trying to launch your application? Do you have the Spark
> >> > jars on
> >> > your class path?
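
A common cause of that NoClassDefFoundError is that the spark-launcher artifact is missing from (or only a compile-time dependency of) the driver program's classpath at runtime. A rough sbt sketch, assuming Spark 1.6.0 and a project Scala version matching the Spark build (both assumptions, adjust to your installation):

    libraryDependencies += "org.apache.spark" %% "spark-launcher" % "1.6.0"

If the program is started with java -cp directly instead, the jars shipped with the Spark distribution (the assembly jar in 1.x) need to be on that classpath as well.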
> >> >
> >> >
> >> > On Friday, February 19, 2016, Arko Provo Mukherjee
> >> > <arkoprovomukher...@gmail.com> wrote:
> >> >>
> >> >> Hello,
> >> >>
> >> >> I am trying to submit a Spark job from a program.
> >> >>
> >> >> When I run it, I receive the following error:
> >> >> Exception in thread "Thread-1" java.lang.NoClassDefFoundError:
> >> >> org/apache/spark/launcher/SparkLauncher
> >> >>         at Spark.SparkConnector.run(MySpark.scala:33)
> >> >>         at java.lang.Thread.run(Thread.java:745)
> >> >> Caused by: java.lang.ClassNotFoundException:
> >> >> org.apache.spark.launcher.SparkLauncher
> >> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >> >>         at java.security.AccessController.doPrivileged(Native Method)
> >> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >> >>         ... 2 more
> >> >>
> >> >> It seems it cannot find the SparkLauncher class. Any clue as to what I
> >> >> am doing wrong?
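
For reference, a minimal sketch of what a programmatic submit with SparkLauncher can look like once the launcher classes are on the classpath. Every path, class name and master URL below is a placeholder, not taken from this thread:

    import org.apache.spark.launcher.SparkLauncher

    object LaunchMyJob {
      def main(args: Array[String]): Unit = {
        // Every path, class name and master URL here is a placeholder.
        val process = new SparkLauncher()
          .setSparkHome("C:\\spark")                // SPARK_HOME of the installation
          .setAppResource("C:\\jobs\\my-job.jar")   // jar containing the job
          .setMainClass("com.example.MyJob")        // entry point inside that jar
          .setMaster("local[*]")                    // or "spark://<master-host>:7077"
          .launch()                                 // spawns spark-submit, returns a java.lang.Process
        val exitCode = process.waitFor()            // block until spark-submit finishes
        println(s"spark-submit exited with code $exitCode")
      }
    }

With local[*] nothing has to be started beforehand, which is why spark-shell "just works"; with a spark://... master URL the standalone master and worker must already be running (see the commands near the top of this thread).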
> >> >>
> >> >> Thanks & regards
> >> >> Arko
> >> >>
> >> >>
> >> >
> >> >
> >> > --
> >> > Cell : 425-233-8271
> >> > Twitter: https://twitter.com/holdenkarau
> >> >
> >>
> >>
> >
>
