Can you run the Windows batch files (e.g. spark-submit.cmd) from the Cygwin
shell?
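
For example, from a Cygwin prompt something like this (a rough sketch; the
install path is taken from your mails below and --version just stands in for
your real arguments) hands the work to the Windows-native script:

cd /c/spark-1.4.0-bin-hadoop2.3
cmd /c 'bin\spark-submit.cmd' --version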

On Tue, Jul 28, 2015 at 7:26 PM, Proust GZ Feng <pf...@cn.ibm.com> wrote:

> Hi, Owen
>
> Adding back the Cygwin classpath detection gets past the issue mentioned
> before, but there seems to be a lack of further support in the launch
> library; see the stack trace below:
>
> LAUNCH_CLASSPATH:
> C:\spark-1.4.0-bin-hadoop2.3\lib\spark-assembly-1.4.0-hadoop2.3.0.jar
> java -cp
> C:\spark-1.4.0-bin-hadoop2.3\lib\spark-assembly-1.4.0-hadoop2.3.0.jar
> org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit
> --driver-class-path ../thirdparty/lib/db2-jdbc4-95fp6a/db2jcc4.jar
> --properties-file conf/spark.properties
> target/scala-2.10/price-scala-assembly-15.4.0-SNAPSHOT.jar
> Exception in thread "main" java.lang.IllegalStateException: Library
> directory 'C:\c\spark-1.4.0-bin-hadoop2.3\lib_managed\jars' does not
> exist.
>         at
> org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:229)
>         at
> org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:215)
>         at
> org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:115)
>         at
> org.apache.spark.launcher.SparkSubmitCommandBuilder.buildSparkSubmitCommand(SparkSubmitCommandBuilder.java:192)
>         at
> org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:117)
>         at org.apache.spark.launcher.Main.main(Main.java:74)
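>
> The 'C:\c\spark-1.4.0-bin-hadoop2.3\lib_managed\jars' path looks like the
> POSIX-style SPARK_HOME ('/c/...') being resolved against the Windows drive
> inside the launcher, so perhaps SPARK_HOME needs converting as well, e.g.
> (an untested sketch):
>
> if $cygwin; then
>   SPARK_HOME="`cygpath -w "$SPARK_HOME"`"  # untested; other uses of SPARK_HOME in the scripts may still need the POSIX form
> fi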
>
> Thanks
> Proust
>
>
>
>
> From:        Sean Owen <so...@cloudera.com>
> To:        Proust GZ Feng/China/IBM@IBMCN
> Cc:        user <user@spark.apache.org>
> Date:        07/28/2015 06:54 PM
> Subject:        Re: NO Cygwin Support in bin/spark-class in Spark 1.4.0
> ------------------------------
>
>
>
> Does adding back the Cygwin detection and this clause make it work?
>
> if $cygwin; then
>  CLASSPATH="`cygpath -wp "$CLASSPATH"`"
> fi
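>
> Roughly, in 1.4.0's spark-class the equivalent would sit where
> LAUNCH_CLASSPATH is built, something like this (a sketch of the pre-1.4
> logic, not the exact script):
>
> case "`uname`" in
>   CYGWIN*) cygwin=true ;;
>   *)       cygwin=false ;;
> esac
>
> if $cygwin; then
>   # java.exe cannot read POSIX-style paths, so convert the whole path list.
>   LAUNCH_CLASSPATH="`cygpath -wp "$LAUNCH_CLASSPATH"`"
> fi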
>
> If so I imagine that's fine to bring back, if that's still needed.
>
> On Tue, Jul 28, 2015 at 9:49 AM, Proust GZ Feng <pf...@cn.ibm.com> wrote:
> > Thanks Owen, the problem under Cygwin is that when running spark-submit
> > under 1.4.0, it simply reports
> >
> > Error: Could not find or load main class org.apache.spark.launcher.Main
> >
> > This is because under Cygwin spark-class sets LAUNCH_CLASSPATH to
> > "/c/spark-1.4.0-bin-hadoop2.3/lib/spark-assembly-1.4.0-hadoop2.3.0.jar".
> > But the Windows java executable cannot recognize that POSIX-style
> > classpath, so the command below simply errors out:
> >
> >  java -cp
> > /c/spark-1.4.0-bin-hadoop2.3/lib/spark-assembly-1.4.0-hadoop2.3.0.jar
> > org.apache.spark.launcher.Main
> > Error: Could not find or load main class org.apache.spark.launcher.Main
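> >
> > With the path converted via cygpath the class is found, which is why
> > restoring the old detection helps, e.g. (illustrative only, same jar as
> > above):
> >
> > java -cp "`cygpath -wp /c/spark-1.4.0-bin-hadoop2.3/lib/spark-assembly-1.4.0-hadoop2.3.0.jar`" org.apache.spark.launcher.Main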
> >
> > Thanks
> > Proust
> >
> >
> >
> > From:        Sean Owen <so...@cloudera.com>
> > To:        Proust GZ Feng/China/IBM@IBMCN
> > Cc:        user <user@spark.apache.org>
> > Date:        07/28/2015 02:20 PM
> > Subject:        Re: NO Cygwin Support in bin/spark-class in Spark 1.4.0
> > ________________________________
> >
> >
> >
> > It wasn't removed, but rewritten. Cygwin is just a distribution of
> > POSIX-related utilities, so you should be able to use the normal .sh
> > scripts. In any event, you didn't say what the problem is.
> >
> >> On Tue, Jul 28, 2015 at 5:19 AM, Proust GZ Feng <pf...@cn.ibm.com> wrote:
> >> Hi, Spark Users
> >>
> >> Looks like Spark 1.4.0 cannot work with Cygwin due to the removal of
> >> Cygwin support in bin/spark-class.
> >>
> >> The changeset is
> >>
> >>
> >> https://github.com/apache/spark/commit/517975d89d40a77c7186f488547eed11f79c1e97#diff-fdf4d3e600042c63ffa17b692c4372a3
> >>
> >> The changeset said "Add a library for launching Spark jobs
> >> programmatically", but how can it be used under Cygwin?
> >> I'm wondering whether any solutions are available to make it work on Windows.
> >>
> >>
> >> Thanks
> >> Proust
> >
>
>
>
>


-- 
Marcelo
