Thanks, Sree!

Are you able to run your applications using spark-submit? Even after we
were able to build successfully, we ran into problems with running the
spark-submit script. If everything worked correctly for you, we can hope
that things will be smoother when 1.4.0 is made generally available.
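
For reference, a typical invocation on our side looks roughly like this
(the class name, JAR, and master below are placeholders, not our actual
application):

  bin\spark-submit.cmd --class com.example.MyApp --master local[*] myapp.jar

It is with this kind of invocation that we hit the "Could not find or load
main class org.apache.spark.deploy.SparkSubmit" error quoted further down.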

arun

On Thu, Apr 16, 2015 at 10:18 PM, Sree V <sree_at_ch...@yahoo.com> wrote:

> The Spark 'master' branch (i.e. v1.4.0) builds successfully on Windows 8.1
> (Intel i7, 64-bit) with Oracle JDK 8u45, using MAVEN_OPTS but without the
> "-XX:ReservedCodeCacheSize=1g" flag. The build takes about 33 minutes.
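>
> A sketch of such a build from a Windows command prompt (the memory
> setting here is illustrative only):
>
>   set MAVEN_OPTS=-Xmx2g
>   mvn -DskipTests clean package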
>
> Thanking you.
>
> With Regards
> Sree
>
>   On Thursday, April 16, 2015 9:07 PM, Arun Lists <lists.a...@gmail.com>
> wrote:
>
>
> Here is what I got from the engineer who worked on building Spark and
> using it on Windows:
>
> 1)  Hadoop's winutils.exe is needed on Windows, even for local files,
> and you have to set hadoop.home.dir in spark-class2.cmd after downloading
> the Hadoop binaries plus winutils, by adding "-Dhadoop.home.dir=<dir>" to
> the two lines with $RUNNER near the end (see the sketch after this list).
> 2)  Java/Spark cannot delete the Spark temporary files and throws an
> exception (the program still works, though). Manual clean-up works just
> fine, and it is not a permissions issue, since the process has rights to
> create the files (I have also tried using my own directory rather than
> the default; same error).
> 3)  I tried building Spark again and have attached the log; I don't get
> any errors, just warnings. However, when I try to use that JAR, I just
> get the error message "Error: Could not find or load main class
> org.apache.spark.deploy.SparkSubmit".
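>
> For item 1, the change is roughly of this shape (a sketch only; C:\hadoop
> is a placeholder for the directory that contains bin\winutils.exe, and
> the real RUNNER lines in spark-class2.cmd carry more arguments than shown
> here):
>
>   rem appended to the two java invocations (the lines referencing RUNNER)
>   rem near the end of spark-class2.cmd:
>   -Dhadoop.home.dir=C:\hadoop
>
> Setting the HADOOP_HOME environment variable to the same directory before
> launching the scripts should have the same effect, since Hadoop falls
> back to it when hadoop.home.dir is not set.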
>
> On Thu, Apr 16, 2015 at 12:19 PM, Arun Lists <lists.a...@gmail.com> wrote:
>
> Thanks, Matei! We'll try that and let you know if it works. You are
> correct in inferring that some of the problems we had were with
> dependencies.
>
> We also had problems with the spark-submit scripts. I will get the details
> from the engineer who worked on the Windows builds and provide them to you.
>
> arun
>
>
> On Thu, Apr 16, 2015 at 10:44 AM, Matei Zaharia <matei.zaha...@gmail.com>
> wrote:
>
> You could build Spark with Scala 2.11 on Mac/Linux and transfer it over
> to Windows. AFAIK it should build on Windows too; the only problem is that
> Maven might take a long time to download dependencies. What errors are you
> seeing?
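>
> For the Scala 2.11 build on Mac/Linux, the steps documented for Spark 1.3
> were roughly these (the Hadoop profile is illustrative; pick the one that
> matches your cluster):
>
>   ./dev/change-version-to-2.11.sh
>   mvn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
>
> The resulting build can then be copied over to the Windows machines.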
>
> Matei
>
> > On Apr 16, 2015, at 9:23 AM, Arun Lists <lists.a...@gmail.com> wrote:
> >
> > We run Spark on Mac and Linux but also need to run it on Windows 8.1
> > and Windows Server. We ran into problems with the Scala 2.10 binary
> > bundle for Spark 1.3.0 but managed to get it working. However, on
> > Mac/Linux, we are on Scala 2.11.6 (we built Spark from the sources). On
> > Windows, despite our best efforts, we cannot get Spark 1.3.0 as built
> > from the sources working with Scala 2.11.6. Spark has too many moving
> > parts and dependencies!
> >
> > When can we expect to see a binary bundle for Spark 1.3.0 that is built
> > for Scala 2.11.6? I read somewhere that the only reason Spark 1.3.0 is
> > still built for Scala 2.10 is that Kafka is still on Scala 2.10. For
> > those of us who don't use Kafka, can we have a Scala 2.11 bundle?
> >
> > If there isn't an official bundle arriving any time soon, can someone
> > who has built it for Windows 8.1 successfully please share it with the
> > group?
> >
> > Thanks,
> > arun
> >
>
