Re: Build spark source code with scala 2.11

2019-03-12 Thread Stephen Boesch
You might have better luck downloading the 2.4.X branch
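
A minimal sketch of that route, assuming the official apache/spark GitHub
mirror (Scala 2.11 is still supported on the 2.4 line, unlike master):

    git clone https://github.com/apache/spark.git
    cd spark
    git checkout branch-2.4
    ./build/sbt package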

On Tue., 12 Mar 2019 at 16:39, swastik mittal wrote:

> Then is Spark's MLlib compatible with Scala 2.12? Or can I change the
> Spark version from 3.0 to 2.3 or 2.4 in my local spark/master?


Re: Build spark source code with scala 2.11

2019-03-12 Thread swastik mittal
Then is Spark's MLlib compatible with Scala 2.12? Or can I change the
Spark version from 3.0 to 2.3 or 2.4 in my local spark/master?






Re: Build spark source code with scala 2.11

2019-03-12 Thread Stephen Boesch
I think Scala 2.11 support was removed in spark3.0/master.

On Tue., 12 Mar 2019 at 16:26, swastik mittal wrote:

> I am trying to build Spark with build/sbt package, after changing the
> Scala version to 2.11 in pom.xml, because my application jars use Scala
> 2.11. But building the Spark code gives an error in sql saying "A method
> with a varargs annotation produces a forwarder method with the same
> signature (exprs:
> Array[org.apache.spark.sql.Column])org.apache.spark.sql.Column as an
> existing method." in UserDefinedFunction.scala. I even tried building with
> the -Dscala parameter to change the Scala version, but it gives the same
> error. How do I change the Spark and Scala versions and build the Spark
> source code correctly? Any help is appreciated.
>
> Thanks
>


Build spark source code with scala 2.11

2019-03-12 Thread swastik mittal
I am trying to build Spark with build/sbt package, after changing the
Scala version to 2.11 in pom.xml, because my application jars use Scala
2.11. But building the Spark code gives an error in sql saying "A method
with a varargs annotation produces a forwarder method with the same
signature (exprs:
Array[org.apache.spark.sql.Column])org.apache.spark.sql.Column as an
existing method." in UserDefinedFunction.scala. I even tried building with
the -Dscala parameter to change the Scala version, but it gives the same
error. How do I change the Spark and Scala versions and build the Spark
source code correctly? Any help is appreciated.

Thanks
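
A minimal sketch of the usual way to switch Scala versions on a branch
that supports both, e.g. branch-2.4 (hand-editing pom.xml tends to miss
the per-module POMs):

    # rewrites the Scala version across every module POM
    ./dev/change-scala-version.sh 2.11
    ./build/sbt package

Note this only helps on branches that still support 2.11; on master that
support has been removed.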






Re: build spark source code

2017-11-22 Thread Jörn Franke
You can check whether Apache Bigtop provides something like this for Spark
on Windows (though probably based on mvn rather than sbt).

> On 23 Nov 2017, at 03:34, Michael Artz wrote:
> 
> It would be nice if I could download the Spark source code from GitHub,
> then build it with sbt on my Windows machine, and use IntelliJ to make
> little modifications to the code base. I have installed Spark on Windows
> quite a few times before, but I just use the packaged artifact. Has anyone
> built the source code on a Windows machine before?




build spark source code

2017-11-22 Thread Michael Artz
It would be nice if I could download the Spark source code from GitHub,
then build it with sbt on my Windows machine, and use IntelliJ to make
little modifications to the code base. I have installed Spark on Windows
quite a few times before, but I just use the packaged artifact. Has anyone
built the source code on a Windows machine before?
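
A rough sketch of that workflow, assuming Git plus a locally installed sbt
on Windows (build/sbt is a bash script, so it won't run from cmd.exe):

    git clone https://github.com/apache/spark.git
    cd spark
    REM use a local sbt install; Maven also works: mvn -DskipTests package
    sbt package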


Re: How to deploy self-build spark source code on EC2

2015-04-28 Thread Nicholas Chammas
[-dev] [+user]

This is a question for the user list, not the dev list.

Use the --spark-version and --spark-git-repo options to specify your own
repo and hash to deploy.

Source code link.
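
A rough sketch of such a launch (the key pair, identity file, fork URL,
commit hash, and cluster name are all placeholders):

    ./ec2/spark-ec2 -k my-keypair -i ~/.ssh/my-keypair.pem \
      --spark-git-repo=https://github.com/yourname/spark \
      --spark-version=<commit-hash-to-deploy> \
      launch my-cluster

With --spark-git-repo set, --spark-version is read as the git commit hash
to check out rather than a release number.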


Nick

On Tue, Apr 28, 2015 at 12:14 PM Bo Fu b...@uchicago.edu wrote:

> Hi all,
>
> I have an issue. I added some timestamps in the Spark source code and
> built it using:
>
> mvn package -DskipTests
>
> I checked the new version on my own machine and it works. However, when I
> ran Spark on EC2, the Spark code the EC2 machines ran was the original
> version.
>
> Does anyone know how to deploy the changed Spark source code to EC2?
> Thanks a lot
>
>
> Bo Fu


Re: Build spark source code with Maven in Intellij Idea

2015-01-08 Thread Sean Owen
Popular topic in the last 48 hours! Just about 20 minutes ago I
collected some recent information on just this topic into a pull
request.

https://github.com/apache/spark/pull/3952

On Thu, Jan 8, 2015 at 2:24 PM, Todd  wrote:
> Hi,
> I have imported the Spark source code into IntelliJ IDEA as an SBT
> project. I tried to do a Maven install in IntelliJ IDEA by clicking
> Install on the Spark Project Parent POM (root), but it failed.
> I would like to ask which profiles should be checked. What I want to
> achieve is starting Spark in the IDE against Hadoop 2.4 on my local
> machine. At this point I only care about Hadoop 2.4, not HBase or Hive...




Build spark source code with Maven in Intellij Idea

2015-01-08 Thread Todd
Hi,
I have imported the Spark source code into IntelliJ IDEA as an SBT project.
I tried to do a Maven install in IntelliJ IDEA by clicking Install on the
Spark Project Parent POM (root), but it failed.
I would like to ask which profiles should be checked. What I want to
achieve is starting Spark in the IDE against Hadoop 2.4 on my local
machine. At this point I only care about Hadoop 2.4, not HBase or Hive...
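
For reference, the command-line equivalent of that profile selection from
the Spark 1.x build docs is roughly:

    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package

The same profiles can be ticked in IntelliJ's Maven tool window; optional
profiles such as hive can simply be left unchecked.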