Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
Good news - and Java 8 as well. I saw Matei after his talk at Scala Days,
and he said he would look into a 2.11 default, but it seems that is already
the plan. Scala 2.12 is getting closer as well.

On Mon, May 16, 2016 at 2:55 PM, Ted Yu  wrote:

> For 2.0, I believe that is the case.
>
> Jenkins jobs have been running against Scala 2.11:
>
> [INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ 
> java8-tests_2.11 ---
>
>
> FYI
>
>
> On Mon, May 16, 2016 at 2:45 PM, Eric Richardson 
> wrote:
>
>> On Thu, May 12, 2016 at 9:23 PM, Luciano Resende 
>> wrote:
>>
>>> Spark has moved to build using Scala 2.11 by default in master/trunk.
>>>
>>
>> Does this mean that the pre-built binaries for download will move to
>> 2.11 as well?
>>
>>
>>>
>>>
>>> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk
>>> and you might be missing some modules/profiles for your build. What command
>>> did you use to build?
>>>
>>> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
>>> m.vijayaragh...@gmail.com> wrote:
>>>
 Hello All,

 I built Spark from the source code available at
 https://github.com/apache/spark/. Although I didn't specify the
 "-Dscala-2.11" option (to build with Scala 2.11), I can see from the build
 messages that it ended up using Scala 2.11. Now, what Spark version should
 I use in my application's sbt build? I tried the following:

 val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
 val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"

 and scalaVersion := "2.11.8"

 But this version setting gives an sbt error:

 unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT

 I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
 Does this mean the only option is to put all the required jars in the lib
 folder (unmanaged dependencies)?

 Regards,
 Raghava.

>>>
>>>
>>>
>>> --
>>> Luciano Resende
>>> http://twitter.com/lresende1975
>>> http://lresende.blogspot.com/
>>>
>>
>>
>


Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Ted Yu
For 2.0, I believe that is the case.

Jenkins jobs have been running against Scala 2.11:

[INFO] --- scala-maven-plugin:3.2.2:testCompile
(scala-test-compile-first) @ java8-tests_2.11 ---


FYI


On Mon, May 16, 2016 at 2:45 PM, Eric Richardson 
wrote:

> On Thu, May 12, 2016 at 9:23 PM, Luciano Resende 
> wrote:
>
>> Spark has moved to build using Scala 2.11 by default in master/trunk.
>>
>
> Does this mean that the pre-built binaries for download will move to
> 2.11 as well?
>
>
>>
>>
>> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
>> you might be missing some modules/profiles for your build. What command did
>> you use to build?
>>
>> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
>> m.vijayaragh...@gmail.com> wrote:
>>
>>> Hello All,
>>>
>>> I built Spark from the source code available at
>>> https://github.com/apache/spark/. Although I didn't specify the
>>> "-Dscala-2.11" option (to build with Scala 2.11), I can see from the build
>>> messages that it ended up using Scala 2.11. Now, what Spark version should
>>> I use in my application's sbt build? I tried the following:
>>>
>>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>>
>>> and scalaVersion := "2.11.8"
>>>
>>> But this version setting gives an sbt error:
>>>
>>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>>
>>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>>> Does this mean the only option is to put all the required jars in the lib
>>> folder (unmanaged dependencies)?
>>>
>>> Regards,
>>> Raghava.
>>>
>>
>>
>>
>> --
>> Luciano Resende
>> http://twitter.com/lresende1975
>> http://lresende.blogspot.com/
>>
>
>


Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
On Thu, May 12, 2016 at 9:23 PM, Luciano Resende 
wrote:

> Spark has moved to build using Scala 2.11 by default in master/trunk.
>

Does this mean that the pre-built binaries for download will move to
2.11 as well?


>
>
> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
> you might be missing some modules/profiles for your build. What command did
> you use to build?
>
> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
> m.vijayaragh...@gmail.com> wrote:
>
>> Hello All,
>>
>> I built Spark from the source code available at
>> https://github.com/apache/spark/. Although I didn't specify the
>> "-Dscala-2.11" option (to build with Scala 2.11), I can see from the build
>> messages that it ended up using Scala 2.11. Now, what Spark version should
>> I use in my application's sbt build? I tried the following:
>>
>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>
>> and scalaVersion := "2.11.8"
>>
>> But this version setting gives an sbt error:
>>
>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>
>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>> Does this mean the only option is to put all the required jars in the lib
>> folder (unmanaged dependencies)?
>>
>> Regards,
>> Raghava.
>>
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>


Re: sbt for Spark build with Scala 2.11

2016-05-13 Thread Raghava Mutharaju
Thank you for the response.

I used the following command to build from source:

build/mvn -Dhadoop.version=2.6.4 -Phadoop-2.6 -DskipTests clean package

Would this put the required jars in .ivy2 during the build process? If so,
how can I make the Spark distribution runnable, so that I can use it on
other machines as well (make-distribution.sh no longer exists in the Spark
root folder)?
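
A minimal sketch of one way to handle both points (assuming current master,
where the distribution script appears to have moved to
dev/make-distribution.sh; the --name value and profiles below are
illustrative only):

# Build a runnable, relocatable distribution tarball that can be copied to
# other machines.
./dev/make-distribution.sh --name hadoop-2.6 --tgz -Phadoop-2.6 -Dhadoop.version=2.6.4

# 'package' alone does not publish anything to a local repository; the
# 'install' goal copies the built jars into ~/.m2/repository, where sbt can
# resolve 2.0.0-SNAPSHOT from (see the build.sbt sketch below).
build/mvn -Dhadoop.version=2.6.4 -Phadoop-2.6 -DskipTests clean install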

For compiling my application, I put the following lines in build.sbt:

packAutoSettings
val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"

lazy val root = (project in file(".")).
  settings(
    name := "sparkel",
    version := "0.1.0",
    scalaVersion := "2.11.8",
    libraryDependencies += spark,
    libraryDependencies += sparksql
  )
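
For reference, a minimal build.sbt sketch that resolves a locally built
snapshot (this assumes the Spark artifacts were first installed into the
local Maven repository with the 'install' goal above, and it uses %%
consistently so sbt appends the _2.11 suffix itself):

// sbt does not search ~/.m2/repository unless told to
resolvers += Resolver.mavenLocal

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // "provided" is common when the application is launched via spark-submit,
  // which already puts the Spark jars on the classpath
  "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.0.0-SNAPSHOT" % "provided"
)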


Regards,
Raghava.


On Fri, May 13, 2016 at 12:23 AM, Luciano Resende 
wrote:

> Spark has moved to build using Scala 2.11 by default in master/trunk.
>
> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
> you might be missing some modules/profiles for your build. What command did
> you use to build?
>
> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
> m.vijayaragh...@gmail.com> wrote:
>
>> Hello All,
>>
>> I built Spark from the source code available at
>> https://github.com/apache/spark/. Although I didn't specify the
>> "-Dscala-2.11" option (to build with Scala 2.11), I can see from the build
>> messages that it ended up using Scala 2.11. Now, what Spark version should
>> I use in my application's sbt build? I tried the following:
>>
>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>
>> and scalaVersion := "2.11.8"
>>
>> But this version setting gives an sbt error:
>>
>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>
>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>> Does this mean the only option is to put all the required jars in the lib
>> folder (unmanaged dependencies)?
>>
>> Regards,
>> Raghava.
>>
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>



-- 
Regards,
Raghava
http://raghavam.github.io


Re: sbt for Spark build with Scala 2.11

2016-05-12 Thread Luciano Resende
Spark has moved to build using Scala 2.11 by default in master/trunk.

As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
you might be missing some modules/profiles for your build. What command did
you use to build?
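
For what it's worth: a locally built 2.0.0-SNAPSHOT is not published to Maven
Central, so sbt has to be pointed at a repository that actually contains it -
either the local Maven repository after a 'mvn install', or, if nightly
snapshots are being published, the Apache snapshots repository. A rough
sketch for the latter (the repository URL is from memory, please verify it):

resolvers += "Apache Snapshots" at "https://repository.apache.org/snapshots/"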

On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
m.vijayaragh...@gmail.com> wrote:

> Hello All,
>
> I built Spark from the source code available at
> https://github.com/apache/spark/. Although I didn't specify the
> "-Dscala-2.11" option (to build with Scala 2.11), I can see from the build
> messages that it ended up using Scala 2.11. Now, what Spark version should
> I use in my application's sbt build? I tried the following:
>
> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>
> and scalaVersion := "2.11.8"
>
> But this version setting gives an sbt error:
>
> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>
> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
> Does this mean the only option is to put all the required jars in the lib
> folder (unmanaged dependencies)?
>
> Regards,
> Raghava.
>



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/