Re: spark 2.4.3 build fails using java 8 and scala 2.11 with NumberFormatException: Not a version: 9

2019-05-19 Thread Bulldog20630405
After blowing away my m2 repo cache, I was able to build just fine... I
don't know why, but now it works :-)
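
For anyone else hitting this: the "Not a version: 9" message comes from an old-style Java version check in the Scala tooling that zinc launches, which only accepts "1.x"-style strings and fails on a bare major number such as "9". The Scala sketch below only illustrates the shape of that check; it is not the actual scala.util.Properties code, and the idea that a stale zinc server or stale cached artifacts fed a Java 9 version string to the check on this Java 8 host is an assumption, consistent with the cache wipe fixing the build.

object NotAVersionSketch {
  // Mirrors the shape of the old parser: a version string must contain a dot.
  def parts(v: String): (String, String) = {
    val i = v.indexOf('.')
    if (i < 0) throw new NumberFormatException("Not a version: " + v)
    (v.substring(0, i), v.substring(i + 1))
  }

  def main(args: Array[String]): Unit = {
    println(parts("1.8.0_202"))                              // fine: ("1", "8.0_202")
    println(parts(sys.props("java.specification.version")))  // "1.8" on JDK 8; a bare "9" on JDK 9+ throws
  }
}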

On Sun, May 19, 2019 at 10:22 PM Bulldog20630405 
wrote:

> i am trying to build spark 2.4.3 with the following env:
>
>- fedora 29
>- 1.8.0_202
>- spark 2.4.3
>- scala 2.11.12
>- maven 3.5.4
>- hadoop 2.6.5
>
> according to the documentation this can be done with the following
> commands:
> *export TERM=xterm-color*
> *./build/mvn -Pyarn -DskipTests clean package*
>
> however i get the following error (it seems to me that somehow it think i
> am using java 9):
> (note: my real goals is to build spark for hadoop 3; however, i need to
> understand why the default build is failing first)
>
> *[ERROR] Failed to execute goal
> net.alchim31.maven:scala-maven-plugin:3.2.2:compile *(scala-compile-first)*
> on project spark-tags_2.11*: Execution scala-compile-first of goal
> net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: CompileFailed
> -> [Help 1]
>
> [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @
> spark-tags_2.11 ---
> [INFO] Using zinc server for incremental compilation
> [info] 'compiler-interface' not yet compiled for Scala 2.11.12.
> Compiling...
> *error: java.lang.NumberFormatException: Not a version: 9*
> at scala.util.PropertiesTrait$class.parts$1(Properties.scala:184)
> at scala.util.PropertiesTrait$class.isJavaAtLeast(Properties.scala:187)
> at scala.util.Properties$.isJavaAtLeast(Properties.scala:17)
> at
> scala.tools.util.PathResolverBase$Calculated$.javaBootClasspath(PathResolver.scala:276)
> at
> scala.tools.util.PathResolverBase$Calculated$.basis(PathResolver.scala:283)
> at
> scala.tools.util.PathResolverBase$Calculated$.containers$lzycompute(PathResolver.scala:293)
> at
> scala.tools.util.PathResolverBase$Calculated$.containers(PathResolver.scala:293)
> at scala.tools.util.PathResolverBase.containers(PathResolver.scala:309)
> at scala.tools.util.PathResolver.computeResult(PathResolver.scala:341)
> at scala.tools.util.PathResolver.computeResult(PathResolver.scala:332)
> at scala.tools.util.PathResolverBase.result(PathResolver.scala:314)
> at
> scala.tools.nsc.backend.JavaPlatform$class.classPath(JavaPlatform.scala:28)
> at scala.tools.nsc.Global$GlobalPlatform.classPath(Global.scala:115)
> at
> scala.tools.nsc.Global.scala$tools$nsc$Global$$recursiveClassPath(Global.scala:131)
> at scala.tools.nsc.Global$GlobalMirror.rootLoader(Global.scala:64)
> at scala.reflect.internal.Mirrors$Roots$RootClass.(Mirrors.scala:307)
> at
> scala.reflect.internal.Mirrors$Roots.RootClass$lzycompute(Mirrors.scala:321)
> at scala.reflect.internal.Mirrors$Roots.RootClass(Mirrors.scala:321)
> at
> scala.reflect.internal.Mirrors$Roots$EmptyPackageClass.(Mirrors.scala:330)
> at
> scala.reflect.internal.Mirrors$Roots.EmptyPackageClass$lzycompute(Mirrors.scala:336)
> at
> scala.reflect.internal.Mirrors$Roots.EmptyPackageClass(Mirrors.scala:336)
> at
> scala.reflect.internal.Mirrors$Roots.EmptyPackageClass(Mirrors.scala:276)
> at scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:250)
> at scala.tools.nsc.Global.rootMirror$lzycompute(Global.scala:73)
> at scala.tools.nsc.Global.rootMirror(Global.scala:71)
> at scala.tools.nsc.Global.rootMirror(Global.scala:39)
> at
> scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
> at
> scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
> at
> scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1390)
> at scala.tools.nsc.Global$Run.(Global.scala:1242)
> at scala.tools.nsc.Driver.doCompile(Driver.scala:31)
> at scala.tools.nsc.MainClass.doCompile(Main.scala:23)
> at scala.tools.nsc.Driver.process(Driver.scala:51)
> at scala.tools.nsc.Main.process(Main.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at sbt.compiler.RawCompiler.apply(RawCompiler.scala:33)
> at
> sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:159)
> at
> sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:155)
> at sbt.IO$.withTemporaryDirectory(IO.scala:358)
> at
> sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:155)
> at
> sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:152)
> at sbt.IO$.withTemporaryDirectory(IO.scala:358)
> at
> sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:152)
> at sbt.compiler.IC$.compileInterfaceJar(IncrementalCompiler.scala:58)
> at com.typesafe.zinc.Compiler$.compilerInterface(Compiler.scala:154)
> at com.typesafe.zinc.Compiler$.create(Compiler.scala:55)
> 

spark 2.4.3 build fails using java 8 and scala 2.11 with NumberFormatException: Not a version: 9

2019-05-19 Thread Bulldog20630405
I am trying to build Spark 2.4.3 with the following environment:

   - fedora 29
   - java 1.8.0_202
   - spark 2.4.3
   - scala 2.11.12
   - maven 3.5.4
   - hadoop 2.6.5

according to the documentation this can be done with the following commands:
*export TERM=xterm-color*
*./build/mvn -Pyarn -DskipTests clean package*

However, I get the following error (it seems to me that somehow it thinks I
am using Java 9):
(Note: my real goal is to build Spark for Hadoop 3; however, I need to
understand why the default build is failing first.)

*[ERROR] Failed to execute goal
net.alchim31.maven:scala-maven-plugin:3.2.2:compile *(scala-compile-first)*
on project spark-tags_2.11*: Execution scala-compile-first of goal
net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: CompileFailed
-> [Help 1]

[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @
spark-tags_2.11 ---
[INFO] Using zinc server for incremental compilation
[info] 'compiler-interface' not yet compiled for Scala 2.11.12. Compiling...
*error: java.lang.NumberFormatException: Not a version: 9*
at scala.util.PropertiesTrait$class.parts$1(Properties.scala:184)
at scala.util.PropertiesTrait$class.isJavaAtLeast(Properties.scala:187)
at scala.util.Properties$.isJavaAtLeast(Properties.scala:17)
at
scala.tools.util.PathResolverBase$Calculated$.javaBootClasspath(PathResolver.scala:276)
at
scala.tools.util.PathResolverBase$Calculated$.basis(PathResolver.scala:283)
at
scala.tools.util.PathResolverBase$Calculated$.containers$lzycompute(PathResolver.scala:293)
at
scala.tools.util.PathResolverBase$Calculated$.containers(PathResolver.scala:293)
at scala.tools.util.PathResolverBase.containers(PathResolver.scala:309)
at scala.tools.util.PathResolver.computeResult(PathResolver.scala:341)
at scala.tools.util.PathResolver.computeResult(PathResolver.scala:332)
at scala.tools.util.PathResolverBase.result(PathResolver.scala:314)
at
scala.tools.nsc.backend.JavaPlatform$class.classPath(JavaPlatform.scala:28)
at scala.tools.nsc.Global$GlobalPlatform.classPath(Global.scala:115)
at
scala.tools.nsc.Global.scala$tools$nsc$Global$$recursiveClassPath(Global.scala:131)
at scala.tools.nsc.Global$GlobalMirror.rootLoader(Global.scala:64)
at scala.reflect.internal.Mirrors$Roots$RootClass.<init>(Mirrors.scala:307)
at
scala.reflect.internal.Mirrors$Roots.RootClass$lzycompute(Mirrors.scala:321)
at scala.reflect.internal.Mirrors$Roots.RootClass(Mirrors.scala:321)
at
scala.reflect.internal.Mirrors$Roots$EmptyPackageClass.<init>(Mirrors.scala:330)
at
scala.reflect.internal.Mirrors$Roots.EmptyPackageClass$lzycompute(Mirrors.scala:336)
at scala.reflect.internal.Mirrors$Roots.EmptyPackageClass(Mirrors.scala:336)
at scala.reflect.internal.Mirrors$Roots.EmptyPackageClass(Mirrors.scala:276)
at scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:250)
at scala.tools.nsc.Global.rootMirror$lzycompute(Global.scala:73)
at scala.tools.nsc.Global.rootMirror(Global.scala:71)
at scala.tools.nsc.Global.rootMirror(Global.scala:39)
at
scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
at
scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
at
scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1390)
at scala.tools.nsc.Global$Run.<init>(Global.scala:1242)
at scala.tools.nsc.Driver.doCompile(Driver.scala:31)
at scala.tools.nsc.MainClass.doCompile(Main.scala:23)
at scala.tools.nsc.Driver.process(Driver.scala:51)
at scala.tools.nsc.Main.process(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sbt.compiler.RawCompiler.apply(RawCompiler.scala:33)
at
sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:159)
at
sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:155)
at sbt.IO$.withTemporaryDirectory(IO.scala:358)
at
sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:155)
at
sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:152)
at sbt.IO$.withTemporaryDirectory(IO.scala:358)
at
sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:152)
at sbt.compiler.IC$.compileInterfaceJar(IncrementalCompiler.scala:58)
at com.typesafe.zinc.Compiler$.compilerInterface(Compiler.scala:154)
at com.typesafe.zinc.Compiler$.create(Compiler.scala:55)
at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:42)
at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:42)
at com.typesafe.zinc.Cache.get(Cache.scala:41)
at com.typesafe.zinc.Compiler$.apply(Compiler.scala:42)
at com.typesafe.zinc.Main$.run(Main.scala:96)
at com.typesafe.zinc.Nailgun$.zinc(Nailgun.scala:95)
at 

Re: Build spark source code with scala 2.11

2019-03-12 Thread Stephen Boesch
You might have better luck downloading the 2.4.X branch

On Tue, Mar 12, 2019 at 4:39 PM swastik mittal wrote:

> Then are the mlib of spark compatible with scala 2.12? Or can I change the
> spark version from spark3.0 to 2.3 or 2.4 in local spark/master?
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>


Re: Build spark source code with scala 2.11

2019-03-12 Thread swastik mittal
Then is Spark's MLlib compatible with Scala 2.12? Or can I change the
Spark version from Spark 3.0 to 2.3 or 2.4 in my local spark/master?
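
A hedged build.sbt sketch of the cross-building angle: Spark 2.4.x publishes artifacts for both Scala 2.11 and 2.12, including MLlib, so an application can be cross-built against a released 2.4.x rather than pinning master, whose 2.11 support was removed as noted above. The version numbers below are examples, not values from this thread.

crossScalaVersions := Seq("2.11.12", "2.12.8")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "2.4.0" % "provided"
)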



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



Re: Build spark source code with scala 2.11

2019-03-12 Thread Stephen Boesch
I think Scala 2.11 support was removed in spark3.0/master.

On Tue, Mar 12, 2019 at 4:26 PM swastik mittal wrote:

> I am trying to build my spark using build/sbt package, after changing the
> scala versions to 2.11 in pom.xml because my applications jar files use
> scala 2.11. But building the spark code gives an error in sql  saying "A
> method with a varargs annotation produces a forwarder method with the same
> signature (exprs:
> Array[org.apache.spark.sql.Column])org.apache.spark.sql.Column as an
> existing method." in UserDefinedFunction.scala. I even tried building with
> using Dscala parameter to change the version of scala but it gives the same
> error. How do I change the spark and scala version and build the spark
> source code correctly? Any help is appreciated.
>
> Thanks
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>


Build spark source code with scala 2.11

2019-03-12 Thread swastik mittal
I am trying to build my Spark using build/sbt package, after changing the
Scala version to 2.11 in pom.xml, because my application's jar files use
Scala 2.11. But building the Spark code gives an error in sql saying "A
method with a varargs annotation produces a forwarder method with the same
signature (exprs:
Array[org.apache.spark.sql.Column])org.apache.spark.sql.Column as an
existing method." in UserDefinedFunction.scala. I even tried building with
the -Dscala parameter to change the Scala version, but it gives the same
error. How do I change the Spark and Scala versions and build the Spark
source code correctly? Any help is appreciated.

Thanks



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



Re: Spark 2.0 Scala 2.11 and Kafka 0.10 Scala 2.10

2017-02-08 Thread Cody Koeninger
Pretty sure there was no 0.10.0.2 release of Apache Kafka.  If that's
a Hortonworks-modified version, you may get better results asking in a
Hortonworks-specific forum.  The Scala version of Kafka shouldn't be
relevant either way, though.

On Wed, Feb 8, 2017 at 5:30 PM, u...@moosheimer.com <u...@moosheimer.com> wrote:
> Dear devs,
>
> is it possible to use Spark 2.0.2 Scala 2.11 and consume messages from kafka
> server 0.10.0.2 running on Scala 2.10?
> I tried this the last two days by using createDirectStream and can't get no
> message out of kafka?!
>
> I'm using HDP 2.5.3 running kafka_2.10-0.10.0.2.5.3.0-37 and Spark 2.0.2.
>
> Uwe
>

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



Spark 2.0 Scala 2.11 and Kafka 0.10 Scala 2.10

2017-02-08 Thread u...@moosheimer.com
Dear devs,

Is it possible to use Spark 2.0.2 (Scala 2.11) and consume messages from a
Kafka server 0.10.0.2 running on Scala 2.10?
I tried this for the last two days using createDirectStream and can't get
any messages out of Kafka?!

I'm using HDP 2.5.3 running kafka_2.10-0.10.0.2.5.3.0-37 and Spark 2.0.2.

Uwe
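
For reference, a minimal sketch of the 0.10 direct stream API discussed here, assuming the spark-streaming-kafka-0-10_2.11 dependency matching the Spark version is on the classpath. The broker address, topic, and group id are placeholders, not values from this thread, and HDP-specific configuration is not covered. One thing worth ruling out when no messages arrive is the offset position: with a fresh group id and the default auto.offset.reset of "latest", nothing is printed unless new records are produced during the test.

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object KafkaDirectStreamSketch {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf().setAppName("kafka-0-10-sketch"), Seconds(5))

    // All values below are placeholders for illustration only.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "broker-host:6667",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "example-group",
      "auto.offset.reset"  -> "earliest"   // read from the beginning while debugging
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("example-topic"), kafkaParams)
    )

    // Print a sample of (key, value) pairs from each batch.
    stream.map(record => (record.key, record.value)).print()

    ssc.start()
    ssc.awaitTermination()
  }
}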



Re: Assembly for Kafka >= 0.10.0, Spark 2.2.0, Scala 2.11

2017-01-18 Thread Cody Koeninger
Spark 2.2 hasn't been released yet, has it?

Python support in Kafka DStreams for 0.10 is probably never coming; there's
a JIRA ticket about this.

As for stable, that's hard to say.  It was quite a few releases before 0.8
was marked stable, even though it underwent little change.

On Wed, Jan 18, 2017 at 2:21 AM, Karamba <phantom...@web.de> wrote:
> |Hi, I am looking for an assembly for Spark 2.2.0 with Scala 2.11. I
> can't find one in MVN Repository. Moreover, "org.apache.spark" %%
> "spark-streaming-kafka-0-10_2.11" % "2.1.0 shows that even sbt does not
> find one: [error] (*:update) sbt.ResolveException: unresolved
> dependency: org.apache.spark#spark-streaming-kafka-0-10_2.11_2.11;2.1.0:
> not found Where do I find that a library? Thanks and best regards,
> karamba PS: Does anybody know when python support becomes available in
> spark-streaming-kafka-0-10 and when it will reach "stable"? |
>
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



Assembly for Kafka >= 0.10.0, Spark 2.2.0, Scala 2.11

2017-01-18 Thread Karamba
Hi, I am looking for an assembly for Spark 2.2.0 with Scala 2.11. I
can't find one in MVN Repository. Moreover, "org.apache.spark" %%
"spark-streaming-kafka-0-10_2.11" % "2.1.0" shows that even sbt does not
find one:

[error] (*:update) sbt.ResolveException: unresolved dependency:
org.apache.spark#spark-streaming-kafka-0-10_2.11_2.11;2.1.0: not found

Where do I find that library? Thanks and best regards,
karamba

PS: Does anybody know when Python support becomes available in
spark-streaming-kafka-0-10 and when it will reach "stable"?
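
A hedged build.sbt sketch related to the unresolved dependency above: the doubled suffix in spark-streaming-kafka-0-10_2.11_2.11 usually means %% (which appends the Scala suffix automatically) was combined with an artifact id that already ends in _2.11. Either line below avoids that; 2.1.0 is used because, as noted in the reply, Spark 2.2.0 was not yet released at the time.

libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.1.0"
// or, spelling the Scala suffix out and using a single %:
// libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.1.0"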


-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



Re: Is there a way to run a jar built for scala 2.11 on spark 1.6.1 (which is using 2.10?)

2016-05-18 Thread Ted Yu
Depending on the version of Hadoop you use, you may find a tarball prebuilt
with Scala 2.11:

https://s3.amazonaws.com/spark-related-packages

FYI

On Wed, May 18, 2016 at 3:35 PM, Koert Kuipers <ko...@tresata.com> wrote:

> no but you can trivially build spark 1.6.1 for scala 2.11
>
> On Wed, May 18, 2016 at 6:11 PM, Sergey Zelvenskiy <ser...@actions.im>
> wrote:
>
>>
>>
>


Re: Is there a way to run a jar built for scala 2.11 on spark 1.6.1 (which is using 2.10?)

2016-05-18 Thread Koert Kuipers
No, but you can trivially build Spark 1.6.1 for Scala 2.11.

On Wed, May 18, 2016 at 6:11 PM, Sergey Zelvenskiy <ser...@actions.im>
wrote:

>
>


Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
Good news - and Java 8 as well. I saw Matei after his talk at Scala Days
and he said he would look into making 2.11 the default, but it seems that is
already the plan. Scala 2.12 is getting closer as well.

On Mon, May 16, 2016 at 2:55 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> For 2.0, I believe that is the case.
>
> Jenkins jobs have been running against Scala 2.11:
>
> [INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ 
> java8-tests_2.11 ---
>
>
> FYI
>
>
> On Mon, May 16, 2016 at 2:45 PM, Eric Richardson <ekrichard...@gmail.com>
> wrote:
>
>> On Thu, May 12, 2016 at 9:23 PM, Luciano Resende <luckbr1...@gmail.com>
>> wrote:
>>
>>> Spark has moved to build using Scala 2.11 by default in master/trunk.
>>>
>>
>> Does this mean that the pre-built binaries for download will also move to
>> 2.11 as well?
>>
>>
>>>
>>>
>>> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk
>>> and you might be missing some modules/profiles for your build. What command
>>> did you use to build ?
>>>
>>> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
>>> m.vijayaragh...@gmail.com> wrote:
>>>
>>>> Hello All,
>>>>
>>>> I built Spark from the source code available at
>>>> https://github.com/apache/spark/. Although I haven't specified the
>>>> "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
>>>> see that it ended up using Scala 2.11. Now, for my application sbt, what
>>>> should be the spark version? I tried the following
>>>>
>>>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>>>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>>>
>>>> and scalaVersion := "2.11.8"
>>>>
>>>> But this setting of spark version gives sbt error
>>>>
>>>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>>>
>>>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>>>> Does this mean, the only option is to put all the required jars in the lib
>>>> folder (unmanaged dependencies)?
>>>>
>>>> Regards,
>>>> Raghava.
>>>>
>>>
>>>
>>>
>>> --
>>> Luciano Resende
>>> http://twitter.com/lresende1975
>>> http://lresende.blogspot.com/
>>>
>>
>>
>


Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Ted Yu
For 2.0, I believe that is the case.

Jenkins jobs have been running against Scala 2.11:

[INFO] --- scala-maven-plugin:3.2.2:testCompile
(scala-test-compile-first) @ java8-tests_2.11 ---


FYI


On Mon, May 16, 2016 at 2:45 PM, Eric Richardson <ekrichard...@gmail.com>
wrote:

> On Thu, May 12, 2016 at 9:23 PM, Luciano Resende <luckbr1...@gmail.com>
> wrote:
>
>> Spark has moved to build using Scala 2.11 by default in master/trunk.
>>
>
> Does this mean that the pre-built binaries for download will also move to
> 2.11 as well?
>
>
>>
>>
>> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
>> you might be missing some modules/profiles for your build. What command did
>> you use to build ?
>>
>> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
>> m.vijayaragh...@gmail.com> wrote:
>>
>>> Hello All,
>>>
>>> I built Spark from the source code available at
>>> https://github.com/apache/spark/. Although I haven't specified the
>>> "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
>>> see that it ended up using Scala 2.11. Now, for my application sbt, what
>>> should be the spark version? I tried the following
>>>
>>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>>
>>> and scalaVersion := "2.11.8"
>>>
>>> But this setting of spark version gives sbt error
>>>
>>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>>
>>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>>> Does this mean, the only option is to put all the required jars in the lib
>>> folder (unmanaged dependencies)?
>>>
>>> Regards,
>>> Raghava.
>>>
>>
>>
>>
>> --
>> Luciano Resende
>> http://twitter.com/lresende1975
>> http://lresende.blogspot.com/
>>
>
>


Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
On Thu, May 12, 2016 at 9:23 PM, Luciano Resende <luckbr1...@gmail.com>
wrote:

> Spark has moved to build using Scala 2.11 by default in master/trunk.
>

Does this mean that the pre-built binaries for download will also move to
2.11?


>
>
> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
> you might be missing some modules/profiles for your build. What command did
> you use to build ?
>
> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
> m.vijayaragh...@gmail.com> wrote:
>
>> Hello All,
>>
>> I built Spark from the source code available at
>> https://github.com/apache/spark/. Although I haven't specified the
>> "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
>> see that it ended up using Scala 2.11. Now, for my application sbt, what
>> should be the spark version? I tried the following
>>
>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>
>> and scalaVersion := "2.11.8"
>>
>> But this setting of spark version gives sbt error
>>
>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>
>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>> Does this mean, the only option is to put all the required jars in the lib
>> folder (unmanaged dependencies)?
>>
>> Regards,
>> Raghava.
>>
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>


Re: sbt for Spark build with Scala 2.11

2016-05-13 Thread Raghava Mutharaju
Thank you for the response.

I used the following command to build from source

build/mvn -Dhadoop.version=2.6.4 -Phadoop-2.6 -DskipTests clean package

Would this put the required jars in .ivy2 during the build process? If
so, how can I make the Spark distribution runnable, so that I can use it on
other machines as well (make-distribution.sh no longer exists in the Spark
root folder)?

For compiling my application, I put the following lines in the build.sbt:

packAutoSettings
val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"

lazy val root = (project in file(".")).
  settings(
    name := "sparkel",
    version := "0.1.0",
    scalaVersion := "2.11.8",
    libraryDependencies += spark,
    libraryDependencies += sparksql
  )


Regards,
Raghava.
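
A hedged sketch of one way around the unresolved 2.0.0-SNAPSHOT: install the locally built Spark into the local Maven repository (for example with build/mvn -DskipTests clean install) and let sbt resolve it from there, instead of falling back to unmanaged jars in lib. The snippet assumes the same artifacts as the build.sbt above. On the distribution question: in the 2.x source tree the script moved under dev/, as dev/make-distribution.sh.

resolvers += Resolver.mavenLocal

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT",
  "org.apache.spark" %% "spark-sql"  % "2.0.0-SNAPSHOT"   // %% here too, rather than hard-coding the _2.11 suffix
)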


On Fri, May 13, 2016 at 12:23 AM, Luciano Resende <luckbr1...@gmail.com>
wrote:

> Spark has moved to build using Scala 2.11 by default in master/trunk.
>
> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
> you might be missing some modules/profiles for your build. What command did
> you use to build ?
>
> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
> m.vijayaragh...@gmail.com> wrote:
>
>> Hello All,
>>
>> I built Spark from the source code available at
>> https://github.com/apache/spark/. Although I haven't specified the
>> "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
>> see that it ended up using Scala 2.11. Now, for my application sbt, what
>> should be the spark version? I tried the following
>>
>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>
>> and scalaVersion := "2.11.8"
>>
>> But this setting of spark version gives sbt error
>>
>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>
>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>> Does this mean, the only option is to put all the required jars in the lib
>> folder (unmanaged dependencies)?
>>
>> Regards,
>> Raghava.
>>
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>



-- 
Regards,
Raghava
http://raghavam.github.io


Re: sbt for Spark build with Scala 2.11

2016-05-12 Thread Luciano Resende
Spark has moved to build using Scala 2.11 by default in master/trunk.

As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
you might be missing some modules/profiles for your build. What command did
you use to build ?

On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
m.vijayaragh...@gmail.com> wrote:

> Hello All,
>
> I built Spark from the source code available at
> https://github.com/apache/spark/. Although I haven't specified the
> "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
> see that it ended up using Scala 2.11. Now, for my application sbt, what
> should be the spark version? I tried the following
>
> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>
> and scalaVersion := "2.11.8"
>
> But this setting of spark version gives sbt error
>
> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>
> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
> Does this mean, the only option is to put all the required jars in the lib
> folder (unmanaged dependencies)?
>
> Regards,
> Raghava.
>



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


sbt for Spark build with Scala 2.11

2016-05-12 Thread Raghava Mutharaju
Hello All,

I built Spark from the source code available at
https://github.com/apache/spark/. Although I haven't specified the
"-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
see that it ended up using Scala 2.11. Now, for my application's sbt build,
what should the Spark version be? I tried the following:

val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"

and scalaVersion := "2.11.8"

But this setting of the Spark version gives an sbt error:

unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT

I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT. Does
this mean the only option is to put all the required jars in the lib
folder (unmanaged dependencies)?

Regards,
Raghava.


spark w/ scala 2.11 and PackratParsers

2016-05-04 Thread matd
Hi folks,

Our project is a mess of Scala 2.10 and 2.11, so I tried to switch
everything to 2.11.

I had some exasperating errors like this:

java.lang.NoClassDefFoundError:
org/apache/spark/sql/execution/datasources/DDLParser
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:208)
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
at org.apache.spark.sql.SQLContext$.getOrCreate(SQLContext.scala:1295)

... that I was unable to fix, until I figured out that this error came first:

java.lang.NoClassDefFoundError: scala/util/parsing/combinator/PackratParsers
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

...that I finally managed to fix by adding this dependency:
"org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4"

As this is not documented anywhere, I'd like to know if it's just a missing
doc somewhere, or if it's hiding another problem that will jump out at my
face at some point?

Mathieu
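
For context: the missing class is expected rather than a sign of a deeper problem. In Scala 2.11 the parser combinators, including PackratParsers, were moved out of the standard library into a separate module, so code paths that still use scala.util.parsing (such as the old DDLParser reached through SQLContext in the trace above) need the module declared explicitly, as in the line already quoted in the message:

libraryDependencies += "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4"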




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark-w-scala-2-11-and-PackratParsers-tp26877.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Launching EC2 instances with Spark compiled for Scala 2.11

2016-01-25 Thread Darren Govoni


Why not deploy it, then build a custom distribution with Scala 2.11 and just
overlay it?


Sent from my Verizon Wireless 4G LTE smartphone

 Original message 
From: Nuno Santos <nfssan...@gmail.com> 
Date: 01/25/2016  7:38 AM  (GMT-05:00) 
To: user@spark.apache.org 
Subject: Re: Launching EC2 instances with Spark compiled for Scala 2.11 

Hello, 

Any updates on this question? I'm also very interested in a solution, as I'm
trying to use Spark on EC2 but need Scala 2.11 support. The scripts in the
ec2 directory of the Spark distribution install and use Scala 2.10 by default,
and I can't see any obvious option to change to Scala 2.11.

Regards, 
Nuno



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Launching-EC2-instances-with-Spark-compiled-for-Scala-2-11-tp24979p26059.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Launching EC2 instances with Spark compiled for Scala 2.11

2016-01-25 Thread Nuno Santos
Hello, 

Any updates on this question? I'm also very interested in a solution, as I'm
trying to use Spark on EC2 but need Scala 2.11 support. The scripts in the
ec2 directory of the Spark distribution install and use Scala 2.10 by default,
and I can't see any obvious option to change to Scala 2.11.

Regards, 
Nuno



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Launching-EC2-instances-with-Spark-compiled-for-Scala-2-11-tp24979p26059.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



"impossible to get artifacts " error when using sbt to build 1.6.0 for scala 2.11

2016-01-07 Thread Lin Zhao
I tried to build 1.6.0 for YARN and Scala 2.11, but got an error. Any help is
appreciated.


[warn] Strategy 'first' was applied to 2 files

[info] Assembly up to date: 
/Users/lin/git/spark/network/yarn/target/scala-2.11/spark-network-yarn-1.6.0-hadoop2.7.1.jar

java.lang.IllegalStateException: impossible to get artifacts when data has not 
been loaded. IvyNode = org.slf4j#slf4j-log4j12;1.7.6

at org.apache.ivy.core.resolve.IvyNode.getArtifacts(IvyNode.java:809)

at org.apache.ivy.core.resolve.IvyNode.getSelectedArtifacts(IvyNode.java:786)

at 
org.apache.ivy.core.report.ResolveReport.setDependencies(ResolveReport.java:235)

at org.apache.ivy.core.resolve.ResolveEngine.resolve(ResolveEngine.java:235)

at org.apache.ivy.Ivy.resolve(Ivy.java:517)

at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:266)

at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175)

at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157)

at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)

at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)

at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128)

at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56)

at sbt.IvySbt$$anon$4.call(Ivy.scala:64)

at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)

at 
xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)

at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)

at xsbt.boot.Using$.withResource(Using.scala:10)

at xsbt.boot.Using$.apply(Using.scala:9)

at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)

at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)

at xsbt.boot.Locks$.apply0(Locks.scala:31)

at xsbt.boot.Locks$.apply(Locks.scala:28)

at sbt.IvySbt.withDefaultLogger(Ivy.scala:64)

at sbt.IvySbt.withIvy(Ivy.scala:123)

at sbt.IvySbt.withIvy(Ivy.scala:120)

at sbt.IvySbt$Module.withModule(Ivy.scala:151)

at sbt.IvyActions$.updateEither(IvyActions.scala:157)

at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1318)

Command I ran to build:

>git checkout v1.6.0
>./dev/change-scala-version.sh 2.11
>build/sbt -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.1 -Dscala-2.11 
>-Phadoop-provided assembly



Re: Scala 2.11 and Akka 2.4.0

2015-12-07 Thread RodrigoB
Hi Manas,

Thanks for the reply. I've done that. The problem lies with the Spark + Akka
2.4.0 build. It seems the maven-shade-plugin is altering some class files and
breaking the Akka runtime.

It seems the Spark build on Scala 2.11 using SBT is broken. I'm getting build
errors using sbt due to the issues found in the thread below from July of this
year:
https://mail-archives.apache.org/mod_mbox/spark-dev/201507.mbox/%3CCA+3qhFSJGmZToGmBU1=ivy7kr6eb7k8t6dpz+ibkstihryw...@mail.gmail.com%3E

So I went back to Maven and decided to risk building Spark on Akka 2.3.11
and force the Akka 2.4.0 jars onto the server's classpath. I find this a
temporary solution while I cannot get a proper runnable Akka 2.4.0 build.

If anyone has managed to get it working, please let me know.

tnks,
Rod



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Scala-2-11-and-Akka-2-4-0-tp25535p25618.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Scala 2.11 and Akka 2.4.0

2015-12-05 Thread manasdebashiskar
There are steps to build Spark using Scala 2.11 in the Spark docs.
The first step is ./dev/change-scala-version.sh 2.11, which changes the Scala
version to 2.11.

I have not tried compiling Spark with Akka 2.4.0.

..Manas



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Scala-2-11-and-Akka-2-4-0-tp25535p25587.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Iulian Dragoș
As I mentioned on the akka mailing list, in case others are following
this thread: the issue isn't with dependencies. It's a bug in the
maven-shade-plugin. It breaks classfiles when creating the assembly jar (it
seems to do some constant propagation). `sbt assembly` doesn't suffer from
this issue, probably because it uses another library for jar merging.

iulian

On Tue, Dec 1, 2015 at 7:21 PM, Boavida, Rodrigo <rodrigo.boav...@aspect.com
> wrote:

> HI Jacek,
>
> Yes I was told that as well but no one gave me release schedules, and I
> have the immediate need to have Spark Applications communicating with Akka
> clusters based on latest version. I'm aware there is an ongoing effort to
> change to the low level netty implementation but AFAIK it's not available
> yet.
>
> Any suggestions are very welcomed.
>
> Tnks,
> Rod
>
> -Original Message-
> From: Jacek Laskowski [mailto:ja...@japila.pl]
> Sent: 01 December 2015 18:17
> To: Boavida, Rodrigo <rodrigo.boav...@aspect.com>
> Cc: user <user@spark.apache.org>
> Subject: Re: Scala 2.11 and Akka 2.4.0
>
> On Tue, Dec 1, 2015 at 2:32 PM, RodrigoB <rodrigo.boav...@aspect.com>
> wrote:
>
> > I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0.
>
> Why? AFAIK Spark's leaving Akka's boat and joins Netty's.
>
> Jacek
> This email (including any attachments) is proprietary to Aspect Software,
> Inc. and may contain information that is confidential. If you have received
> this message in error, please do not read, copy or forward this message.
> Please notify the sender immediately, delete it from your system and
> destroy any copies. You may not further disclose or distribute this email
> or its attachments.
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Ted Yu
From the dependency tree, akka 2.4.0 was in effect.

Maybe check the classpath of master to see if there is older version of
akka.

Cheers


Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Jacek Laskowski
On Tue, Dec 1, 2015 at 2:32 PM, RodrigoB <rodrigo.boav...@aspect.com> wrote:

> I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0.

Why? AFAIK Spark's leaving Akka's boat and joins Netty's.

Jacek

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



RE: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Boavida, Rodrigo
Hi Jacek,

Yes, I was told that as well, but no one gave me release schedules, and I have
the immediate need to have Spark applications communicating with Akka clusters
based on the latest version. I'm aware there is an ongoing effort to change to
the low-level Netty implementation, but AFAIK it's not available yet.

Any suggestions are very welcome.

Tnks,
Rod

-Original Message-
From: Jacek Laskowski [mailto:ja...@japila.pl]
Sent: 01 December 2015 18:17
To: Boavida, Rodrigo <rodrigo.boav...@aspect.com>
Cc: user <user@spark.apache.org>
Subject: Re: Scala 2.11 and Akka 2.4.0

On Tue, Dec 1, 2015 at 2:32 PM, RodrigoB <rodrigo.boav...@aspect.com> wrote:

> I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0.

Why? AFAIK Spark's leaving Akka's boat and joins Netty's.

Jacek
This email (including any attachments) is proprietary to Aspect Software, Inc. 
and may contain information that is confidential. If you have received this 
message in error, please do not read, copy or forward this message. Please 
notify the sender immediately, delete it from your system and destroy any 
copies. You may not further disclose or distribute this email or its 
attachments.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Ted Yu
I don't see a 2.4.0 release under:
http://mvnrepository.com/artifact/com.typesafe.akka/akka-remote_2.10

Probably that was the cause of the 'Could not find artifact' error.

On Tue, Dec 1, 2015 at 7:03 AM, Boavida, Rodrigo <rodrigo.boav...@aspect.com
> wrote:

> Hi Ted,
>
> Thanks for getting back to me and for the suggestion.
>
> Running a 'mvn dependency:tree' I get the following:
>
> [ERROR] Failed to execute goal on project spark-core_2.11: Could not
> resolve dependencies for project
> org.apache.spark:spark-core_2.11:jar:1.5.2: The following artifacts could
> not be resolved: com.typesafe.akka:akka-remote_2.10:jar:2.4.0,
> com.typesafe.akka:akka-slf4j_2.10:jar:2.4.0,
> com.typesafe.akka:akka-testkit_2.10:jar:2.4.0: Could not find artifact
> com.typesafe.akka:akka-remote_2.10:jar:2.4.0 in central (
> https://repo1.maven.org/maven2) -> [Help 1]
>
> So it seems somehow it's still pulling some 2.10 dependencies. Do you
> think this could be the cause for the observed problem?
>
> tnks,
> Rod
>
> -Original Message-
> From: Ted Yu [mailto:yuzhih...@gmail.com]
> Sent: 01 December 2015 14:13
> To: Boavida, Rodrigo <rodrigo.boav...@aspect.com>
> Cc: user@spark.apache.org
> Subject: Re: Scala 2.11 and Akka 2.4.0
>
> Have you run 'mvn dependency:tree' and examined the output ?
>
> There should be some hint.
>
> Cheers
>
> > On Dec 1, 2015, at 5:32 AM, RodrigoB <rodrigo.boav...@aspect.com> wrote:
> >
> > Hi,
> >
> > I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0.
> > I've changed the main pom.xml files to corresponding akka version and
> > am getting the following exception when starting the master on
> standalone:
> >
> > Exception Details:
> >  Location:
> >akka/dispatch/Mailbox.processAllSystemMessages()V @152: getstatic
> >  Reason:
> >Type top (current frame, locals[9]) is not assignable to
> > 'akka/dispatch/sysmsg/SystemMessage' (stack map, locals[9])  Current
> > Frame:
> >bci: @131
> >flags: { }
> >locals: { 'akka/dispatch/Mailbox',
> > 'java/lang/InterruptedException',
> > 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox',
> 'java/lang/Throwable', 'java/lang/Throwable' }
> >stack: { integer }
> >  Stackmap Frame:
> >bci: @152
> >flags: { }
> >locals: { 'akka/dispatch/Mailbox',
> > 'java/lang/InterruptedException',
> > 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox',
> > 'java/lang/Throwable', 'java/lang/Throwable', top, top,
> 'akka/dispatch/sysmsg/SystemMessage' }
> >stack: { }
> >  Bytecode:
> >0x000: 014c 2ab2 0132 b601 35b6 0139 4db2 013e
> >0x010: 2cb6 0142 9900 522a b600 c69a 004b 2c4e
> >0x020: b201 3e2c b601 454d 2db9 0148 0100 2ab6
> >0x030: 0052 2db6 014b b801 0999 000e bb00 e759
> >0x040: 1301 4db7 010f 4cb2 013e 2cb6 0150 99ff
> >0x050: bf2a b600 c69a ffb8 2ab2 0132 b601 35b6
> >0x060: 0139 4da7 ffaa 2ab6 0052 b600 56b6 0154
> >0x070: b601 5a3a 04a7 0091 3a05 1905 3a06 1906
> >0x080: c100 e799 0015 1906 c000 e73a 0719 074c
> >0x090: b200 f63a 08a7 0071 b201 5f19 06b6 0163
> >0x0a0: 3a0a 190a b601 6899 0006 1905 bf19 0ab6
> >0x0b0: 016c c000 df3a 0b2a b600 52b6 0170 b601
> >0x0c0: 76bb 000f 5919 0b2a b600 52b6 017a b601
> >0x0d0: 80b6 0186 2ab6 018a bb01 8c59 b701 8e13
> >0x0e0: 0190 b601 9419 09b6 0194 1301 96b6 0194
> >0x0f0: 190b b601 99b6 0194 b601 9ab7 019d b601
> >0x100: a3b2 00f6 3a08 b201 3e2c b601 4299 0026
> >0x110: 2c3a 09b2 013e 2cb6 0145 4d19 09b9 0148
> >0x120: 0100 1904 2ab6 0052 b601 7a19 09b6 01a7
> >0x130: a7ff d62b c600 09b8 0109 572b bfb1  Exception Handler
> > Table:
> >bci [290, 307] => handler: 120
> >  Stackmap Table:
> >append_frame(@13,Object[#231],Object[#177])
> >append_frame(@71,Object[#177])
> >chop_frame(@102,1)
> >
> > full_frame(@120,{Object[#2],Object[#231],Object[#177],Top,Object[#2],O
> > bject[#177]},{Object[#223]})
> >
> >
> full_frame(@152,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#223],Object[#223],Top,Top,Object[#177]},{})
> >append_frame(@173,Object[#357])
> >
> > full_frame(@262,{Object[#2],Object[#231],Object[#177],Top,Object[#2]},{})
> >same_frame(@307)
> >same_frame(@317)
> >   at akka.dispatch.Mailboxes.(Mailboxes.scala:33)
> >at akka.actor.ActorSystemImpl.(ActorSystem.scala:63

Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Ted Yu
Please specify the following in your maven commands:
-Dscala-2.11

Cheers


RE: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Boavida, Rodrigo
Thanks, that worked! I'll let you know the results.

Tnks,
Rod

From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: 01 December 2015 15:36
To: Boavida, Rodrigo <rodrigo.boav...@aspect.com>
Cc: user@spark.apache.org
Subject: Re: Scala 2.11 and Akka 2.4.0

Please specify the following in your maven commands:
-Dscala-2.11

Cheers
This email (including any attachments) is proprietary to Aspect Software, Inc. 
and may contain information that is confidential. If you have received this 
message in error, please do not read, copy or forward this message. Please 
notify the sender immediately, delete it from your system and destroy any 
copies. You may not further disclose or distribute this email or its 
attachments.


RE: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Boavida, Rodrigo
Hi Ted,

Thanks for getting back to me and for the suggestion.

Running a 'mvn dependency:tree' I get the following:

[ERROR] Failed to execute goal on project spark-core_2.11: Could not resolve 
dependencies for project org.apache.spark:spark-core_2.11:jar:1.5.2: The 
following artifacts could not be resolved: 
com.typesafe.akka:akka-remote_2.10:jar:2.4.0, 
com.typesafe.akka:akka-slf4j_2.10:jar:2.4.0, 
com.typesafe.akka:akka-testkit_2.10:jar:2.4.0: Could not find artifact 
com.typesafe.akka:akka-remote_2.10:jar:2.4.0 in central 
(https://repo1.maven.org/maven2) -> [Help 1]

So it seems somehow it's still pulling some 2.10 dependencies. Do you think 
this could be the cause for the observed problem?

tnks,
Rod

-Original Message-
From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: 01 December 2015 14:13
To: Boavida, Rodrigo <rodrigo.boav...@aspect.com>
Cc: user@spark.apache.org
Subject: Re: Scala 2.11 and Akka 2.4.0

Have you run 'mvn dependency:tree' and examined the output ?

There should be some hint.

Cheers

> On Dec 1, 2015, at 5:32 AM, RodrigoB <rodrigo.boav...@aspect.com> wrote:
>
> Hi,
>
> I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0.
> I've changed the main pom.xml files to corresponding akka version and
> am getting the following exception when starting the master on standalone:
>
> Exception Details:
>  Location:
>akka/dispatch/Mailbox.processAllSystemMessages()V @152: getstatic
>  Reason:
>Type top (current frame, locals[9]) is not assignable to
> 'akka/dispatch/sysmsg/SystemMessage' (stack map, locals[9])  Current
> Frame:
>bci: @131
>flags: { }
>locals: { 'akka/dispatch/Mailbox',
> 'java/lang/InterruptedException',
> 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox', 
> 'java/lang/Throwable', 'java/lang/Throwable' }
>stack: { integer }
>  Stackmap Frame:
>bci: @152
>flags: { }
>locals: { 'akka/dispatch/Mailbox',
> 'java/lang/InterruptedException',
> 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox',
> 'java/lang/Throwable', 'java/lang/Throwable', top, top, 
> 'akka/dispatch/sysmsg/SystemMessage' }
>stack: { }
>  Bytecode:
>0x000: 014c 2ab2 0132 b601 35b6 0139 4db2 013e
>0x010: 2cb6 0142 9900 522a b600 c69a 004b 2c4e
>0x020: b201 3e2c b601 454d 2db9 0148 0100 2ab6
>0x030: 0052 2db6 014b b801 0999 000e bb00 e759
>0x040: 1301 4db7 010f 4cb2 013e 2cb6 0150 99ff
>0x050: bf2a b600 c69a ffb8 2ab2 0132 b601 35b6
>0x060: 0139 4da7 ffaa 2ab6 0052 b600 56b6 0154
>0x070: b601 5a3a 04a7 0091 3a05 1905 3a06 1906
>0x080: c100 e799 0015 1906 c000 e73a 0719 074c
>0x090: b200 f63a 08a7 0071 b201 5f19 06b6 0163
>0x0a0: 3a0a 190a b601 6899 0006 1905 bf19 0ab6
>0x0b0: 016c c000 df3a 0b2a b600 52b6 0170 b601
>0x0c0: 76bb 000f 5919 0b2a b600 52b6 017a b601
>0x0d0: 80b6 0186 2ab6 018a bb01 8c59 b701 8e13
>0x0e0: 0190 b601 9419 09b6 0194 1301 96b6 0194
>0x0f0: 190b b601 99b6 0194 b601 9ab7 019d b601
>0x100: a3b2 00f6 3a08 b201 3e2c b601 4299 0026
>0x110: 2c3a 09b2 013e 2cb6 0145 4d19 09b9 0148
>0x120: 0100 1904 2ab6 0052 b601 7a19 09b6 01a7
>0x130: a7ff d62b c600 09b8 0109 572b bfb1  Exception Handler
> Table:
>bci [290, 307] => handler: 120
>  Stackmap Table:
>append_frame(@13,Object[#231],Object[#177])
>append_frame(@71,Object[#177])
>chop_frame(@102,1)
>
> full_frame(@120,{Object[#2],Object[#231],Object[#177],Top,Object[#2],O
> bject[#177]},{Object[#223]})
>
> full_frame(@152,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#223],Object[#223],Top,Top,Object[#177]},{})
>append_frame(@173,Object[#357])
>
> full_frame(@262,{Object[#2],Object[#231],Object[#177],Top,Object[#2]},{})
>same_frame(@307)
>same_frame(@317)
>   at akka.dispatch.Mailboxes.(Mailboxes.scala:33)
>at akka.actor.ActorSystemImpl.(ActorSystem.scala:635)
>at akka.actor.ActorSystem$.apply(ActorSystem.scala:143)
>at akka.actor.ActorSystem$.apply(ActorSystem.scala:120)
>at
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>at
> org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>at
> org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
>at
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1920)
>at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166)
>at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1911)
>at
> org.apache.spark.util.AkkaUtils$.createActorSy

Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Ted Yu
Have you run 'mvn dependency:tree' and examined the output ?

There should be some hint. 

Cheers

> On Dec 1, 2015, at 5:32 AM, RodrigoB <rodrigo.boav...@aspect.com> wrote:
> 
> Hi,
> 
> I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0.
> I've changed the main pom.xml files to corresponding akka version and am
> getting the following exception when starting the master on standalone:
> 
> Exception Details:
>  Location:
>akka/dispatch/Mailbox.processAllSystemMessages()V @152: getstatic
>  Reason:
>Type top (current frame, locals[9]) is not assignable to
> 'akka/dispatch/sysmsg/SystemMessage' (stack map, locals[9])
>  Current Frame:
>bci: @131
>flags: { }
>locals: { 'akka/dispatch/Mailbox', 'java/lang/InterruptedException',
> 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox',
> 'java/lang/Throwable', 'java/lang/Throwable' }
>stack: { integer }
>  Stackmap Frame:
>bci: @152
>flags: { }
>locals: { 'akka/dispatch/Mailbox', 'java/lang/InterruptedException',
> 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox',
> 'java/lang/Throwable', 'java/lang/Throwable', top, top,
> 'akka/dispatch/sysmsg/SystemMessage' }
>stack: { }
>  Bytecode:
>0x000: 014c 2ab2 0132 b601 35b6 0139 4db2 013e
>0x010: 2cb6 0142 9900 522a b600 c69a 004b 2c4e
>0x020: b201 3e2c b601 454d 2db9 0148 0100 2ab6
>0x030: 0052 2db6 014b b801 0999 000e bb00 e759
>0x040: 1301 4db7 010f 4cb2 013e 2cb6 0150 99ff
>0x050: bf2a b600 c69a ffb8 2ab2 0132 b601 35b6
>0x060: 0139 4da7 ffaa 2ab6 0052 b600 56b6 0154
>0x070: b601 5a3a 04a7 0091 3a05 1905 3a06 1906
>0x080: c100 e799 0015 1906 c000 e73a 0719 074c
>0x090: b200 f63a 08a7 0071 b201 5f19 06b6 0163
>0x0a0: 3a0a 190a b601 6899 0006 1905 bf19 0ab6
>0x0b0: 016c c000 df3a 0b2a b600 52b6 0170 b601
>0x0c0: 76bb 000f 5919 0b2a b600 52b6 017a b601
>0x0d0: 80b6 0186 2ab6 018a bb01 8c59 b701 8e13
>0x0e0: 0190 b601 9419 09b6 0194 1301 96b6 0194
>0x0f0: 190b b601 99b6 0194 b601 9ab7 019d b601
>0x100: a3b2 00f6 3a08 b201 3e2c b601 4299 0026
>0x110: 2c3a 09b2 013e 2cb6 0145 4d19 09b9 0148
>0x120: 0100 1904 2ab6 0052 b601 7a19 09b6 01a7
>0x130: a7ff d62b c600 09b8 0109 572b bfb1
>  Exception Handler Table:
>bci [290, 307] => handler: 120
>  Stackmap Table:
>append_frame(@13,Object[#231],Object[#177])
>append_frame(@71,Object[#177])
>chop_frame(@102,1)
> 
> full_frame(@120,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#177]},{Object[#223]})
> 
> full_frame(@152,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#223],Object[#223],Top,Top,Object[#177]},{})
>append_frame(@173,Object[#357])
> 
> full_frame(@262,{Object[#2],Object[#231],Object[#177],Top,Object[#2]},{})
>same_frame(@307)
>same_frame(@317)
>   at akka.dispatch.Mailboxes.(Mailboxes.scala:33)
>at akka.actor.ActorSystemImpl.(ActorSystem.scala:635)
>at akka.actor.ActorSystem$.apply(ActorSystem.scala:143)
>at akka.actor.ActorSystem$.apply(ActorSystem.scala:120)
>at
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>at
> org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>at
> org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
>at
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1920)
>at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166)
>at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1911)
>at
> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
>at
> org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:253)
>at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:53)
>at
> org.apache.spark.deploy.master.Master$.startRpcEnvAndEndpoint(Master.scala:1074)
>at org.apache.spark.deploy.master.Master$.main(Master.scala:1058)
>at org.apache.spark.deploy.master.Master.main(Master.scala)
> 
> ---
> 
> Has anyone encountered this problem before? Seems to be related with a
> version mismatch at some level with the Akka mailbox. I would very much
> appreciate any comments.
> 
> tnks,
> Rod
> 
> 
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Scala-2-11-and-Akka-2-4-0-tp25535.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> --

Re: Error Compiling Spark 1.4.1 w/ Scala 2.11 & Hive Support

2015-10-26 Thread Bryan Jeffrey
All,

The error resolved to a bad version of jline being pulled from Maven.  The
jline version is defined as 'scala.version' -- the 2.11 version does not exist
in Maven.  Instead, the following should be used:

<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>jline</artifactId>
  <version>2.11.0-M3</version>
</dependency>

Regards,

Bryan Jeffrey

On Mon, Oct 26, 2015 at 9:01 AM, Bryan Jeffrey <bryan.jeff...@gmail.com>
wrote:

> All,
>
> I'm seeing the following error compiling Spark 1.4.1 w/ Scala 2.11 & Hive
> support. Any ideas?
>
> mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive
> -Phive-thriftserver package
>
> [INFO] Spark Project Parent POM .. SUCCESS [4.124s]
> [INFO] Spark Launcher Project  SUCCESS [9.001s]
> [INFO] Spark Project Networking .. SUCCESS [7.871s]
> [INFO] Spark Project Shuffle Streaming Service ... SUCCESS [3.904s]
> [INFO] Spark Project Unsafe .. SUCCESS [3.095s]
> [INFO] Spark Project Core  SUCCESS
> [24.768s]
> [INFO] Spark Project Bagel ... SUCCESS [2.029s]
> [INFO] Spark Project GraphX .. SUCCESS [4.057s]
> [INFO] Spark Project Streaming ... SUCCESS [9.774s]
> [INFO] Spark Project Catalyst  SUCCESS [6.804s]
> [INFO] Spark Project SQL . SUCCESS [9.606s]
> [INFO] Spark Project ML Library .. SUCCESS
> [10.872s]
> [INFO] Spark Project Tools ... SUCCESS [0.627s]
> [INFO] Spark Project Hive  SUCCESS
> [13.463s]
> [INFO] Spark Project REPL  SUCCESS [1.414s]
> [INFO] Spark Project YARN  SUCCESS [2.433s]
> [INFO] Spark Project Hive Thrift Server .. FAILURE [8.097s]
>
>
> [ERROR]
> /spark/spark-1.4.1.hive.bak/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:25:
> object ConsoleReader is not a member of package jline
> [ERROR] import jline.{ConsoleReader, History}
> [ERROR]^
> [WARNING] Class jline.Completor not found - continuing with a stub.
> [WARNING] Class jline.ConsoleReader not found - continuing with a stub.
> [ERROR]
> /spark/spark-1.4.1.hive.bak/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:171:
> not found: type ConsoleReader
> [ERROR] val reader = new ConsoleReader()
> [ERROR]  ^
> [ERROR] Class jline.Completor not found - continuing with a stub.
>


Re: Error Compiling Spark 1.4.1 w/ Scala 2.11 & Hive Support

2015-10-26 Thread Sean Owen
Did you switch the build to Scala 2.11 by running the script in dev/? It
won't work otherwise, but does work if you do. @Ted 2.11 was supported in
1.4, not just 1.5.

On Mon, Oct 26, 2015 at 2:13 PM, Bryan Jeffrey <bryan.jeff...@gmail.com>
wrote:

> All,
>
> The error resolved to a bad version of jline pulling from Maven.  The
> jline version is defined as 'scala.version' -- the 2.11 version does not
> exist in maven.  Instead the following should be used:
>
>  
> org.scala-lang
> jline
> 2.11.0-M3
>   
>
> Regards,
>
> Bryan Jeffrey
>
> On Mon, Oct 26, 2015 at 9:01 AM, Bryan Jeffrey <bryan.jeff...@gmail.com>
> wrote:
>
>> All,
>>
>> I'm seeing the following error compiling Spark 1.4.1 w/ Scala 2.11 & Hive
>> support. Any ideas?
>>
>> mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive
>> -Phive-thriftserver package
>>
>> [INFO] Spark Project Parent POM .. SUCCESS
>> [4.124s]
>> [INFO] Spark Launcher Project  SUCCESS
>> [9.001s]
>> [INFO] Spark Project Networking .. SUCCESS
>> [7.871s]
>> [INFO] Spark Project Shuffle Streaming Service ... SUCCESS
>> [3.904s]
>> [INFO] Spark Project Unsafe .. SUCCESS
>> [3.095s]
>> [INFO] Spark Project Core  SUCCESS
>> [24.768s]
>> [INFO] Spark Project Bagel ... SUCCESS
>> [2.029s]
>> [INFO] Spark Project GraphX .. SUCCESS
>> [4.057s]
>> [INFO] Spark Project Streaming ... SUCCESS
>> [9.774s]
>> [INFO] Spark Project Catalyst  SUCCESS
>> [6.804s]
>> [INFO] Spark Project SQL . SUCCESS
>> [9.606s]
>> [INFO] Spark Project ML Library .. SUCCESS
>> [10.872s]
>> [INFO] Spark Project Tools ... SUCCESS
>> [0.627s]
>> [INFO] Spark Project Hive  SUCCESS
>> [13.463s]
>> [INFO] Spark Project REPL  SUCCESS
>> [1.414s]
>> [INFO] Spark Project YARN  SUCCESS
>> [2.433s]
>> [INFO] Spark Project Hive Thrift Server .. FAILURE
>> [8.097s]
>>
>>
>> [ERROR]
>> /spark/spark-1.4.1.hive.bak/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:25:
>> object ConsoleReader is not a member of package jline
>> [ERROR] import jline.{ConsoleReader, History}
>> [ERROR]^
>> [WARNING] Class jline.Completor not found - continuing with a stub.
>> [WARNING] Class jline.ConsoleReader not found - continuing with a stub.
>> [ERROR]
>> /spark/spark-1.4.1.hive.bak/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:171:
>> not found: type ConsoleReader
>> [ERROR] val reader = new ConsoleReader()
>> [ERROR]  ^
>> [ERROR] Class jline.Completor not found - continuing with a stub.
>>
>
>


Error Compiling Spark 1.4.1 w/ Scala 2.11 & Hive Support

2015-10-26 Thread Bryan Jeffrey
All,

I'm seeing the following error compiling Spark 1.4.1 w/ Scala 2.11 & Hive
support. Any ideas?

mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive
-Phive-thriftserver package

[INFO] Spark Project Parent POM .. SUCCESS [4.124s]
[INFO] Spark Launcher Project  SUCCESS [9.001s]
[INFO] Spark Project Networking .. SUCCESS [7.871s]
[INFO] Spark Project Shuffle Streaming Service ... SUCCESS [3.904s]
[INFO] Spark Project Unsafe .. SUCCESS [3.095s]
[INFO] Spark Project Core  SUCCESS [24.768s]
[INFO] Spark Project Bagel ... SUCCESS [2.029s]
[INFO] Spark Project GraphX .. SUCCESS [4.057s]
[INFO] Spark Project Streaming ... SUCCESS [9.774s]
[INFO] Spark Project Catalyst  SUCCESS [6.804s]
[INFO] Spark Project SQL . SUCCESS [9.606s]
[INFO] Spark Project ML Library .. SUCCESS [10.872s]
[INFO] Spark Project Tools ... SUCCESS [0.627s]
[INFO] Spark Project Hive  SUCCESS [13.463s]
[INFO] Spark Project REPL  SUCCESS [1.414s]
[INFO] Spark Project YARN  SUCCESS [2.433s]
[INFO] Spark Project Hive Thrift Server .. FAILURE [8.097s]


[ERROR]
/spark/spark-1.4.1.hive.bak/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:25:
object ConsoleReader is not a member of package jline
[ERROR] import jline.{ConsoleReader, History}
[ERROR]^
[WARNING] Class jline.Completor not found - continuing with a stub.
[WARNING] Class jline.ConsoleReader not found - continuing with a stub.
[ERROR]
/spark/spark-1.4.1.hive.bak/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:171:
not found: type ConsoleReader
[ERROR] val reader = new ConsoleReader()
[ERROR]  ^
[ERROR] Class jline.Completor not found - continuing with a stub.


Re: Error Compiling Spark 1.4.1 w/ Scala 2.11 & Hive Support

2015-10-26 Thread Ted Yu
Scala 2.11 is supported in 1.5.1 release:

http://search.maven.org/#search%7Cga%7C1%7Ca%3A%22spark-parent_2.11%22

Can you upgrade ?

Cheers

On Mon, Oct 26, 2015 at 6:01 AM, Bryan Jeffrey <bryan.jeff...@gmail.com>
wrote:

> All,
>
> I'm seeing the following error compiling Spark 1.4.1 w/ Scala 2.11 & Hive
> support. Any ideas?
>
> mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive
> -Phive-thriftserver package
>
> [INFO] Spark Project Parent POM .. SUCCESS [4.124s]
> [INFO] Spark Launcher Project  SUCCESS [9.001s]
> [INFO] Spark Project Networking .. SUCCESS [7.871s]
> [INFO] Spark Project Shuffle Streaming Service ... SUCCESS [3.904s]
> [INFO] Spark Project Unsafe .. SUCCESS [3.095s]
> [INFO] Spark Project Core  SUCCESS
> [24.768s]
> [INFO] Spark Project Bagel ... SUCCESS [2.029s]
> [INFO] Spark Project GraphX .. SUCCESS [4.057s]
> [INFO] Spark Project Streaming ... SUCCESS [9.774s]
> [INFO] Spark Project Catalyst  SUCCESS [6.804s]
> [INFO] Spark Project SQL . SUCCESS [9.606s]
> [INFO] Spark Project ML Library .. SUCCESS
> [10.872s]
> [INFO] Spark Project Tools ... SUCCESS [0.627s]
> [INFO] Spark Project Hive  SUCCESS
> [13.463s]
> [INFO] Spark Project REPL  SUCCESS [1.414s]
> [INFO] Spark Project YARN  SUCCESS [2.433s]
> [INFO] Spark Project Hive Thrift Server .. FAILURE [8.097s]
>
>
> [ERROR]
> /spark/spark-1.4.1.hive.bak/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:25:
> object ConsoleReader is not a member of package jline
> [ERROR] import jline.{ConsoleReader, History}
> [ERROR]^
> [WARNING] Class jline.Completor not found - continuing with a stub.
> [WARNING] Class jline.ConsoleReader not found - continuing with a stub.
> [ERROR]
> /spark/spark-1.4.1.hive.bak/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:171:
> not found: type ConsoleReader
> [ERROR] val reader = new ConsoleReader()
> [ERROR]  ^
> [ERROR] Class jline.Completor not found - continuing with a stub.
>


Re: Building with SBT and Scala 2.11

2015-10-14 Thread Jakob Odersky
[Repost to mailing list]

Hey,
Sorry about the typo, I of course meant hadoop-2.6, not 2.11.
I suspect something bad happened with my Ivy cache, since when reverting
back to scala 2.10, I got a very strange IllegalStateException (something
something IvyNode, I can't remember the details).
Killing the cache made 2.10 work at least; I'll retry with 2.11

Thx for your help
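
For reference, killing the cache amounts to something like this (default Ivy
location under the home directory is an assumption; adjust if a custom ivy home
is configured):

# Either remove just the suspect organisation from the cache...
rm -rf ~/.ivy2/cache/org.apache.spark
# ...or blow away the whole Ivy cache and let sbt re-resolve everything.
rm -rf ~/.ivy2/cache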
On Oct 14, 2015 6:52 AM, "Ted Yu" <yuzhih...@gmail.com> wrote:

> Adrian:
> Likely you were using maven.
>
> Jakob's report was with sbt.
>
> Cheers
>
> On Tue, Oct 13, 2015 at 10:05 PM, Adrian Tanase <atan...@adobe.com> wrote:
>
>> Do you mean hadoop-2.4 or 2.6? not sure if this is the issue but I'm also
>> compiling the 1.5.1 version with scala 2.11 and hadoop 2.6 and it works.
>>
>> -adrian
>>
>> Sent from my iPhone
>>
>> On 14 Oct 2015, at 03:53, Jakob Odersky <joder...@gmail.com> wrote:
>>
>> I'm having trouble compiling Spark with SBT for Scala 2.11. The command I
>> use is:
>>
>> dev/change-version-to-2.11.sh
>> build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11
>>
>> followed by
>>
>> compile
>>
>> in the sbt shell.
>>
>> The error I get specifically is:
>>
>> spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308:
>> no valid targets for annotation on value conf - it is discarded unused. You
>> may specify targets with meta-annotations, e.g. @(transient @param)
>> [error] private[netty] class NettyRpcEndpointRef(@transient conf:
>> SparkConf)
>> [error]
>>
>> However I am also getting a large amount of deprecation warnings, making
>> me wonder if I am supplying some incompatible/unsupported options to sbt. I
>> am using Java 1.8 and the latest Spark master sources.
>> Does someone know if I am doing anything wrong or is the sbt build broken?
>>
>> thanks for your help,
>> --Jakob
>>
>>
>


Re: Building with SBT and Scala 2.11

2015-10-14 Thread Adrian Tanase
You are correct, of course. I gave up on sbt for Spark long ago; I never managed
to get it working, while mvn works great.
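
For reference, the kind of mvn build meant here is roughly the following (the
exact hadoop.version and profile set are assumptions, not a copy of my command):

# Script name as of the 1.5 branch; older branches ship dev/change-version-to-2.11.sh.
./dev/change-scala-version.sh 2.11
./build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Dscala-2.11 -DskipTests clean package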

Sent from my iPhone

On 14 Oct 2015, at 16:52, Ted Yu 
<yuzhih...@gmail.com<mailto:yuzhih...@gmail.com>> wrote:

Adrian:
Likely you were using maven.

Jakob's report was with sbt.

Cheers

On Tue, Oct 13, 2015 at 10:05 PM, Adrian Tanase 
<atan...@adobe.com<mailto:atan...@adobe.com>> wrote:
Do you mean hadoop-2.4 or 2.6? not sure if this is the issue but I'm also 
compiling the 1.5.1 version with scala 2.11 and hadoop 2.6 and it works.

-adrian

Sent from my iPhone

On 14 Oct 2015, at 03:53, Jakob Odersky 
<joder...@gmail.com<mailto:joder...@gmail.com>> wrote:

I'm having trouble compiling Spark with SBT for Scala 2.11. The command I use 
is:

dev/change-version-to-2.11.sh<http://change-version-to-2.11.sh>
build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11

followed by

compile

in the sbt shell.

The error I get specifically is:

spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308: no 
valid targets for annotation on value conf - it is discarded unused. You may 
specify targets with meta-annotations, e.g. @(transient @param)
[error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
[error]

However I am also getting a large amount of deprecation warnings, making me 
wonder if I am supplying some incompatible/unsupported options to sbt. I am 
using Java 1.8 and the latest Spark master sources.
Does someone know if I am doing anything wrong or is the sbt build broken?

thanks for your help,
--Jakob




Re: Building with SBT and Scala 2.11

2015-10-14 Thread Ted Yu
Adrian:
Likely you were using maven.

Jakob's report was with sbt.

Cheers

On Tue, Oct 13, 2015 at 10:05 PM, Adrian Tanase <atan...@adobe.com> wrote:

> Do you mean hadoop-2.4 or 2.6? not sure if this is the issue but I'm also
> compiling the 1.5.1 version with scala 2.11 and hadoop 2.6 and it works.
>
> -adrian
>
> Sent from my iPhone
>
> On 14 Oct 2015, at 03:53, Jakob Odersky <joder...@gmail.com> wrote:
>
> I'm having trouble compiling Spark with SBT for Scala 2.11. The command I
> use is:
>
> dev/change-version-to-2.11.sh
> build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11
>
> followed by
>
> compile
>
> in the sbt shell.
>
> The error I get specifically is:
>
> spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308:
> no valid targets for annotation on value conf - it is discarded unused. You
> may specify targets with meta-annotations, e.g. @(transient @param)
> [error] private[netty] class NettyRpcEndpointRef(@transient conf:
> SparkConf)
> [error]
>
> However I am also getting a large amount of deprecation warnings, making
> me wonder if I am supplying some incompatible/unsupported options to sbt. I
> am using Java 1.8 and the latest Spark master sources.
> Does someone know if I am doing anything wrong or is the sbt build broken?
>
> thanks for your help,
> --Jakob
>
>


Re: Building with SBT and Scala 2.11

2015-10-13 Thread Adrian Tanase
Do you mean hadoop-2.4 or 2.6? not sure if this is the issue but I'm also 
compiling the 1.5.1 version with scala 2.11 and hadoop 2.6 and it works.

-adrian

Sent from my iPhone

On 14 Oct 2015, at 03:53, Jakob Odersky 
<joder...@gmail.com<mailto:joder...@gmail.com>> wrote:

I'm having trouble compiling Spark with SBT for Scala 2.11. The command I use 
is:

dev/change-version-to-2.11.sh<http://change-version-to-2.11.sh>
build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11

followed by

compile

in the sbt shell.

The error I get specifically is:

spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308: no 
valid targets for annotation on value conf - it is discarded unused. You may 
specify targets with meta-annotations, e.g. @(transient @param)
[error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
[error]

However I am also getting a large amount of deprecation warnings, making me 
wonder if I am supplying some incompatible/unsupported options to sbt. I am 
using Java 1.8 and the latest Spark master sources.
Does someone know if I am doing anything wrong or is the sbt build broken?

thanks for your help,
--Jakob



Building with SBT and Scala 2.11

2015-10-13 Thread Jakob Odersky
I'm having trouble compiling Spark with SBT for Scala 2.11. The command I
use is:

dev/change-version-to-2.11.sh
build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11

followed by

compile

in the sbt shell.

The error I get specifically is:

spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308:
no valid targets for annotation on value conf - it is discarded unused. You
may specify targets with meta-annotations, e.g. @(transient @param)
[error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
[error]

However I am also getting a large amount of deprecation warnings, making me
wonder if I am supplying some incompatible/unsupported options to sbt. I am
using Java 1.8 and the latest Spark master sources.
Does someone know if I am doing anything wrong or is the sbt build broken?

thanks for your help,
--Jakob
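
A corrected sketch, per the follow-up in this thread (hadoop-2.6 was meant
rather than the non-existent hadoop-2.11 profile; the hadoop.version value is
an assumption):

# Switch the POMs to Scala 2.11, then run compile in sbt batch mode.
./dev/change-version-to-2.11.sh
./build/sbt -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Dscala-2.11 compile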


Re: Building with SBT and Scala 2.11

2015-10-13 Thread Ted Yu
See this thread: http://search-hadoop.com/m/q3RTtY7aX22B44dB

On Tue, Oct 13, 2015 at 5:53 PM, Jakob Odersky <joder...@gmail.com> wrote:

> I'm having trouble compiling Spark with SBT for Scala 2.11. The command I
> use is:
>
> dev/change-version-to-2.11.sh
> build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11
>
> followed by
>
> compile
>
> in the sbt shell.
>
> The error I get specifically is:
>
> spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308:
> no valid targets for annotation on value conf - it is discarded unused. You
> may specify targets with meta-annotations, e.g. @(transient @param)
> [error] private[netty] class NettyRpcEndpointRef(@transient conf:
> SparkConf)
> [error]
>
> However I am also getting a large amount of deprecation warnings, making
> me wonder if I am supplying some incompatible/unsupported options to sbt. I
> am using Java 1.8 and the latest Spark master sources.
> Does someone know if I am doing anything wrong or is the sbt build broken?
>
> thanks for your help,
> --Jakob
>
>


Launching EC2 instances with Spark compiled for Scala 2.11

2015-10-08 Thread Theodore Vasiloudis
Hello,

I was wondering if there is an easy way launch EC2 instances which have a
Spark built for Scala 2.11.

The only way I can think of is to prepare the sources for 2.11 as shown in
the Spark build instructions (
http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211),
upload the changed sources as a Github repo, and use the "--spark-git-repo"
option to specify the repo as the one to build from.

Is there a recommended way to launch EC2 instances if you need Scala 2.11
support?

Regards,
Theodore
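
A sketch of the spark-ec2 invocation that approach implies (key pair, fork URL,
and cluster name are placeholders, and this is untested):

# Launch a cluster and have spark-ec2 build Spark from a custom git repo
# that already has the Scala 2.11 changes applied.
./ec2/spark-ec2 --key-pair my-keypair --identity-file ~/.ssh/my-keypair.pem \
  --slaves 2 --spark-git-repo https://github.com/myuser/spark launch scala211-cluster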




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Launching-EC2-instances-with-Spark-compiled-for-Scala-2-11-tp24979.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Launching EC2 instances with Spark compiled for Scala 2.11

2015-10-08 Thread Aniket Bhatnagar
Is it possible for you to use EMR instead of EC2? If so, you may be able to
tweak EMR bootstrap scripts to install your custom spark build.

Thanks,
Aniket
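
A rough sketch of such a bootstrap action (bucket, tarball name, and install
path are all placeholders):

#!/bin/bash
# EMR bootstrap action: pull a custom Spark build (compiled for Scala 2.11)
# from S3 and unpack it on every node.
set -e
aws s3 cp s3://my-bucket/spark-1.5.1-bin-custom-scala2.11.tgz /tmp/spark.tgz
sudo mkdir -p /opt
sudo tar -xzf /tmp/spark.tgz -C /opt
sudo ln -sfn /opt/spark-1.5.1-bin-custom-scala2.11 /opt/spark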

On Thu, Oct 8, 2015 at 5:58 PM Theodore Vasiloudis <
theodoros.vasilou...@gmail.com> wrote:

> Hello,
>
> I was wondering if there is an easy way launch EC2 instances which have a
> Spark built for Scala 2.11.
>
> The only way I can think of is to prepare the sources for 2.11 as shown in
> the Spark build instructions (
> http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211),
> upload the changed sources as a Github repo, and use the "--spark-git-repo"
> option to specify the repo as the one to build from.
>
> Is there a recommended way to launch EC2 instances if you need Scala 2.11
> support?
>
> Regards,
> Theodore
>
> --
> View this message in context: Launching EC2 instances with Spark compiled
> for Scala 2.11
> <http://apache-spark-user-list.1001560.n3.nabble.com/Launching-EC2-instances-with-Spark-compiled-for-Scala-2-11-tp24979.html>
> Sent from the Apache Spark User List mailing list archive
> <http://apache-spark-user-list.1001560.n3.nabble.com/> at Nabble.com.
>


Re: Spark does not yet support its JDBC component for Scala 2.11.

2015-09-21 Thread Ted Yu
I think the document should be updated to reflect the integration of
SPARK-8013 

Cheers

On Mon, Sep 21, 2015 at 3:48 AM, Petr Novak  wrote:

> Nice, thanks.
>
> So the note in build instruction for 2.11 is obsolete? Or there are still
> some limitations?
>
>
> http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
>
> On Fri, Sep 11, 2015 at 2:19 PM, Petr Novak  wrote:
>
>> Nice, thanks.
>>
>> So the note in build instruction for 2.11 is obsolete? Or there are still
>> some limitations?
>>
>>
>> http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
>>
>> On Fri, Sep 11, 2015 at 2:09 PM, Ted Yu  wrote:
>>
>>> Have you looked at:
>>> https://issues.apache.org/jira/browse/SPARK-8013
>>>
>>>
>>>
>>> > On Sep 11, 2015, at 4:53 AM, Petr Novak  wrote:
>>> >
>>> > Does it still apply for 1.5.0?
>>> >
>>> > What actual limitation does it mean when I switch to 2.11? No JDBC
>>> Thriftserver? No JDBC DataSource? No JdbcRDD (which is already obsolete I
>>> believe)? Some more?
>>> >
>>> > What library is the blocker to upgrade JDBC component to 2.11?
>>> >
>>> > Is there any estimate when it could be available for 2.11?
>>> >
>>> > Many thanks,
>>> > Petr
>>>
>>
>>
>


Re: Spark does not yet support its JDBC component for Scala 2.11.

2015-09-21 Thread Petr Novak
Nice, thanks.

So the note in build instruction for 2.11 is obsolete? Or there are still
some limitations?

http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211

On Fri, Sep 11, 2015 at 2:19 PM, Petr Novak  wrote:

> Nice, thanks.
>
> So the note in build instruction for 2.11 is obsolete? Or there are still
> some limitations?
>
>
> http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
>
> On Fri, Sep 11, 2015 at 2:09 PM, Ted Yu  wrote:
>
>> Have you looked at:
>> https://issues.apache.org/jira/browse/SPARK-8013
>>
>>
>>
>> > On Sep 11, 2015, at 4:53 AM, Petr Novak  wrote:
>> >
>> > Does it still apply for 1.5.0?
>> >
>> > What actual limitation does it mean when I switch to 2.11? No JDBC
>> Thriftserver? No JDBC DataSource? No JdbcRDD (which is already obsolete I
>> believe)? Some more?
>> >
>> > What library is the blocker to upgrade JDBC component to 2.11?
>> >
>> > Is there any estimate when it could be available for 2.11?
>> >
>> > Many thanks,
>> > Petr
>>
>
>


Re: Spark does not yet support its JDBC component for Scala 2.11.

2015-09-11 Thread Ted Yu
Have you looked at:
https://issues.apache.org/jira/browse/SPARK-8013



> On Sep 11, 2015, at 4:53 AM, Petr Novak  wrote:
> 
> Does it still apply for 1.5.0?
> 
> What actual limitation does it mean when I switch to 2.11? No JDBC 
> Thriftserver? No JDBC DataSource? No JdbcRDD (which is already obsolete I 
> believe)? Some more?
> 
> What library is the blocker to upgrade JDBC component to 2.11?
> 
> Is there any estimate when it could be available for 2.11?
> 
> Many thanks,
> Petr

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Spark does not yet support its JDBC component for Scala 2.11.

2015-09-11 Thread Petr Novak
Does it still apply for 1.5.0?

What actual limitation does it mean when I switch to 2.11? No JDBC
Thriftserver? No JDBC DataSource? No JdbcRDD (which is already obsolete I
believe)? Some more?

What library is the blocker to upgrade JDBC component to 2.11?

Is there any estimate when it could be available for 2.11?

Many thanks,
Petr


Re: Issue with building Spark v1.4.1-rc4 with Scala 2.11

2015-08-26 Thread Ted Yu
Have you run dev/change-version-to-2.11.sh ?

Cheers
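
In other words, something along these lines, reusing the mvn invocation from
the original mail (whether a missing script run explains the stale _2.11 jar is
only a guess):

# Rewrite the POMs to the _2.11 artifact ids, then rebuild and install locally,
# so spark-core_2.11 is produced from source instead of being resolved from Maven.
./dev/change-version-to-2.11.sh
../apache-maven-3.3.3/bin/mvn -Dscala-2.11 -DskipTests clean install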

On Wed, Aug 26, 2015 at 7:07 AM, Felix Neutatz neut...@googlemail.com
wrote:

 Hi everybody,

 I tried to build Spark v1.4.1-rc4 with Scala 2.11:
 ../apache-maven-3.3.3/bin/mvn -Dscala-2.11 -DskipTests clean install

 Before running this, I deleted:
 ../.m2/repository/org/apache/spark
 ../.m2/repository/org/spark-project

 My changes to the code:
 I just changed line 174 of org.apache.spark.executor.Executor$TaskRunner
 to:
 logInfo(s"test Executor is trying to kill $taskName (TID $taskId)")

 Everything builds without an error, but I have an issue.

 When I look into the jar of spark-core_2.10, I can see the changed string
 in Executor$TaskRunner$$anonfun$kill$1.class. But when I look
 into spark-core_2.11 the corresponding string didn't change. It seems like
 it downloads the jar from maven.

 Do you know what I did wrong?

 I also tried to run mvn -Dscala-2.11 -DskipTests clean install on the
 current master and got the following error:

 [ERROR] Failed to execute goal
 org.apache.maven.plugins:maven-enforcer-plugin:1.4:enforce
 (enforce-versions) on project spark-parent_2.10: Some Enforcer rules have
 failed. Look above for specific messages explaining why the rule failed. -
 [Help 1]
 [ERROR]
 [ERROR] To see the full stack trace of the errors, re-run Maven with the
 -e switch.
 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
 [ERROR]
 [ERROR] For more information about the errors and possible solutions,
 please read the following articles:
 [ERROR] [Help 1]
 http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

 Thank you for your help.

 Best regards,
 Felix




Fwd: Issue with building Spark v1.4.1-rc4 with Scala 2.11

2015-08-26 Thread Felix Neutatz
Hi everybody,

I tried to build Spark v1.4.1-rc4 with Scala 2.11:
../apache-maven-3.3.3/bin/mvn -Dscala-2.11 -DskipTests clean install

Before running this, I deleted:
../.m2/repository/org/apache/spark
../.m2/repository/org/spark-project

My changes to the code:
I just changed line 174 of org.apache.spark.executor.Executor$TaskRunner
to:
logInfo(s"test Executor is trying to kill $taskName (TID $taskId)")

Everything builds without an error, but I have an issue.

When I look into the jar of spark-core_2.10, I can see the changed string
in Executor$TaskRunner$$anonfun$kill$1.class. But when I look
into spark-core_2.11 the corresponding string didn't change. It seems like
it downloads the jar from maven.

Do you know what I did wrong?

I also tried to run mvn -Dscala-2.11 -DskipTests clean install on the
current master and got the following error:

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-enforcer-plugin:1.4:enforce
(enforce-versions) on project spark-parent_2.10: Some Enforcer rules have
failed. Look above for specific messages explaining why the rule failed. -
[Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

Thank you for your help.

Best regards,
Felix


Re: spark and scala-2.11

2015-08-24 Thread Lanny Ripple
We're going to be upgrading from spark 1.0.2 and using hadoop-1.2.1 so need
to build by hand.  (Yes, I know. Use hadoop-2.x but standard resource
constraints apply.)  I want to build against scala-2.11 and publish to our
artifact repository but finding build/spark-2.10.4 and tracing down what
build/mvn was doing had me concerned that I was missing something.  I'll
hold the course and build it as instructed.

Thanks for the info, all.

PS - Since asked -- PATH=./build/apache-maven-3.2.5/bin:$PATH; build/mvn
-Phadoop-1 -Dhadoop.version=1.2.1 -Dscala-2.11 -DskipTests package

On Mon, Aug 24, 2015 at 2:49 PM, Jonathan Coveney jcove...@gmail.com
wrote:

 I've used the instructions and it worked fine.

 Can you post exactly what you're doing, and what it fails with? Or are you
 just trying to understand how it works?

 2015-08-24 15:48 GMT-04:00 Lanny Ripple la...@spotright.com:

 Hello,

 The instructions for building spark against scala-2.11 indicate using
 -Dspark-2.11.  When I look in the pom.xml I find a profile named
 'spark-2.11' but nothing that would indicate I should set a property.  The
 sbt build seems to need the -Dscala-2.11 property set.  Finally build/mvn
 does a simple grep of scala.version (which doesn't change after running dev/
 change-version-to-2.11.sh) so the build seems to be grabbing the 2.10.4
 scala library.

 Anyone know (from having done it and used it in production) if the build
 instructions for spark-1.4.1 against Scala-2.11 are correct?

 Thanks.
   -Lanny





Re: spark and scala-2.11

2015-08-24 Thread Jonathan Coveney
I've used the instructions and it worked fine.

Can you post exactly what you're doing, and what it fails with? Or are you
just trying to understand how it works?

2015-08-24 15:48 GMT-04:00 Lanny Ripple la...@spotright.com:

 Hello,

 The instructions for building spark against scala-2.11 indicate using
 -Dspark-2.11.  When I look in the pom.xml I find a profile named
 'spark-2.11' but nothing that would indicate I should set a property.  The
 sbt build seems to need the -Dscala-2.11 property set.  Finally build/mvn
 does a simple grep of scala.version (which doesn't change after running dev/
 change-version-to-2.11.sh) so the build seems to be grabbing the 2.10.4
 scala library.

 Anyone know (from having done it and used it in production) if the build
 instructions for spark-1.4.1 against Scala-2.11 are correct?

 Thanks.
   -Lanny



Re: spark and scala-2.11

2015-08-24 Thread Sean Owen
The property scala-2.11 triggers the profile scala-2.11 -- and
additionally disables the scala-2.10 profile, so that's the way to do
it. But yes, you also need to run the script beforehand to set up the
build for Scala 2.11 as well.
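
One quick way to verify which profiles actually end up active (the help plugin
is standard Maven; the property is the one under discussion):

# With -Dscala-2.11 set, the scala-2.11 profile should be listed as active
# and scala-2.10 should not.
./build/mvn help:active-profiles -Dscala-2.11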

On Mon, Aug 24, 2015 at 8:48 PM, Lanny Ripple la...@spotright.com wrote:
 Hello,

 The instructions for building spark against scala-2.11 indicate using
 -Dspark-2.11.  When I look in the pom.xml I find a profile named
 'spark-2.11' but nothing that would indicate I should set a property.  The
 sbt build seems to need the -Dscala-2.11 property set.  Finally build/mvn
 does a simple grep of scala.version (which doesn't change after running
 dev/change-version-to-2.11.sh) so the build seems to be grabbing the 2.10.4
 scala library.

 Anyone know (from having done it and used it in production) if the build
 instructions for spark-1.4.1 against Scala-2.11 are correct?

 Thanks.
   -Lanny

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



spark and scala-2.11

2015-08-24 Thread Lanny Ripple
Hello,

The instructions for building spark against scala-2.11 indicate using
-Dspark-2.11.  When I look in the pom.xml I find a profile named
'spark-2.11' but nothing that would indicate I should set a property.  The
sbt build seems to need the -Dscala-2.11 property set.  Finally build/mvn
does a simple grep of scala.version (which doesn't change after running dev/
change-version-to-2.11.sh) so the build seems to be grabbing the 2.10.4
scala library.

Anyone know (from having done it and used it in production) if the build
instructions for spark-1.4.1 against Scala-2.11 are correct?

Thanks.
  -Lanny


Re: Spark on scala 2.11 build fails due to incorrect jline dependency in REPL

2015-08-17 Thread Ted Yu
You were building against 1.4.x, right ?

In master branch, switch-to-scala-2.11.sh is gone. There is scala-2.11
profile.

FYI

On Sun, Aug 16, 2015 at 11:12 AM, Stephen Boesch java...@gmail.com wrote:


 I am building spark with the following options - most notably the
 **scala-2.11**:

  . dev/switch-to-scala-2.11.sh
 mvn -Phive -Pyarn -Phadoop-2.6 -Dhadoop2.6.2 -Pscala-2.11 -DskipTests
 -Dmaven.javadoc.skip=true clean package


 The build goes pretty far but fails in one of the minor modules *repl*:

 [INFO]
 
 [ERROR] Failed to execute goal on project spark-repl_2.11: Could not
 resolve dependencies
 for project org.apache.spark:spark-repl_2.11:jar:1.5.0-SNAPSHOT:
  Could not   find artifact org.scala-lang:jline:jar:2.11.7 in central
  (https://repo1.maven.org/maven2) - [Help 1]

 Upon investigation - from 2.11.5 and later the scala version of jline is
 no longer required: they use the default jline distribution.

 And in fact the repl only shows dependency on jline for the 2.10.4 scala
 version:

 <profile>
   <id>scala-2.10</id>
   <activation>
     <property><name>!scala-2.11</name></property>
   </activation>
   <properties>
     <scala.version>2.10.4</scala.version>
     <scala.binary.version>2.10</scala.binary.version>
     <jline.version>${scala.version}</jline.version>
     <jline.groupid>org.scala-lang</jline.groupid>
   </properties>
   <dependencyManagement>
     <dependencies>
       <dependency>
         <groupId>${jline.groupid}</groupId>
         <artifactId>jline</artifactId>
         <version>${jline.version}</version>
       </dependency>
     </dependencies>
   </dependencyManagement>
 </profile>

 So then it is not clear why this error is occurring. Pointers appreciated.





Re: Spark on scala 2.11 build fails due to incorrect jline dependency in REPL

2015-08-17 Thread Stephen Boesch
In 1.4 it is change-scala-version.sh 2.11

But the problem was that it is -Dscala-2.11, not a -P profile flag.  I misread the docs.
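
i.e. something like the following; note that -Dhadoop.version=2.6.2 is a
correction of the original -Dhadoop2.6.2, so treat it as an assumption:

./dev/change-scala-version.sh 2.11
mvn -Phive -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.2 -Dscala-2.11 \
    -DskipTests -Dmaven.javadoc.skip=true clean package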

2015-08-17 14:17 GMT-07:00 Ted Yu yuzhih...@gmail.com:

 You were building against 1.4.x, right ?

 In master branch, switch-to-scala-2.11.sh is gone. There is scala-2.11
 profile.

 FYI

 On Sun, Aug 16, 2015 at 11:12 AM, Stephen Boesch java...@gmail.com
 wrote:


 I am building spark with the following options - most notably the
 **scala-2.11**:

  . dev/switch-to-scala-2.11.sh
 mvn -Phive -Pyarn -Phadoop-2.6 -Dhadoop2.6.2 -Pscala-2.11 -DskipTests
 -Dmaven.javadoc.skip=true clean package


 The build goes pretty far but fails in one of the minor modules *repl*:

 [INFO]
 
 [ERROR] Failed to execute goal on project spark-repl_2.11: Could not
 resolve dependencies
 for project org.apache.spark:spark-repl_2.11:jar:1.5.0-SNAPSHOT:
  Could not   find artifact org.scala-lang:jline:jar:2.11.7 in central
  (https://repo1.maven.org/maven2) - [Help 1]

 Upon investigation - from 2.11.5 and later the scala version of jline is
 no longer required: they use the default jline distribution.

 And in fact the repl only shows dependency on jline for the 2.10.4 scala
 version:

 <profile>
   <id>scala-2.10</id>
   <activation>
     <property><name>!scala-2.11</name></property>
   </activation>
   <properties>
     <scala.version>2.10.4</scala.version>
     <scala.binary.version>2.10</scala.binary.version>
     <jline.version>${scala.version}</jline.version>
     <jline.groupid>org.scala-lang</jline.groupid>
   </properties>
   <dependencyManagement>
     <dependencies>
       <dependency>
         <groupId>${jline.groupid}</groupId>
         <artifactId>jline</artifactId>
         <version>${jline.version}</version>
       </dependency>
     </dependencies>
   </dependencyManagement>
 </profile>

 So then it is not clear why this error is occurring. Pointers appreciated.






Spark on scala 2.11 build fails due to incorrect jline dependency in REPL

2015-08-16 Thread Stephen Boesch
I am building spark with the following options - most notably the
**scala-2.11**:

 . dev/switch-to-scala-2.11.sh
mvn -Phive -Pyarn -Phadoop-2.6 -Dhadoop2.6.2 -Pscala-2.11 -DskipTests
-Dmaven.javadoc.skip=true clean package


The build goes pretty far but fails in one of the minor modules *repl*:

[INFO]

[ERROR] Failed to execute goal on project spark-repl_2.11: Could not
resolve dependencies
for project org.apache.spark:spark-repl_2.11:jar:1.5.0-SNAPSHOT:
 Could not   find artifact org.scala-lang:jline:jar:2.11.7 in central
 (https://repo1.maven.org/maven2) - [Help 1]

Upon investigation - from 2.11.5 and later the scala version of jline is no
longer required: they use the default jline distribution.

And in fact the repl only shows dependency on jline for the 2.10.4 scala
version:

<profile>
  <id>scala-2.10</id>
  <activation>
    <property><name>!scala-2.11</name></property>
  </activation>
  <properties>
    <scala.version>2.10.4</scala.version>
    <scala.binary.version>2.10</scala.binary.version>
    <jline.version>${scala.version}</jline.version>
    <jline.groupid>org.scala-lang</jline.groupid>
  </properties>
  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>${jline.groupid}</groupId>
        <artifactId>jline</artifactId>
        <version>${jline.version}</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
</profile>

So then it is not clear why this error is occurring. Pointers appreciated.


Re: master compile broken for scala 2.11

2015-07-14 Thread Josh Rosen
I've opened a PR to fix this; please take a look:
https://github.com/apache/spark/pull/7405
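
For anyone who wants to try it before it is merged, one way (standard GitHub
pull-request refs; the local branch name is arbitrary):

# Fetch the pull request into a local branch, switch the build to 2.11, rebuild.
git fetch https://github.com/apache/spark pull/7405/head:pr-7405
git checkout pr-7405
./dev/change-scala-version.sh 2.11
./build/mvn -Dscala-2.11 -DskipTests clean package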

On Tue, Jul 14, 2015 at 11:22 AM, Koert Kuipers ko...@tresata.com wrote:

 it works for scala 2.10, but for 2.11 i get:

 [ERROR]
 /home/koert/src/spark/sql/catalyst/src/main/java/org/apache/spark/sql/execution/UnsafeExternalRowSorter.java:135:
 error: anonymous org.apache.spark.sql.execution.UnsafeExternalRowSorter$1
 is not abstract and does not override abstract method
 <B>minBy(Function1<InternalRow,B>,Ordering<B>) in TraversableOnce
 [ERROR]   return new AbstractScalaRowIterator() {




Re: Roadmap for Spark with Kafka on Scala 2.11?

2015-06-04 Thread Tathagata Das
But compile scope is supposed to be added to the assembly.
https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#Dependency_Scope



On Thu, Jun 4, 2015 at 1:24 PM, algermissen1971 algermissen1...@icloud.com
wrote:

 Hi Iulian,

 On 26 May 2015, at 13:04, Iulian Dragoș iulian.dra...@typesafe.com
 wrote:

 
  On Tue, May 26, 2015 at 10:09 AM, algermissen1971 
 algermissen1...@icloud.com wrote:
  Hi,
 
  I am setting up a project that requires Kafka support and I wonder what
 the roadmap is for Scala 2.11 Support (including Kafka).
 
  Can we expect to see 2.11 support anytime soon?
 
  The upcoming 1.4 release (now at RC2) includes support for Kafka and
 Scala 2.11.6. It'd be great if you could give it a try. You can find the
 binaries (and staging repository including 2.11 artifacts) here:
 
   https://www.mail-archive.com/dev@spark.apache.org/msg09347.html
 

 Feedback after a couple of days:

 - I am using 1.4.0-rc4 now without problems
 - Not used Kafka support yet
 - I am using this with akka-2.3.11 and akka-http 1.0-RC3 (and
 sbt-assembly) and this has produced a dependency nightmare. I am even
 adding guava manually to the assembly because I just could not get
 sbt-assembly to not complain.

 I am far from a good understanding of sbt / maven internals, but it seems
 that the ‘compile’ scope set in the spark POM for a lot of dependencies is
 somehow not honored and the libs end up causing conflicts in sbt-assembly.

 (I am writing this to share experience, not to complain. Thanks for the
 great work!!)

 onward...

 Jan





  iulian
 
 
  Jan
  -
  To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
  For additional commands, e-mail: user-h...@spark.apache.org
 
 
 
 
  --
 
  --
  Iulian Dragos
 
  --
  Reactive Apps on the JVM
  www.typesafe.com
 


 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




Re: Roadmap for Spark with Kafka on Scala 2.11?

2015-06-04 Thread algermissen1971
Hi Iulian,

On 26 May 2015, at 13:04, Iulian Dragoș iulian.dra...@typesafe.com wrote:

 
 On Tue, May 26, 2015 at 10:09 AM, algermissen1971 
 algermissen1...@icloud.com wrote:
 Hi,
 
 I am setting up a project that requires Kafka support and I wonder what the 
 roadmap is for Scala 2.11 Support (including Kafka).
 
 Can we expect to see 2.11 support anytime soon?
 
 The upcoming 1.4 release (now at RC2) includes support for Kafka and Scala 
 2.11.6. It'd be great if you could give it a try. You can find the binaries 
 (and staging repository including 2.11 artifacts) here:
 
  https://www.mail-archive.com/dev@spark.apache.org/msg09347.html
 

Feedback after a couple of days:

- I am using 1.4.0-rc4 now without problems
- Not used Kafka support yet
- I am using this with akka-2.3.11 and akka-http 1.0-RC3 (and sbt-assembly) and 
this has produced a dependency nightmare. I am even adding guava manually to 
the assembly because I just could not get sbt-assembly to not complain.

I am far from a good understanding of sbt / maven internals, but it seems that 
the ‘compile’ scope set in the spark POM for a lot of dependencies is somehow 
not honored and the libs end up causing conflicts in sbt-assembly.

(I am writing this to share experience, not to complain. Thanks for the great 
work!!)

onward...

Jan





 iulian
  
 
 Jan
 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org
 
 
 
 
 -- 
 
 --
 Iulian Dragos
 
 --
 Reactive Apps on the JVM
 www.typesafe.com
 


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Roadmap for Spark with Kafka on Scala 2.11?

2015-05-26 Thread Iulian Dragoș
On Tue, May 26, 2015 at 10:09 AM, algermissen1971 
algermissen1...@icloud.com wrote:

 Hi,

 I am setting up a project that requires Kafka support and I wonder what
 the roadmap is for Scala 2.11 Support (including Kafka).

 Can we expect to see 2.11 support anytime soon?


The upcoming 1.4 release (now at RC2) includes support for Kafka and Scala
2.11.6. It'd be great if you could give it a try. You can find the binaries
(and staging repository including 2.11 artifacts) here:

 https://www.mail-archive.com/dev@spark.apache.org/msg09347.html

iulian



 Jan
 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


Re: spark-shell breaks for scala 2.11 (with yarn)?

2015-05-08 Thread Koert Kuipers
i searched the jiras but couldnt find any recent mention of this. let me
try with 1.4.0 branch and see if it goes away...

On Wed, May 6, 2015 at 3:05 PM, Koert Kuipers ko...@tresata.com wrote:

 hello all,
 i built spark 1.3.1 (for cdh 5.3 with yarn) twice: for scala 2.10 and
 scala 2.11. i am running on a secure cluster. the deployment configs are
 identical.

 i can launch jobs just fine on both the scala 2.10 and scala 2.11 versions.

 spark-shell works on the scala 2.10 version, but not on the scala 2.11
 version. this is what i get:

 $ /usr/local/lib/spark-1.3.1-cdh5.3-scala2.11/bin/spark-shell
 15/05/06 14:58:49 WARN NativeCodeLoader: Unable to load native-hadoop
 library for your platform... using builtin-java classes where applicable
 15/05/06 14:58:49 INFO SecurityManager: Changing view acls to: koert
 15/05/06 14:58:49 INFO SecurityManager: Changing modify acls to: koert
 15/05/06 14:58:49 INFO SecurityManager: SecurityManager: authentication
 disabled; ui acls disabled; users with view permissions: Set(koert); users
 with modify permissions: Set(koert)
 15/05/06 14:58:50 INFO HttpServer: Starting HTTP Server
 15/05/06 14:58:50 INFO Server: jetty-8.y.z-SNAPSHOT
 15/05/06 14:58:50 INFO AbstractConnector: Started
 SocketConnector@0.0.0.0:36413
 15/05/06 14:58:50 INFO Utils: Successfully started service 'HTTP server'
 on port 36413.
 Exception in thread main java.util.concurrent.TimeoutException: Futures
 timed out after [10 seconds]
 at
 scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
 at
 scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
 at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:95)
 at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:95)
 at
 scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
 at scala.concurrent.Await$.ready(package.scala:95)
 at
 scala.tools.nsc.interpreter.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:907)
 at
 scala.tools.nsc.interpreter.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:895)
 at
 scala.tools.nsc.interpreter.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:895)
 at
 scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:95)
 at scala.tools.nsc.interpreter.SparkILoop.process(SparkILoop.scala:895)
 at org.apache.spark.repl.Main$.main(Main.scala:46)
 at org.apache.spark.repl.Main.main(Main.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at
 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
 at
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
 at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)




Re: branch-1.4 scala 2.11

2015-05-07 Thread Iulian Dragoș
There's an open PR to fix it: https://github.com/apache/spark/pull/5966

On Thu, May 7, 2015 at 6:07 PM, Koert Kuipers ko...@tresata.com wrote:

 i am having no luck using the 1.4 branch with scala 2.11

 $ build/mvn -DskipTests -Pyarn -Dscala-2.11 -Pscala-2.11 clean package

 [error]
 /home/koert/src/opensource/spark/core/src/main/scala/org/apache/spark/rdd/RDDOperationScope.scala:78:
 in object RDDOperationScope, multiple overloaded alternatives of method
 withScope define default arguments.
 [error] private[spark] object RDDOperationScope {
 [error]




-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


branch-1.4 scala 2.11

2015-05-07 Thread Koert Kuipers
i am having no luck using the 1.4 branch with scala 2.11

$ build/mvn -DskipTests -Pyarn -Dscala-2.11 -Pscala-2.11 clean package

[error]
/home/koert/src/opensource/spark/core/src/main/scala/org/apache/spark/rdd/RDDOperationScope.scala:78:
in object RDDOperationScope, multiple overloaded alternatives of method
withScope define default arguments.
[error] private[spark] object RDDOperationScope {
[error]


spark-shell breaks for scala 2.11 (with yarn)?

2015-05-06 Thread Koert Kuipers
hello all,
i built spark 1.3.1 (for cdh 5.3 with yarn) twice: for scala 2.10 and scala
2.11. i am running on a secure cluster. the deployment configs are
identical.

i can launch jobs just fine on both the scala 2.10 and scala 2.11 versions.

spark-shell works on the scala 2.10 version, but not on the scala 2.11
version. this is what i get:

$ /usr/local/lib/spark-1.3.1-cdh5.3-scala2.11/bin/spark-shell
15/05/06 14:58:49 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
15/05/06 14:58:49 INFO SecurityManager: Changing view acls to: koert
15/05/06 14:58:49 INFO SecurityManager: Changing modify acls to: koert
15/05/06 14:58:49 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(koert); users
with modify permissions: Set(koert)
15/05/06 14:58:50 INFO HttpServer: Starting HTTP Server
15/05/06 14:58:50 INFO Server: jetty-8.y.z-SNAPSHOT
15/05/06 14:58:50 INFO AbstractConnector: Started
SocketConnector@0.0.0.0:36413
15/05/06 14:58:50 INFO Utils: Successfully started service 'HTTP server' on
port 36413.
Exception in thread main java.util.concurrent.TimeoutException: Futures
timed out after [10 seconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:95)
at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:95)
at
scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.ready(package.scala:95)
at
scala.tools.nsc.interpreter.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:907)
at
scala.tools.nsc.interpreter.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:895)
at
scala.tools.nsc.interpreter.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:895)
at
scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:95)
at scala.tools.nsc.interpreter.SparkILoop.process(SparkILoop.scala:895)
at org.apache.spark.repl.Main$.main(Main.scala:46)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Michael Allman
FWIW, this is an essential feature to our use of Spark, and I'm surprised it's not advertised clearly as a limitation in the documentation. All I've found about running Spark 1.3 on 2.11 is here:

http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211

Also, I'm experiencing some serious stability problems simply trying to run the Spark 1.3 Scala 2.11 REPL. Most of the time it fails to load and spews a torrent of compiler assertion failures, etc. See attached.

spark@dp-cluster-master-node-001:~/spark/bin$ spark-shell
Spark Command: java -cp 
/opt/spark/conf:/opt/spark/lib/spark-assembly-1.3.2-SNAPSHOT-hadoop2.5.0-cdh5.3.3.jar:/etc/hadoop/conf:/opt/spark/lib/jline-2.12.jar
 -Dscala.usejavacp=true -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit 
--class org.apache.spark.repl.Main spark-shell


Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.1
      /_/
 
Using Scala version 2.11.2 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_40)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
Exception in thread main java.lang.AssertionError: assertion failed: parser: 
(source: String, options: Map[String,String])org.apache.spark.sql.DataFrame, 
tailcalls: (source: String, options: 
scala.collection.immutable.Map[String,String])org.apache.spark.sql.DataFrame, 
tailcalls: (source: String, options: 
scala.collection.immutable.Map)org.apache.spark.sql.DataFrame
at scala.reflect.internal.Symbols$TypeHistory.init(Symbols.scala:3601)
at scala.reflect.internal.Symbols$Symbol.rawInfo(Symbols.scala:1521)
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1439)
at 
scala.tools.nsc.transform.SpecializeTypes$$anonfun$23$$anonfun$apply$20.apply(SpecializeTypes.scala:775)
at 
scala.tools.nsc.transform.SpecializeTypes$$anonfun$23$$anonfun$apply$20.apply(SpecializeTypes.scala:768)
at scala.collection.immutable.List.flatMap(List.scala:327)
at 
scala.tools.nsc.transform.SpecializeTypes$$anonfun$23.apply(SpecializeTypes.scala:768)
at 
scala.tools.nsc.transform.SpecializeTypes$$anonfun$23.apply(SpecializeTypes.scala:766)
at scala.collection.immutable.List.flatMap(List.scala:327)
at 
scala.tools.nsc.transform.SpecializeTypes.specializeClass(SpecializeTypes.scala:766)
at 
scala.tools.nsc.transform.SpecializeTypes.transformInfo(SpecializeTypes.scala:1187)
at 
scala.tools.nsc.transform.InfoTransform$Phase$$anon$1.transform(InfoTransform.scala:38)
at scala.reflect.internal.Symbols$Symbol.rawInfo(Symbols.scala:1519)
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1439)
at scala.reflect.internal.Symbols$Symbol.isDerivedValueClass(Symbols.scala:775)
at scala.reflect.internal.transform.Erasure$ErasureMap.apply(Erasure.scala:131)
at scala.reflect.internal.transform.Erasure$ErasureMap.apply(Erasure.scala:144)
at 
scala.reflect.internal.transform.Erasure$class.specialErasure(Erasure.scala:209)
at scala.tools.nsc.transform.Erasure.specialErasure(Erasure.scala:15)
at 
scala.reflect.internal.transform.Erasure$class.transformInfo(Erasure.scala:364)
at scala.tools.nsc.transform.Erasure.transformInfo(Erasure.scala:348)
at 
scala.tools.nsc.transform.InfoTransform$Phase$$anon$1.transform(InfoTransform.scala:38)
at scala.reflect.internal.Symbols$Symbol.rawInfo(Symbols.scala:1519)
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1439)
at 
scala.tools.nsc.transform.Erasure$ErasureTransformer$$anonfun$checkNoDeclaredDoubleDefs$1$$anonfun$apply$mcV$sp$2.apply(Erasure.scala:753)
at 
scala.tools.nsc.transform.Erasure$ErasureTransformer$$anonfun$checkNoDeclaredDoubleDefs$1$$anonfun$apply$mcV$sp$2.apply(Erasure.scala:753)
at scala.reflect.internal.Scopes$Scope.foreach(Scopes.scala:373)
at 
scala.tools.nsc.transform.Erasure$ErasureTransformer$$anonfun$checkNoDeclaredDoubleDefs$1.apply(Erasure.scala:753)
at 
scala.tools.nsc.transform.Erasure$ErasureTransformer$$anonfun$checkNoDeclaredDoubleDefs$1.apply(Erasure.scala:753)
at scala.reflect.internal.SymbolTable.enteringPhase(SymbolTable.scala:235)
at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
at 
scala.tools.nsc.transform.Erasure$ErasureTransformer.checkNoDeclaredDoubleDefs(Erasure.scala:753)
at 
scala.tools.nsc.transform.Erasure$ErasureTransformer.scala$tools$nsc$transform$Erasure$ErasureTransformer$$checkNoDoubleDefs(Erasure.scala:780)
at 
scala.tools.nsc.transform.Erasure$ErasureTransformer$$anon$1.preErase(Erasure.scala:1074)
at 
scala.tools.nsc.transform.Erasure$ErasureTransformer$$anon$1.transform(Erasure.scala:1109)
at 
scala.tools.nsc.transform.Erasure$ErasureTransformer$$anon$1.transform(Erasure.scala:841)
at scala.reflect.api.Trees$Transformer.transformTemplate(Trees.scala:2563)
at scala.reflect.internal.Trees$$anonfun$itransform$4.apply(Trees.scala:1401)
at scala.reflect.internal.Trees$$anonfun$itransform$4.apply(Trees.scala:1400

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Sean Owen
Doesn't this reduce to Scala isn't compatible with itself across
maintenance releases? Meaning, if this were fixed then Scala
2.11.{x  6} would have similar failures. It's not not-ready; it's
just not the Scala 2.11.6 REPL. Still, sure I'd favor breaking the
unofficial support to at least make the latest Scala 2.11 the unbroken
one.

On Fri, Apr 17, 2015 at 7:58 AM, Michael Allman mich...@videoamp.com wrote:
 FWIW, this is an essential feature to our use of Spark, and I'm surprised
 it's not advertised clearly as a limitation in the documentation. All I've
 found about running Spark 1.3 on 2.11 is here:

 http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211

 Also, I'm experiencing some serious stability problems simply trying to run
 the Spark 1.3 Scala 2.11 REPL. Most of the time it fails to load and spews a
 torrent of compiler assertion failures, etc. See attached.



 Unfortunately, it appears the Spark 1.3 Scala 2.11 REPL is simply not ready
 for production use. I was going to file a bug, but it seems clear that the
 current implementation is going to need to be forward-ported to Scala 2.11.6
 anyway. We already have an issue for that:

 https://issues.apache.org/jira/browse/SPARK-6155

 Michael


 On Apr 9, 2015, at 10:29 PM, Prashant Sharma scrapco...@gmail.com wrote:

 You will have to go to this commit ID
 191d7cf2a655d032f160b9fa181730364681d0e7 in Apache spark. [1] Once you are
 at that commit, you need to review the changes done to the repl code and
 look for the relevant occurrences of the same code in scala 2.11 repl source
 and somehow make it all work.


 Thanks,





 1. http://githowto.com/getting_old_versions

 Prashant Sharma



 On Thu, Apr 9, 2015 at 4:40 PM, Alex Nakos ana...@gmail.com wrote:

 Ok, what do i need to do in order to migrate the patch?

 Thanks
 Alex

 On Thu, Apr 9, 2015 at 11:54 AM, Prashant Sharma scrapco...@gmail.com
 wrote:

 This is the jira I referred to
 https://issues.apache.org/jira/browse/SPARK-3256. Another reason for not
 working on it is evaluating priority between upgrading to scala 2.11.5(it is
 non trivial I suppose because repl has changed a bit) or migrating that
 patch is much simpler.

 Prashant Sharma



 On Thu, Apr 9, 2015 at 4:16 PM, Alex Nakos ana...@gmail.com wrote:

 Hi-

 Was this the JIRA issue?
 https://issues.apache.org/jira/browse/SPARK-2988

 Any help in getting this working would be much appreciated!

 Thanks
 Alex

 On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma scrapco...@gmail.com
 wrote:

 You are right this needs to be done. I can work on it soon, I was not
 sure if there is any one even using scala 2.11 spark repl. Actually there 
 is
 a patch in scala 2.10 shell to support adding jars (Lost the JIRA ID), 
 which
 has to be ported for scala 2.11 too. If however, you(or anyone else) are
 planning to work, I can help you ?

 Prashant Sharma



 On Thu, Apr 9, 2015 at 3:08 PM, anakos ana...@gmail.com wrote:

 Hi-

 I am having difficulty getting the 1.3.0 Spark shell to find an
 external
 jar.  I have build Spark locally for Scala 2.11 and I am starting the
 REPL
 as follows:

 bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar

 I see the following line in the console output:

 15/04/09 09:52:15 INFO spark.SparkContext: Added JAR

 file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar
 at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar
 with
 timestamp 1428569535904

 but when i try to import anything from this jar, it's simply not
 available.
 When I try to add the jar manually using the command

 :cp /path/to/jar

 the classes in the jar are still unavailable. I understand that 2.11
 is not
 officially supported, but has anyone been able to get an external jar
 loaded
 in the 1.3.0 release?  Is this a known issue? I have tried searching
 around
 for answers but the only thing I've found that may be related is this:

 https://issues.apache.org/jira/browse/SPARK-3257

 Any/all help is much appreciated.
 Thanks
 Alex



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/External-JARs-not-loading-Spark-Shell-Scala-2-11-tp22434.html
 Sent from the Apache Spark User List mailing list archive at
 Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org









-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Michael Allman
Hmmm... I don't follow. The 2.11.x series is supposed to be binary compatible 
against user code. Anyway, I was building Spark against 2.11.2 and still saw 
the problems with the REPL. I've created a bug report:

https://issues.apache.org/jira/browse/SPARK-6989 
https://issues.apache.org/jira/browse/SPARK-6989

I hope this helps.

Cheers,

Michael

 On Apr 17, 2015, at 1:41 AM, Sean Owen so...@cloudera.com wrote:
 
 Doesn't this reduce to Scala isn't compatible with itself across
 maintenance releases? Meaning, if this were fixed then Scala
 2.11.{x  6} would have similar failures. It's not not-ready; it's
 just not the Scala 2.11.6 REPL. Still, sure I'd favor breaking the
 unofficial support to at least make the latest Scala 2.11 the unbroken
 one.
 
 On Fri, Apr 17, 2015 at 7:58 AM, Michael Allman mich...@videoamp.com wrote:
 FWIW, this is an essential feature to our use of Spark, and I'm surprised
 it's not advertised clearly as a limitation in the documentation. All I've
 found about running Spark 1.3 on 2.11 is here:
 
 http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
 
 Also, I'm experiencing some serious stability problems simply trying to run
 the Spark 1.3 Scala 2.11 REPL. Most of the time it fails to load and spews a
 torrent of compiler assertion failures, etc. See attached.
 
 
 
 Unfortunately, it appears the Spark 1.3 Scala 2.11 REPL is simply not ready
 for production use. I was going to file a bug, but it seems clear that the
 current implementation is going to need to be forward-ported to Scala 2.11.6
 anyway. We already have an issue for that:
 
 https://issues.apache.org/jira/browse/SPARK-6155
 
 Michael
 
 
 On Apr 9, 2015, at 10:29 PM, Prashant Sharma scrapco...@gmail.com wrote:
 
 You will have to go to this commit ID
 191d7cf2a655d032f160b9fa181730364681d0e7 in Apache spark. [1] Once you are
 at that commit, you need to review the changes done to the repl code and
 look for the relevant occurrences of the same code in scala 2.11 repl source
 and somehow make it all work.
 
 
 Thanks,
 
 
 
 
 
 1. http://githowto.com/getting_old_versions
 
 Prashant Sharma
 
 
 
 On Thu, Apr 9, 2015 at 4:40 PM, Alex Nakos ana...@gmail.com wrote:
 
 Ok, what do i need to do in order to migrate the patch?
 
 Thanks
 Alex
 
 On Thu, Apr 9, 2015 at 11:54 AM, Prashant Sharma scrapco...@gmail.com
 wrote:
 
 This is the jira I referred to
 https://issues.apache.org/jira/browse/SPARK-3256. Another reason for not
 working on it is evaluating priority between upgrading to scala 2.11.5(it 
 is
 non trivial I suppose because repl has changed a bit) or migrating that
 patch is much simpler.
 
 Prashant Sharma
 
 
 
 On Thu, Apr 9, 2015 at 4:16 PM, Alex Nakos ana...@gmail.com wrote:
 
 Hi-
 
 Was this the JIRA issue?
 https://issues.apache.org/jira/browse/SPARK-2988
 
 Any help in getting this working would be much appreciated!
 
 Thanks
 Alex
 
 On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma scrapco...@gmail.com
 wrote:
 
 You are right this needs to be done. I can work on it soon, I was not
 sure if there is any one even using scala 2.11 spark repl. Actually 
 there is
 a patch in scala 2.10 shell to support adding jars (Lost the JIRA ID), 
 which
 has to be ported for scala 2.11 too. If however, you(or anyone else) are
 planning to work, I can help you ?
 
 Prashant Sharma
 
 
 
 On Thu, Apr 9, 2015 at 3:08 PM, anakos ana...@gmail.com wrote:
 
 Hi-
 
 I am having difficulty getting the 1.3.0 Spark shell to find an
 external
 jar.  I have build Spark locally for Scala 2.11 and I am starting the
 REPL
 as follows:
 
 bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar
 
 I see the following line in the console output:
 
 15/04/09 09:52:15 INFO spark.SparkContext: Added JAR
 
 file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar
 at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar
 with
 timestamp 1428569535904
 
 but when i try to import anything from this jar, it's simply not
 available.
 When I try to add the jar manually using the command
 
 :cp /path/to/jar
 
 the classes in the jar are still unavailable. I understand that 2.11
 is not
 officially supported, but has anyone been able to get an external jar
 loaded
 in the 1.3.0 release?  Is this a known issue? I have tried searching
 around
 for answers but the only thing I've found that may be related is this:
 
 https://issues.apache.org/jira/browse/SPARK-3257
 
 Any/all help is much appreciated.
 Thanks
 Alex
 
 
 
 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/External-JARs-not-loading-Spark-Shell-Scala-2-11-tp22434.html
 Sent from the Apache Spark User List mailing list archive at
 Nabble.com.
 
 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org
 
 
 
 
 
 
 
 



Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Sean Owen
You are running on 2.11.6, right? of course, it seems like that should
all work, but it doesn't work for you. My point is that the shell you
are saying doesn't work is Scala's 2.11.2 shell -- with some light
modification.

It's possible that the delta is the problem. I can't entirely make out
whether the errors are Spark-specific; they involve Spark classes in
some cases but they're assertion errors from Scala libraries.

I don't know if this shell is supposed to work even across maintenance
releases as-is, though that would be very nice. It's not an API for
Scala.

A good test of whether this idea has any merit would be to run with
Scala 2.11.2. I'll copy this to JIRA for continuation.

On Fri, Apr 17, 2015 at 10:31 PM, Michael Allman mich...@videoamp.com wrote:
 H... I don't follow. The 2.11.x series is supposed to be binary
 compatible against user code. Anyway, I was building Spark against 2.11.2
 and still saw the problems with the REPL. I've created a bug report:

 https://issues.apache.org/jira/browse/SPARK-6989

 I hope this helps.

 Cheers,

 Michael

 On Apr 17, 2015, at 1:41 AM, Sean Owen so...@cloudera.com wrote:

 Doesn't this reduce to "Scala isn't compatible with itself across
 maintenance releases"? Meaning, if this were fixed then Scala
 2.11.{x < 6} would have similar failures. It's not not-ready; it's
 just not the Scala 2.11.6 REPL. Still, sure I'd favor breaking the
 unofficial support to at least make the latest Scala 2.11 the unbroken
 one.

 On Fri, Apr 17, 2015 at 7:58 AM, Michael Allman mich...@videoamp.com
 wrote:

 FWIW, this is an essential feature to our use of Spark, and I'm surprised
 it's not advertised clearly as a limitation in the documentation. All I've
 found about running Spark 1.3 on 2.11 is here:

 http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211

 Also, I'm experiencing some serious stability problems simply trying to run
 the Spark 1.3 Scala 2.11 REPL. Most of the time it fails to load and spews a
 torrent of compiler assertion failures, etc. See attached.



 Unfortunately, it appears the Spark 1.3 Scala 2.11 REPL is simply not ready
 for production use. I was going to file a bug, but it seems clear that the
 current implementation is going to need to be forward-ported to Scala 2.11.6
 anyway. We already have an issue for that:

 https://issues.apache.org/jira/browse/SPARK-6155

 Michael


 On Apr 9, 2015, at 10:29 PM, Prashant Sharma scrapco...@gmail.com wrote:

 You will have to go to this commit ID
 191d7cf2a655d032f160b9fa181730364681d0e7 in Apache spark. [1] Once you are
 at that commit, you need to review the changes done to the repl code and
 look for the relevant occurrences of the same code in scala 2.11 repl source
 and somehow make it all work.


 Thanks,





 1. http://githowto.com/getting_old_versions

 Prashant Sharma



 On Thu, Apr 9, 2015 at 4:40 PM, Alex Nakos ana...@gmail.com wrote:


 Ok, what do i need to do in order to migrate the patch?

 Thanks
 Alex

 On Thu, Apr 9, 2015 at 11:54 AM, Prashant Sharma scrapco...@gmail.com
 wrote:


 This is the jira I referred to
 https://issues.apache.org/jira/browse/SPARK-3256. Another reason for not
 working on it is evaluating priority between upgrading to scala 2.11.5(it is
 non trivial I suppose because repl has changed a bit) or migrating that
 patch is much simpler.

 Prashant Sharma



 On Thu, Apr 9, 2015 at 4:16 PM, Alex Nakos ana...@gmail.com wrote:


 Hi-

 Was this the JIRA issue?
 https://issues.apache.org/jira/browse/SPARK-2988

 Any help in getting this working would be much appreciated!

 Thanks
 Alex

 On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma scrapco...@gmail.com
 wrote:


 You are right this needs to be done. I can work on it soon, I was not
 sure if there is any one even using scala 2.11 spark repl. Actually there is
 a patch in scala 2.10 shell to support adding jars (Lost the JIRA ID), which
 has to be ported for scala 2.11 too. If however, you(or anyone else) are
 planning to work, I can help you ?

 Prashant Sharma



 On Thu, Apr 9, 2015 at 3:08 PM, anakos ana...@gmail.com wrote:


 Hi-

 I am having difficulty getting the 1.3.0 Spark shell to find an
 external
 jar.  I have build Spark locally for Scala 2.11 and I am starting the
 REPL
 as follows:

 bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar

 I see the following line in the console output:

 15/04/09 09:52:15 INFO spark.SparkContext: Added JAR

 file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar
 at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar
 with
 timestamp 1428569535904

 but when i try to import anything from this jar, it's simply not
 available.
 When I try to add the jar manually using the command

 :cp /path/to/jar

 the classes in the jar are still unavailable. I understand that 2.11
 is not
 officially supported, but has anyone been able to get an external jar
 loaded
 in the 1.3.0 release

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Michael Allman
I actually just saw your comment on SPARK-6989 before this message. So I'll 
copy to the mailing list:

I'm not sure I understand what you mean about running on 2.11.6. I'm just 
running the spark-shell command. It in turn is running


  java -cp 
/opt/spark/conf:/opt/spark/lib/spark-assembly-1.3.2-SNAPSHOT-hadoop2.5.0-cdh5.3.3.jar:/etc/hadoop/conf:/opt/spark/lib/jline-2.12.jar
 \
-Dscala.usejavacp=true -Xms512m -Xmx512m 
org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main 
spark-shell


I built Spark with the included build/mvn script. As far as I can tell, the 
only reference to a specific version of Scala is in the top-level pom file, and 
it says 2.11.2.
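
For what it's worth, a quick way to see which Scala standard library the shell is
actually running on is to ask it from inside spark-shell (nothing Spark-specific,
just the stock Properties object; the outputs in the comments are only examples):

  scala.util.Properties.versionString        // e.g. "version 2.11.2"
  scala.util.Properties.versionNumberString  // e.g. "2.11.2"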

 On Apr 17, 2015, at 9:57 PM, Sean Owen so...@cloudera.com wrote:
 
 You are running on 2.11.6, right? of course, it seems like that should
 all work, but it doesn't work for you. My point is that the shell you
 are saying doesn't work is Scala's 2.11.2 shell -- with some light
 modification.
 
 It's possible that the delta is the problem. I can't entirely make out
 whether the errors are Spark-specific; they involve Spark classes in
 some cases but they're assertion errors from Scala libraries.
 
 I don't know if this shell is supposed to work even across maintenance
 releases as-is, though that would be very nice. It's not an API for
 Scala.
 
 A good test of whether this idea has any merit would be to run with
 Scala 2.11.2. I'll copy this to JIRA for continuation.
 
 On Fri, Apr 17, 2015 at 10:31 PM, Michael Allman mich...@videoamp.com wrote:
 H... I don't follow. The 2.11.x series is supposed to be binary
 compatible against user code. Anyway, I was building Spark against 2.11.2
 and still saw the problems with the REPL. I've created a bug report:
 
 https://issues.apache.org/jira/browse/SPARK-6989
 
 I hope this helps.
 
 Cheers,
 
 Michael
 
 On Apr 17, 2015, at 1:41 AM, Sean Owen so...@cloudera.com wrote:
 
 Doesn't this reduce to "Scala isn't compatible with itself across
 maintenance releases"? Meaning, if this were fixed then Scala
 2.11.{x < 6} would have similar failures. It's not not-ready; it's
 just not the Scala 2.11.6 REPL. Still, sure I'd favor breaking the
 unofficial support to at least make the latest Scala 2.11 the unbroken
 one.
 
 On Fri, Apr 17, 2015 at 7:58 AM, Michael Allman mich...@videoamp.com
 wrote:
 
 FWIW, this is an essential feature to our use of Spark, and I'm surprised
 it's not advertised clearly as a limitation in the documentation. All I've
 found about running Spark 1.3 on 2.11 is here:
 
 http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
 
 Also, I'm experiencing some serious stability problems simply trying to run
 the Spark 1.3 Scala 2.11 REPL. Most of the time it fails to load and spews a
 torrent of compiler assertion failures, etc. See attached.
 
 
 
 Unfortunately, it appears the Spark 1.3 Scala 2.11 REPL is simply not ready
 for production use. I was going to file a bug, but it seems clear that the
 current implementation is going to need to be forward-ported to Scala 2.11.6
 anyway. We already have an issue for that:
 
 https://issues.apache.org/jira/browse/SPARK-6155
 
 Michael
 
 
 On Apr 9, 2015, at 10:29 PM, Prashant Sharma scrapco...@gmail.com wrote:
 
 You will have to go to this commit ID
 191d7cf2a655d032f160b9fa181730364681d0e7 in Apache spark. [1] Once you are
 at that commit, you need to review the changes done to the repl code and
 look for the relevant occurrences of the same code in scala 2.11 repl source
 and somehow make it all work.
 
 
 Thanks,
 
 
 
 
 
 1. http://githowto.com/getting_old_versions
 
 Prashant Sharma
 
 
 
 On Thu, Apr 9, 2015 at 4:40 PM, Alex Nakos ana...@gmail.com wrote:
 
 
 Ok, what do i need to do in order to migrate the patch?
 
 Thanks
 Alex
 
 On Thu, Apr 9, 2015 at 11:54 AM, Prashant Sharma scrapco...@gmail.com
 wrote:
 
 
 This is the jira I referred to
 https://issues.apache.org/jira/browse/SPARK-3256. Another reason for not
 working on it is evaluating priority between upgrading to scala 2.11.5(it is
 non trivial I suppose because repl has changed a bit) or migrating that
 patch is much simpler.
 
 Prashant Sharma
 
 
 
 On Thu, Apr 9, 2015 at 4:16 PM, Alex Nakos ana...@gmail.com wrote:
 
 
 Hi-
 
 Was this the JIRA issue?
 https://issues.apache.org/jira/browse/SPARK-2988
 
 Any help in getting this working would be much appreciated!
 
 Thanks
 Alex
 
 On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma scrapco...@gmail.com
 wrote:
 
 
 You are right this needs to be done. I can work on it soon, I was not
 sure if there is any one even using scala 2.11 spark repl. Actually there is
 a patch in scala 2.10 shell to support adding jars (Lost the JIRA ID), which
 has to be ported for scala 2.11 too. If however, you(or anyone else) are
 planning to work, I can help you ?
 
 Prashant Sharma
 
 
 
 On Thu, Apr 9, 2015 at 3:08 PM, anakos ana...@gmail.com wrote:
 
 
 Hi-
 
 I am having difficulty getting

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-09 Thread Alex Nakos
Ok, what do I need to do in order to migrate the patch?

Thanks
Alex

On Thu, Apr 9, 2015 at 11:54 AM, Prashant Sharma scrapco...@gmail.com
wrote:

 This is the jira I referred to
 https://issues.apache.org/jira/browse/SPARK-3256. Another reason for not
 working on it is evaluating priority between upgrading to scala 2.11.5(it
 is non trivial I suppose because repl has changed a bit) or migrating that
 patch is much simpler.

 Prashant Sharma



 On Thu, Apr 9, 2015 at 4:16 PM, Alex Nakos ana...@gmail.com wrote:

 Hi-

 Was this the JIRA issue? https://issues.apache.org/jira/browse/SPARK-2988

 Any help in getting this working would be much appreciated!

 Thanks
 Alex

 On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma scrapco...@gmail.com
 wrote:

 You are right this needs to be done. I can work on it soon, I was not
 sure if there is any one even using scala 2.11 spark repl. Actually there
 is a patch in scala 2.10 shell to support adding jars (Lost the JIRA ID),
 which has to be ported for scala 2.11 too. If however, you(or anyone else)
 are planning to work, I can help you ?

 Prashant Sharma



 On Thu, Apr 9, 2015 at 3:08 PM, anakos ana...@gmail.com wrote:

 Hi-

 I am having difficulty getting the 1.3.0 Spark shell to find an external
 jar.  I have build Spark locally for Scala 2.11 and I am starting the
 REPL
 as follows:

 bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar

 I see the following line in the console output:

 15/04/09 09:52:15 INFO spark.SparkContext: Added JAR

 file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar
 at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar
 with
 timestamp 1428569535904

 but when i try to import anything from this jar, it's simply not
 available.
 When I try to add the jar manually using the command

 :cp /path/to/jar

 the classes in the jar are still unavailable. I understand that 2.11 is
 not
 officially supported, but has anyone been able to get an external jar
 loaded
 in the 1.3.0 release?  Is this a known issue? I have tried searching
 around
 for answers but the only thing I've found that may be related is this:

 https://issues.apache.org/jira/browse/SPARK-3257

 Any/all help is much appreciated.
 Thanks
 Alex



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/External-JARs-not-loading-Spark-Shell-Scala-2-11-tp22434.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org







Re: External JARs not loading Spark Shell Scala 2.11

2015-04-09 Thread Alex Nakos
Hi-

Was this the JIRA issue? https://issues.apache.org/jira/browse/SPARK-2988

Any help in getting this working would be much appreciated!

Thanks
Alex

On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma scrapco...@gmail.com
wrote:

 You are right this needs to be done. I can work on it soon, I was not sure
 if there is any one even using scala 2.11 spark repl. Actually there is a
 patch in scala 2.10 shell to support adding jars (Lost the JIRA ID), which
 has to be ported for scala 2.11 too. If however, you(or anyone else) are
 planning to work, I can help you ?

 Prashant Sharma



 On Thu, Apr 9, 2015 at 3:08 PM, anakos ana...@gmail.com wrote:

 Hi-

 I am having difficulty getting the 1.3.0 Spark shell to find an external
 jar.  I have build Spark locally for Scala 2.11 and I am starting the REPL
 as follows:

 bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar

 I see the following line in the console output:

 15/04/09 09:52:15 INFO spark.SparkContext: Added JAR

 file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar
 at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar
 with
 timestamp 1428569535904

 but when i try to import anything from this jar, it's simply not
 available.
 When I try to add the jar manually using the command

 :cp /path/to/jar

 the classes in the jar are still unavailable. I understand that 2.11 is
 not
 officially supported, but has anyone been able to get an external jar
 loaded
 in the 1.3.0 release?  Is this a known issue? I have tried searching
 around
 for answers but the only thing I've found that may be related is this:

 https://issues.apache.org/jira/browse/SPARK-3257

 Any/all help is much appreciated.
 Thanks
 Alex



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/External-JARs-not-loading-Spark-Shell-Scala-2-11-tp22434.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org





Re: External JARs not loading Spark Shell Scala 2.11

2015-04-09 Thread Prashant Sharma
You are right, this needs to be done. I can work on it soon; I was not sure
if there is anyone even using the Scala 2.11 Spark REPL. Actually there is a
patch in the Scala 2.10 shell to support adding jars (lost the JIRA ID), which
has to be ported to Scala 2.11 too. If, however, you (or anyone else) are
planning to work on it, I can help you.

Prashant Sharma



On Thu, Apr 9, 2015 at 3:08 PM, anakos ana...@gmail.com wrote:

 Hi-

 I am having difficulty getting the 1.3.0 Spark shell to find an external
 jar.  I have build Spark locally for Scala 2.11 and I am starting the REPL
 as follows:

 bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar

 I see the following line in the console output:

 15/04/09 09:52:15 INFO spark.SparkContext: Added JAR

 file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar
 at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar with
 timestamp 1428569535904

 but when i try to import anything from this jar, it's simply not available.
 When I try to add the jar manually using the command

 :cp /path/to/jar

 the classes in the jar are still unavailable. I understand that 2.11 is not
 officially supported, but has anyone been able to get an external jar
 loaded
 in the 1.3.0 release?  Is this a known issue? I have tried searching around
 for answers but the only thing I've found that may be related is this:

 https://issues.apache.org/jira/browse/SPARK-3257

 Any/all help is much appreciated.
 Thanks
 Alex



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/External-JARs-not-loading-Spark-Shell-Scala-2-11-tp22434.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org
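
A quick way to tell from inside the shell whether a jar passed via --jars is
actually visible to the REPL class loader, as opposed to merely registered with
the SparkContext, is to resolve one of its classes by name (the class name below
is hypothetical; substitute one you know is in the jar):

  // Run inside spark-shell.
  try {
    Class.forName("com.example.export.EsDataExporter")  // hypothetical class
    println("class is visible to the REPL class loader")
  } catch {
    case _: ClassNotFoundException => println("jar is NOT on the REPL classpath")
  }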




Re: Building Spark 1.3 for Scala 2.11 using Maven

2015-03-12 Thread Fernando O.
Just FYI: what @Marcelo said fixed the issue for me.

On Fri, Mar 6, 2015 at 7:11 AM, Sean Owen so...@cloudera.com wrote:

 -Pscala-2.11 and -Dscala-2.11 will happen to do the same thing for this
 profile.

 Why are you running install package and not just install? Probably
 doesn't matter.

 This sounds like you are trying to only build core without building
 everything else, which you can't do in general unless you already
 built and installed these snapshot artifacts locally.

 On Fri, Mar 6, 2015 at 12:46 AM, Night Wolf nightwolf...@gmail.com
 wrote:
  Hey guys,
 
  Trying to build Spark 1.3 for Scala 2.11.
 
  I'm running with the folllowng Maven command;
 
  -DskipTests -Dscala-2.11 clean install package
 
 
  Exception:
 
  [ERROR] Failed to execute goal on project spark-core_2.10: Could not
 resolve
  dependencies for project
  org.apache.spark:spark-core_2.10:jar:1.3.0-SNAPSHOT: The following
 artifacts
  could not be resolved:
  org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT,
  org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT: Failure
 to
  find org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT in
  http://repository.apache.org/snapshots was cached in the local
 repository,
  resolution will not be reattempted until the update interval of
  apache.snapshots has elapsed or updates are forced - [Help 1]
 
 
  I see these warnings in the log before this error:
 
 
  [INFO]
  [INFO]
  
  [INFO] Building Spark Project Core 1.3.0-SNAPSHOT
  [INFO]
  
  [WARNING] The POM for
  org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT is
 missing, no
  dependency information available
  [WARNING] The POM for
  org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT is
 missing,
  no dependency information available
 
 
  Any ideas?

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




Re: Building Spark 1.3 for Scala 2.11 using Maven

2015-03-06 Thread Sean Owen
-Pscala-2.11 and -Dscala-2.11 will happen to do the same thing for this profile.

Why are you running 'install package' and not just 'install'? Probably
doesn't matter.

This sounds like you are trying to only build core without building
everything else, which you can't do in general unless you already
built and installed these snapshot artifacts locally.

On Fri, Mar 6, 2015 at 12:46 AM, Night Wolf nightwolf...@gmail.com wrote:
 Hey guys,

 Trying to build Spark 1.3 for Scala 2.11.

 I'm running with the folllowng Maven command;

 -DskipTests -Dscala-2.11 clean install package


 Exception:

 [ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve
 dependencies for project
 org.apache.spark:spark-core_2.10:jar:1.3.0-SNAPSHOT: The following artifacts
 could not be resolved:
 org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT,
 org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT: Failure to
 find org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT in
 http://repository.apache.org/snapshots was cached in the local repository,
 resolution will not be reattempted until the update interval of
 apache.snapshots has elapsed or updates are forced - [Help 1]


 I see these warnings in the log before this error:


 [INFO]
 [INFO]
 
 [INFO] Building Spark Project Core 1.3.0-SNAPSHOT
 [INFO]
 
 [WARNING] The POM for
 org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT is missing, no
 dependency information available
 [WARNING] The POM for
 org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT is missing,
 no dependency information available


 Any ideas?

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Building Spark 1.3 for Scala 2.11 using Maven

2015-03-05 Thread Night Wolf
Hey guys,

Trying to build Spark 1.3 for Scala 2.11.

I'm running with the following Maven command:

-DskipTests -Dscala-2.11 clean install package


*Exception*:

[ERROR] Failed to execute goal on project spark-core_2.10: Could not
resolve dependencies for project
org.apache.spark:spark-core_2.10:jar:1.3.0-SNAPSHOT: The following
artifacts could not be resolved:
org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT,
org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT:
Failure to find
org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT in
http://repository.apache.org/snapshots was cached in the local
repository, resolution will not be reattempted until the update
interval of apache.snapshots has elapsed or updates are forced -
[Help 1]


I see these warnings in the log before this error:


[INFO]
[INFO] 
[INFO] Building Spark Project Core 1.3.0-SNAPSHOT
[INFO] 
[WARNING] The POM for org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT
is missing, no dependency information available
[WARNING] The POM for org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT
is missing, no dependency information available


Any ideas?


Re: Building Spark 1.3 for Scala 2.11 using Maven

2015-03-05 Thread Marcelo Vanzin
I've never tried it, but I'm pretty sure at the very least you want
-Pscala-2.11 (not -D).

On Thu, Mar 5, 2015 at 4:46 PM, Night Wolf nightwolf...@gmail.com wrote:
 Hey guys,

 Trying to build Spark 1.3 for Scala 2.11.

 I'm running with the folllowng Maven command;

 -DskipTests -Dscala-2.11 clean install package


 Exception:

 [ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve
 dependencies for project
 org.apache.spark:spark-core_2.10:jar:1.3.0-SNAPSHOT: The following artifacts
 could not be resolved:
 org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT,
 org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT: Failure to
 find org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT in
 http://repository.apache.org/snapshots was cached in the local repository,
 resolution will not be reattempted until the update interval of
 apache.snapshots has elapsed or updates are forced - [Help 1]


 I see these warnings in the log before this error:


 [INFO]
 [INFO]
 
 [INFO] Building Spark Project Core 1.3.0-SNAPSHOT
 [INFO]
 
 [WARNING] The POM for
 org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT is missing, no
 dependency information available
 [WARNING] The POM for
 org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT is missing,
 no dependency information available


 Any ideas?



-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Building Spark 1.3 for Scala 2.11 using Maven

2015-03-05 Thread Marcelo Vanzin
Ah, and you may have to use dev/change-version-to-2.11.sh. (Again,
never tried compiling with scala 2.11.)

On Thu, Mar 5, 2015 at 4:52 PM, Marcelo Vanzin van...@cloudera.com wrote:
 I've never tried it, but I'm pretty sure in the very least you want
 -Pscala-2.11 (not -D).

 On Thu, Mar 5, 2015 at 4:46 PM, Night Wolf nightwolf...@gmail.com wrote:
 Hey guys,

 Trying to build Spark 1.3 for Scala 2.11.

 I'm running with the folllowng Maven command;

 -DskipTests -Dscala-2.11 clean install package


 Exception:

 [ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve
 dependencies for project
 org.apache.spark:spark-core_2.10:jar:1.3.0-SNAPSHOT: The following artifacts
 could not be resolved:
 org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT,
 org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT: Failure to
 find org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT in
 http://repository.apache.org/snapshots was cached in the local repository,
 resolution will not be reattempted until the update interval of
 apache.snapshots has elapsed or updates are forced - [Help 1]


 I see these warnings in the log before this error:


 [INFO]
 [INFO]
 
 [INFO] Building Spark Project Core 1.3.0-SNAPSHOT
 [INFO]
 
 [WARNING] The POM for
 org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT is missing, no
 dependency information available
 [WARNING] The POM for
 org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT is missing,
 no dependency information available


 Any ideas?



 --
 Marcelo



-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Spark 1.2.1: ClassNotFoundException when running hello world example in scala 2.11

2015-02-19 Thread Luis Solano
I'm having an issue with spark 1.2.1 and scala 2.11. I detailed the
symptoms in this stackoverflow question.

http://stackoverflow.com/questions/28612837/spark-classnotfoundexception-when-running-hello-world-example-in-scala-2-11

Has anyone experienced anything similar?

Thank you!


Re: Spark 1.2.1: ClassNotFoundException when running hello world example in scala 2.11

2015-02-19 Thread Akhil Das
Can you downgrade your scala dependency to 2.10 and give it a try?

Thanks
Best Regards

On Fri, Feb 20, 2015 at 12:40 AM, Luis Solano l...@pixable.com wrote:

 I'm having an issue with spark 1.2.1 and scala 2.11. I detailed the
 symptoms in this stackoverflow question.


 http://stackoverflow.com/questions/28612837/spark-classnotfoundexception-when-running-hello-world-example-in-scala-2-11

 Has anyone experienced anything similar?

 Thank you!



Re: maven doesn't build dependencies with Scala 2.11

2015-02-05 Thread Ted Yu
Now that Kafka 0.8.2.0 has been released, adding external/kafka module
works.

FYI

On Sun, Jan 18, 2015 at 7:36 PM, Ted Yu yuzhih...@gmail.com wrote:

 bq. there was no 2.11 Kafka available

 That's right. Adding external/kafka module resulted in:

 [ERROR] Failed to execute goal on project spark-streaming-kafka_2.11:
 Could not resolve dependencies for project
 org.apache.spark:spark-streaming-kafka_2.11:jar:1.3.0-SNAPSHOT: Could not
 find artifact org.apache.kafka:kafka_2.11:jar:0.8.0 in central (
 https://repo1.maven.org/maven2) - [Help 1]

 Cheers

 On Sun, Jan 18, 2015 at 10:41 AM, Sean Owen so...@cloudera.com wrote:

 I could be wrong, but I thought this was on purpose. At the time it
 was set up, there was no 2.11 Kafka available? or one of its
 dependencies wouldn't work with 2.11?

 But I'm not sure what the OP means by maven doesn't build Spark's
 dependencies because Ted indicates it does, and of course you can see
 that these artifacts are published.

 On Sun, Jan 18, 2015 at 2:46 AM, Ted Yu yuzhih...@gmail.com wrote:
  There're 3 jars under lib_managed/jars directory with and without
  -Dscala-2.11 flag.
 
  Difference between scala-2.10 and scala-2.11 profiles is that scala-2.10
  profile has the following:
    <modules>
      <module>external/kafka</module>
    </modules>
 
  FYI
 
  On Sat, Jan 17, 2015 at 4:07 PM, Ted Yu yuzhih...@gmail.com wrote:
 
  I did the following:
   1655  dev/change-version-to-2.11.sh
   1657  mvn -DHADOOP_PROFILE=hadoop-2.4 -Pyarn,hive -Phadoop-2.4
  -Dscala-2.11 -DskipTests clean package
 
  And mvn command passed.
 
  Did you see any cross-compilation errors ?
 
  Cheers
 
  BTW the two links you mentioned are consistent in terms of building for
  Scala 2.11
 
  On Sat, Jan 17, 2015 at 3:43 PM, Walrus theCat walrusthe...@gmail.com
 
  wrote:
 
  Hi,
 
  When I run this:
 
  dev/change-version-to-2.11.sh
  mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
 
  as per here, maven doesn't build Spark's dependencies.
 
  Only when I run:
 
  dev/change-version-to-2.11.sh
  sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests  clean package
 
  as gathered from here, do I get Spark's dependencies built without any
  cross-compilation errors.
 
  Question:
 
  - How can I make maven do this?
 
  - How can I specify the use of Scala 2.11 in my own .pom files?
 
  Thanks
 
 
 





Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-31 Thread Ted Yu
Looking at https://github.com/apache/spark/pull/1222/files , the following
change may have caused what Stephen described:

+ if (!fileSystem.isDirectory(new Path(logBaseDir))) {

When there is no scheme associated with logBaseDir, a local path should be
assumed.
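
A minimal sketch (not the actual Spark code, just the fallback being described) of
treating a scheme-less log directory as a local path:

  import java.io.File
  import java.net.URI

  // If the configured dir carries no URI scheme, fall back to a local file: URI;
  // otherwise keep whatever scheme (hdfs://, file://, ...) the user supplied.
  def resolveLogDir(logBaseDir: String): URI = {
    val uri = new URI(logBaseDir)
    if (uri.getScheme != null) uri
    else new File(logBaseDir).getAbsoluteFile.toURI  // "/mnt/..." becomes "file:/mnt/..."
  }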

On Fri, Jan 30, 2015 at 8:37 AM, Stephen Haberman 
stephen.haber...@gmail.com wrote:

 Hi Krishna/all,

 I think I found it, and it wasn't related to Scala-2.11...

 I had spark.eventLog.dir=/mnt/spark/work/history, which worked
 in Spark 1.2, but now am running Spark master, and it wants a
 Hadoop URI, e.g. file:///mnt/spark/work/history (I believe due to
 commit 45645191).

 This looks like a breaking change to the spark.eventLog.dir
 config property.

 Perhaps it should be patched to convert the previously supported
 just a file path values to HDFS-compatible file://... URIs
 for backwards compatibility?

 - Stephen


 On Wed, 28 Jan 2015 12:27:17 -0800
 Krishna Sankar ksanka...@gmail.com wrote:

  Stephen,
 Scala 2.11 worked fine for me. Did the dev change and then
  compile. Not using in production, but I go back and forth
   between 2.10 & 2.11. Cheers
  k/
 
  On Wed, Jan 28, 2015 at 12:18 PM, Stephen Haberman 
  stephen.haber...@gmail.com wrote:
 
   Hey,
  
   I recently compiled Spark master against scala-2.11 (by
   running the dev/change-versions script), but when I run
   spark-shell, it looks like the sc variable is missing.
  
   Is this a known/unknown issue? Are others successfully using
   Spark with scala-2.11, and specifically spark-shell?
  
   It is possible I did something dumb while compiling master,
   but I'm not sure what it would be.
  
   Thanks,
   Stephen
  
   -
   To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
   For additional commands, e-mail: user-h...@spark.apache.org
  
  


 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-31 Thread Stephen Haberman

 Looking at https://github.com/apache/spark/pull/1222/files ,
 the following change may have caused what Stephen described:
 
 + if (!fileSystem.isDirectory(new Path(logBaseDir))) {
 
 When there is no schema associated with logBaseDir, local path
 should be assumed.

Yes, that looks right. In branch-1.2, it looks like:

logDir goes through getLogDirPath:

https://github.com/apache/spark/blob/branch-1.2/core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala#L61

Which calls resolveUri:

https://github.com/apache/spark/blob/branch-1.2/core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala#L185

Which prepends the file scheme if needed:

https://github.com/apache/spark/blob/branch-1.2/core/src/main/scala/org/apache/spark/util/Utils.scala#L1588

So, raw/scheme-less /some/dir paths were previously supported.

However, now on master, logBaseDir is passed to
getHadoopFileSystem directly:

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala#L64

Note that getLogDirPath was renamed to just getLogPath:

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala#L264

But, right, per Ted Yu's comment, it's not used when creating
the file system.

I'd file a pull request but Eclipse/maven/etc. is being dumb.

Can someone/a regular spark dev pick this up? Or else I can keep
fighting Eclipse/m2e for awhile.

- Stephen


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-31 Thread Ted Yu
Understood.

However, the previous default was a local directory. Now the user has to specify
the file:// scheme.

Maybe add a release note to SPARK-2261?

Cheers

On Sat, Jan 31, 2015 at 8:40 AM, Sean Owen so...@cloudera.com wrote:

 This might have been on purpose, since the goal is to make this
 HDFS-friendly, and of course still allow local directories. With no
 scheme, a path is ambiguous.

 On Sat, Jan 31, 2015 at 4:18 PM, Ted Yu yuzhih...@gmail.com wrote:
  Looking at https://github.com/apache/spark/pull/1222/files , the
 following
  change may have caused what Stephen described:
 
  + if (!fileSystem.isDirectory(new Path(logBaseDir))) {
 
  When there is no schema associated with logBaseDir, local path should be
  assumed.
 
  On Fri, Jan 30, 2015 at 8:37 AM, Stephen Haberman
  stephen.haber...@gmail.com wrote:
 
  Hi Krishna/all,
 
  I think I found it, and it wasn't related to Scala-2.11...
 
  I had spark.eventLog.dir=/mnt/spark/work/history, which worked
  in Spark 1.2, but now am running Spark master, and it wants a
  Hadoop URI, e.g. file:///mnt/spark/work/history (I believe due to
  commit 45645191).
 
  This looks like a breaking change to the spark.eventLog.dir
  config property.
 
  Perhaps it should be patched to convert the previously supported
  just a file path values to HDFS-compatible file://... URIs
  for backwards compatibility?
 
  - Stephen
 
 
  On Wed, 28 Jan 2015 12:27:17 -0800
  Krishna Sankar ksanka...@gmail.com wrote:
 
   Stephen,
  Scala 2.11 worked fine for me. Did the dev change and then
   compile. Not using in production, but I go back and forth
    between 2.10 & 2.11. Cheers
   k/
  
   On Wed, Jan 28, 2015 at 12:18 PM, Stephen Haberman 
   stephen.haber...@gmail.com wrote:
  
Hey,
   
I recently compiled Spark master against scala-2.11 (by
running the dev/change-versions script), but when I run
spark-shell, it looks like the sc variable is missing.
   
Is this a known/unknown issue? Are others successfully using
Spark with scala-2.11, and specifically spark-shell?
   
It is possible I did something dumb while compiling master,
but I'm not sure what it would be.
   
Thanks,
Stephen
   
   
 -
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
   
   
 
 
  -
  To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
  For additional commands, e-mail: user-h...@spark.apache.org
 
 



Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-31 Thread Sean Owen
This might have been on purpose, since the goal is to make this
HDFS-friendly, and of course still allow local directories. With no
scheme, a path is ambiguous.

On Sat, Jan 31, 2015 at 4:18 PM, Ted Yu yuzhih...@gmail.com wrote:
 Looking at https://github.com/apache/spark/pull/1222/files , the following
 change may have caused what Stephen described:

 + if (!fileSystem.isDirectory(new Path(logBaseDir))) {

 When there is no schema associated with logBaseDir, local path should be
 assumed.

 On Fri, Jan 30, 2015 at 8:37 AM, Stephen Haberman
 stephen.haber...@gmail.com wrote:

 Hi Krishna/all,

 I think I found it, and it wasn't related to Scala-2.11...

 I had spark.eventLog.dir=/mnt/spark/work/history, which worked
 in Spark 1.2, but now am running Spark master, and it wants a
 Hadoop URI, e.g. file:///mnt/spark/work/history (I believe due to
 commit 45645191).

 This looks like a breaking change to the spark.eventLog.dir
 config property.

 Perhaps it should be patched to convert the previously supported
 just a file path values to HDFS-compatible file://... URIs
 for backwards compatibility?

 - Stephen


 On Wed, 28 Jan 2015 12:27:17 -0800
 Krishna Sankar ksanka...@gmail.com wrote:

  Stephen,
 Scala 2.11 worked fine for me. Did the dev change and then
  compile. Not using in production, but I go back and forth
   between 2.10 & 2.11. Cheers
  k/
 
  On Wed, Jan 28, 2015 at 12:18 PM, Stephen Haberman 
  stephen.haber...@gmail.com wrote:
 
   Hey,
  
   I recently compiled Spark master against scala-2.11 (by
   running the dev/change-versions script), but when I run
   spark-shell, it looks like the sc variable is missing.
  
   Is this a known/unknown issue? Are others successfully using
   Spark with scala-2.11, and specifically spark-shell?
  
   It is possible I did something dumb while compiling master,
   but I'm not sure what it would be.
  
   Thanks,
   Stephen
  
   -
   To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
   For additional commands, e-mail: user-h...@spark.apache.org
  
  


 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org



-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-30 Thread Stephen Haberman
Hi Krishna/all,

I think I found it, and it wasn't related to Scala-2.11...

I had spark.eventLog.dir=/mnt/spark/work/history, which worked
in Spark 1.2, but now am running Spark master, and it wants a
Hadoop URI, e.g. file:///mnt/spark/work/history (I believe due to
commit 45645191).

This looks like a breaking change to the spark.eventLog.dir
config property.

Perhaps it should be patched to convert the previously supported
just a file path values to HDFS-compatible file://... URIs
for backwards compatibility?

- Stephen
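
For anyone hitting the same thing, a minimal sketch of the workaround (property
names are the standard Spark ones; the path is just an example) is to spell the
scheme out explicitly:

  import org.apache.spark.SparkConf

  // A bare /mnt/... path was accepted in 1.2 but is no longer resolved for you
  // on master, so give the event log dir an explicit file:// scheme.
  val conf = new SparkConf()
    .set("spark.eventLog.enabled", "true")
    .set("spark.eventLog.dir", "file:///mnt/spark/work/history")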


On Wed, 28 Jan 2015 12:27:17 -0800
Krishna Sankar ksanka...@gmail.com wrote:

 Stephen,
Scala 2.11 worked fine for me. Did the dev change and then
 compile. Not using in production, but I go back and forth
  between 2.10 & 2.11. Cheers
 k/
 
 On Wed, Jan 28, 2015 at 12:18 PM, Stephen Haberman 
 stephen.haber...@gmail.com wrote:
 
  Hey,
 
  I recently compiled Spark master against scala-2.11 (by
  running the dev/change-versions script), but when I run
  spark-shell, it looks like the sc variable is missing.
 
  Is this a known/unknown issue? Are others successfully using
  Spark with scala-2.11, and specifically spark-shell?
 
  It is possible I did something dumb while compiling master,
  but I'm not sure what it would be.
 
  Thanks,
  Stephen
 
  -
  To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
  For additional commands, e-mail: user-h...@spark.apache.org
 
 


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: spark-shell working in scala-2.11

2015-01-28 Thread Krishna Sankar
Stephen,
   Scala 2.11 worked fine for me. Did the dev change and then compile. Not
using in production, but I go back and forth between 2.10 & 2.11.
Cheers
k/

On Wed, Jan 28, 2015 at 12:18 PM, Stephen Haberman 
stephen.haber...@gmail.com wrote:

 Hey,

 I recently compiled Spark master against scala-2.11 (by running
 the dev/change-versions script), but when I run spark-shell,
 it looks like the sc variable is missing.

 Is this a known/unknown issue? Are others successfully using
 Spark with scala-2.11, and specifically spark-shell?

 It is possible I did something dumb while compiling master,
 but I'm not sure what it would be.

 Thanks,
 Stephen

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




spark-shell working in scala-2.11

2015-01-28 Thread Stephen Haberman
Hey,

I recently compiled Spark master against scala-2.11 (by running
the dev/change-versions script), but when I run spark-shell,
it looks like the sc variable is missing.

Is this a known/unknown issue? Are others successfully using
Spark with scala-2.11, and specifically spark-shell?

It is possible I did something dumb while compiling master,
but I'm not sure what it would be.

Thanks,
Stephen

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: maven doesn't build dependencies with Scala 2.11

2015-01-18 Thread Ted Yu
bq. there was no 2.11 Kafka available

That's right. Adding external/kafka module resulted in:

[ERROR] Failed to execute goal on project spark-streaming-kafka_2.11: Could
not resolve dependencies for project
org.apache.spark:spark-streaming-kafka_2.11:jar:1.3.0-SNAPSHOT: Could not
find artifact org.apache.kafka:kafka_2.11:jar:0.8.0 in central (
https://repo1.maven.org/maven2) - [Help 1]

Cheers

On Sun, Jan 18, 2015 at 10:41 AM, Sean Owen so...@cloudera.com wrote:

 I could be wrong, but I thought this was on purpose. At the time it
 was set up, there was no 2.11 Kafka available? or one of its
 dependencies wouldn't work with 2.11?

 But I'm not sure what the OP means by maven doesn't build Spark's
 dependencies because Ted indicates it does, and of course you can see
 that these artifacts are published.

 On Sun, Jan 18, 2015 at 2:46 AM, Ted Yu yuzhih...@gmail.com wrote:
  There're 3 jars under lib_managed/jars directory with and without
  -Dscala-2.11 flag.
 
  Difference between scala-2.10 and scala-2.11 profiles is that scala-2.10
  profile has the following:
  <modules>
    <module>external/kafka</module>
  </modules>
 
  FYI
 
  On Sat, Jan 17, 2015 at 4:07 PM, Ted Yu yuzhih...@gmail.com wrote:
 
  I did the following:
   1655  dev/change-version-to-2.11.sh
   1657  mvn -DHADOOP_PROFILE=hadoop-2.4 -Pyarn,hive -Phadoop-2.4
  -Dscala-2.11 -DskipTests clean package
 
  And mvn command passed.
 
  Did you see any cross-compilation errors ?
 
  Cheers
 
  BTW the two links you mentioned are consistent in terms of building for
  Scala 2.11
 
  On Sat, Jan 17, 2015 at 3:43 PM, Walrus theCat walrusthe...@gmail.com
  wrote:
 
  Hi,
 
  When I run this:
 
  dev/change-version-to-2.11.sh
  mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
 
  as per here, maven doesn't build Spark's dependencies.
 
  Only when I run:
 
  dev/change-version-to-2.11.sh
  sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests  clean package
 
  as gathered from here, do I get Spark's dependencies built without any
  cross-compilation errors.
 
  Question:
 
  - How can I make maven do this?
 
  - How can I specify the use of Scala 2.11 in my own .pom files?
 
  Thanks
 
 
 



Re: maven doesn't build dependencies with Scala 2.11

2015-01-18 Thread Sean Owen
I could be wrong, but I thought this was on purpose. At the time it
was set up, there was no 2.11 Kafka available? or one of its
dependencies wouldn't work with 2.11?

But I'm not sure what the OP means by "maven doesn't build Spark's
dependencies", because Ted indicates it does, and of course you can see
that these artifacts are published.

On Sun, Jan 18, 2015 at 2:46 AM, Ted Yu yuzhih...@gmail.com wrote:
 There're 3 jars under lib_managed/jars directory with and without
 -Dscala-2.11 flag.

 Difference between scala-2.10 and scala-2.11 profiles is that scala-2.10
 profile has the following:
    <modules>
      <module>external/kafka</module>
    </modules>

 FYI

 On Sat, Jan 17, 2015 at 4:07 PM, Ted Yu yuzhih...@gmail.com wrote:

 I did the following:
  1655  dev/change-version-to-2.11.sh
  1657  mvn -DHADOOP_PROFILE=hadoop-2.4 -Pyarn,hive -Phadoop-2.4
 -Dscala-2.11 -DskipTests clean package

 And mvn command passed.

 Did you see any cross-compilation errors ?

 Cheers

 BTW the two links you mentioned are consistent in terms of building for
 Scala 2.11

 On Sat, Jan 17, 2015 at 3:43 PM, Walrus theCat walrusthe...@gmail.com
 wrote:

 Hi,

 When I run this:

 dev/change-version-to-2.11.sh
 mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

 as per here, maven doesn't build Spark's dependencies.

 Only when I run:

 dev/change-version-to-2.11.sh
 sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests  clean package

 as gathered from here, do I get Spark's dependencies built without any
 cross-compilation errors.

 Question:

 - How can I make maven do this?

 - How can I specify the use of Scala 2.11 in my own .pom files?

 Thanks




-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



maven doesn't build dependencies with Scala 2.11

2015-01-17 Thread Walrus theCat
Hi,

When I run this:

dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

as per here
https://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211,
maven doesn't build Spark's dependencies.

Only when I run:

dev/change-version-to-2.11.sh
sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests  clean package

as gathered from here
https://github.com/ScrapCodes/spark-1/blob/patch-3/docs/building-spark.md,
do I get Spark's dependencies built without any cross-compilation
errors.

*Question*:

- How can I make maven do this?

- How can I specify the use of Scala 2.11 in my own .pom files?

Thanks
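
On the second question: the Scala binary version is encoded in the artifactId
(spark-core_2.10 vs spark-core_2.11), so in a pom you reference the _2.11
artifactId directly. A minimal sbt sketch of the same thing (version numbers here
are examples only; %% appends the Scala suffix for you):

  // build.sbt sketch: with scalaVersion 2.11.x, %% resolves spark-core_2.11.
  scalaVersion := "2.11.4"

  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"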


Re: maven doesn't build dependencies with Scala 2.11

2015-01-17 Thread Ted Yu
There're 3 jars under lib_managed/jars directory with and
without -Dscala-2.11 flag.

Difference between scala-2.10 and scala-2.11 profiles is that scala-2.10
profile has the following:
  <modules>
    <module>external/kafka</module>
  </modules>

FYI

On Sat, Jan 17, 2015 at 4:07 PM, Ted Yu yuzhih...@gmail.com wrote:

 I did the following:
  1655  dev/change-version-to-2.11.sh
  1657  mvn -DHADOOP_PROFILE=hadoop-2.4 -Pyarn,hive -Phadoop-2.4
 -Dscala-2.11 -DskipTests clean package

 And mvn command passed.

 Did you see any cross-compilation errors ?

 Cheers

 BTW the two links you mentioned are consistent in terms of building for
 Scala 2.11

 On Sat, Jan 17, 2015 at 3:43 PM, Walrus theCat walrusthe...@gmail.com
 wrote:

 Hi,

 When I run this:

 dev/change-version-to-2.11.sh
 mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

 as per here
 https://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211,
 maven doesn't build Spark's dependencies.

 Only when I run:

 dev/change-version-to-2.11.sh
 sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests  clean package

 as gathered from here 
 https://github.com/ScrapCodes/spark-1/blob/patch-3/docs/building-spark.md, 
 do I get Spark's dependencies built without any cross-compilation errors.

 *Question*:

 - How can I make maven do this?

 - How can I specify the use of Scala 2.11 in my own .pom files?

 Thanks





Re: maven doesn't build dependencies with Scala 2.11

2015-01-17 Thread Ted Yu
I did the following:
 1655  dev/change-version-to-2.11.sh
 1657  mvn -DHADOOP_PROFILE=hadoop-2.4 -Pyarn,hive -Phadoop-2.4
-Dscala-2.11 -DskipTests clean package

And mvn command passed.

Did you see any cross-compilation errors ?

Cheers

BTW the two links you mentioned are consistent in terms of building for
Scala 2.11

On Sat, Jan 17, 2015 at 3:43 PM, Walrus theCat walrusthe...@gmail.com
wrote:

 Hi,

 When I run this:

 dev/change-version-to-2.11.sh
 mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

 as per here
 https://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211,
 maven doesn't build Spark's dependencies.

 Only when I run:

 dev/change-version-to-2.11.sh
 sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests  clean package

 as gathered from here 
 https://github.com/ScrapCodes/spark-1/blob/patch-3/docs/building-spark.md, 
 do I get Spark's dependencies built without any cross-compilation errors.

 *Question*:

 - How can I make maven do this?

 - How can I specify the use of Scala 2.11 in my own .pom files?

 Thanks




Does Spark 1.2.0 support Scala 2.11?

2014-12-19 Thread Jonathan Chayat
The following ticket:

https://issues.apache.org/jira/browse/SPARK-1812

for supporting 2.11 has been marked as fixed in 1.2,
but the docs in the Spark site still say that 2.10 is required.

Thanks,
Jon


Re: Does Spark 1.2.0 support Scala 2.11?

2014-12-19 Thread Gerard Maas
Check out the 'compiling for Scala 2.11'  instructions:

http://spark.apache.org/docs/1.2.0/building-spark.html#building-for-scala-211

-kr, Gerard.

On Fri, Dec 19, 2014 at 12:00 PM, Jonathan Chayat jonatha...@supersonic.com
 wrote:

 The following ticket:

 https://issues.apache.org/jira/browse/SPARK-1812

 for supporting 2.11 have been marked as fixed in 1.2,
 but the docs in the Spark site still say that 2.10 is required.

 Thanks,
 Jon



Re: Does Spark 1.2.0 support Scala 2.11?

2014-12-19 Thread Sean Owen
You might interpret that as 2.10+. Although 2.10 is still the main
version in use, I think, you can see 2.11 artifacts have been
published: 
http://search.maven.org/#artifactdetails%7Corg.apache.spark%7Cspark-core_2.11%7C1.2.0%7Cjar

On Fri, Dec 19, 2014 at 11:00 AM, Jonathan Chayat
jonatha...@supersonic.com wrote:
 The following ticket:

 https://issues.apache.org/jira/browse/SPARK-1812

 for supporting 2.11 have been marked as fixed in 1.2,
 but the docs in the Spark site still say that 2.10 is required.

 Thanks,
 Jon

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-18 Thread Jianshi Huang
Ok, I'll wait until -Pscala-2.11 is more stable and used by more people.

Thanks for the help!

Jianshi

On Tue, Nov 18, 2014 at 3:49 PM, Ye Xianjin advance...@gmail.com wrote:

 Hi Prashant Sharma,

 It's not even ok to build with scala-2.11 profile on my machine.

 Just check out the master(c6e0c2ab1c29c184a9302d23ad75e4ccd8060242)
 run sbt/sbt -Pscala-2.11 clean assembly:

 .. skip the normal part
 info] Resolving org.scalamacros#quasiquotes_2.11;2.0.1 ...
 [warn] module not found: org.scalamacros#quasiquotes_2.11;2.0.1
 [warn]  local: tried
 [warn]
 /Users/yexianjin/.ivy2/local/org.scalamacros/quasiquotes_2.11/2.0.1/ivys/ivy.xml
 [warn]  public: tried
 [warn]
 https://repo1.maven.org/maven2/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  central: tried
 [warn]
 https://repo1.maven.org/maven2/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  apache-repo: tried
 [warn]
 https://repository.apache.org/content/repositories/releases/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  jboss-repo: tried
 [warn]
 https://repository.jboss.org/nexus/content/repositories/releases/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  mqtt-repo: tried
 [warn]
 https://repo.eclipse.org/content/repositories/paho-releases/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  cloudera-repo: tried
 [warn]
 https://repository.cloudera.com/artifactory/cloudera-repos/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  mapr-repo: tried
 [warn]
 http://repository.mapr.com/maven/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  spring-releases: tried
 [warn]
 https://repo.spring.io/libs-release/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  spark-staging: tried
 [warn]
 https://oss.sonatype.org/content/repositories/orgspark-project-1085/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  spark-staging-hive13: tried
 [warn]
 https://oss.sonatype.org/content/repositories/orgspark-project-1089/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  apache.snapshots: tried
 [warn]
 http://repository.apache.org/snapshots/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  Maven2 Local: tried
 [warn]
 file:/Users/yexianjin/.m2/repository/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [info] Resolving jline#jline;2.12 ...
 [warn] ::
 [warn] ::  UNRESOLVED DEPENDENCIES ::
 [warn] ::
 [warn] :: org.scalamacros#quasiquotes_2.11;2.0.1: not found
 [warn] ::
 [info] Resolving org.scala-lang#scala-library;2.11.2 ...
 [warn]
 [warn] Note: Unresolved dependencies path:
 [warn] org.scalamacros:quasiquotes_2.11:2.0.1
 ((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76)
 [warn]   +- org.apache.spark:spark-catalyst_2.11:1.2.0-SNAPSHOT
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Updating {file:/Users/yexianjin/spark/}streaming-twitter...
 [info] Updating {file:/Users/yexianjin/spark/}streaming-zeromq...
 [info] Updating {file:/Users/yexianjin/spark/}streaming-flume...
 [info] Updating {file:/Users/yexianjin/spark/}streaming-mqtt...
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Resolving com.esotericsoftware.minlog#minlog;1.2 ...
 [info] Updating {file:/Users/yexianjin/spark/}streaming-kafka...
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Resolving org.apache.kafka#kafka_2.11;0.8.0 ...
 [warn] module not found: org.apache.kafka#kafka_2.11;0.8.0
 [warn]  local: tried
 [warn]
 /Users/yexianjin/.ivy2/local/org.apache.kafka/kafka_2.11/0.8.0/ivys/ivy.xml
 [warn]  public: tried
 [warn]
 https://repo1.maven.org/maven2/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  central: tried
 [warn]
 https://repo1.maven.org/maven2/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  apache-repo: tried
 [warn]
 https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  jboss-repo: tried
 [warn]
 https://repository.jboss.org/nexus/content/repositories/releases/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  mqtt-repo: tried
 [warn]
 https://repo.eclipse.org/content/repositories/paho-releases/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  cloudera-repo: tried
 [warn]
 https://repository.cloudera.com/artifactory/cloudera-repos/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  mapr-repo: tried
 [warn]
 http://repository.mapr.com

Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Jianshi Huang
Any notable issues for using Scala 2.11? Is it stable now?

Or can I use Scala 2.11 in my Spark application and use a Spark dist built
with 2.10?

I'm looking forward to migrating to 2.11 for some quasiquote features.
Couldn't make it run in 2.10...

Cheers,
-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github  Blog: http://huangjs.github.com/
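
As an aside on the quasiquote point: in Scala 2.11 quasiquotes ship with
scala-reflect, so something like the sketch below works out of the box, whereas in
2.10 it needs the macro paradise / quasiquotes compiler plugin. The tree built here
is just an illustration:

  import scala.reflect.runtime.universe._  // needs scala-reflect on the classpath

  // Build a tree with a quasiquote and print it back as (approximately) source.
  val tree = q"List(1, 2, 3).map(_ + 1)"
  println(show(tree))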


Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Prashant Sharma
It is safe in the sense we would help you with the fix if you run into
issues. I have used it, but since I worked on the patch the opinion can be
biased. I am using scala 2.11 for day to day development. You should
checkout the build instructions here :
https://github.com/ScrapCodes/spark-1/blob/patch-3/docs/building-spark.md

Prashant Sharma



On Tue, Nov 18, 2014 at 12:19 PM, Jianshi Huang jianshi.hu...@gmail.com
wrote:

 Any notable issues for using Scala 2.11? Is it stable now?

 Or can I use Scala 2.11 in my spark application and use Spark dist build
 with 2.10 ?

 I'm looking forward to migrate to 2.11 for some quasiquote features.
 Couldn't make it run in 2.10...

 Cheers,
 --
 Jianshi Huang

 LinkedIn: jianshi
 Twitter: @jshuang
 Github  Blog: http://huangjs.github.com/


