Re: Spark 1.4 RDD to DF fails with toDF()

2016-01-04 Thread Fab
Good catch, thanks. Things work now after changing the version.

For reference, I got the" 2.11" version from my separate download of Scala:
$ scala
Welcome to Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java
1.7.0_67).

But my Spark version is indeed running Scala "2.10":
$ spark-shell
16/01/04 12:47:18 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java
1.7.0_67)

After changing my sbt file, I could finally submit my Scala code to Spark.
Thanks!
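
For anyone hitting the same mismatch: the fix is making scalaVersion in
build.sbt match the Scala version the Spark distribution reports at startup.
A minimal sketch, assuming a prebuilt Spark 1.5.2 on Scala 2.10 (the project
name is a placeholder):

name := "myApp"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.5.2" % "provided"
)

With scalaVersion on 2.10.x, the %% operator resolves both dependencies to
their _2.10 artifacts, matching the distribution.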


On Mon, Jan 4, 2016 at 12:26 PM, nraychaudhuri [via Apache Spark User List] wrote:

> Is the Spark distribution you are using built with Scala 2.11? I think the
> default one is built using Scala 2.10.
>
> Nilanjan





Re: Spark 1.4 RDD to DF fails with toDF()

2015-09-08 Thread Gheorghe Postelnicu
Good point. It is a pre-compiled Spark version. Based on the text on the
downloads page, the answer to your question is no, so I will download the
sources and recompile.

Thanks!
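
For reference, the Spark 1.4 build documentation describes producing a Scala
2.11 build roughly like this (a sketch; the exact profile flags depend on
your Hadoop version, and 2.11 support was still flagged experimental in 1.4):

dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package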

On Tue, Sep 8, 2015 at 5:17 AM, Koert Kuipers  wrote:

> is /opt/spark-1.4.1-bin-hadoop2.6 a spark version compiled with scala
> 2.11?
>
> On Mon, Sep 7, 2015 at 5:29 PM, Gheorghe Postelnicu <
> gheorghe.posteln...@gmail.com> wrote:
>
>> sbt assembly; $SPARK_HOME/bin/spark-submit --class main.scala.TestMain
>> --master "local[4]" target/scala-2.11/bof-assembly-0.1-SNAPSHOT.jar
>>
>> using Spark:
>>
>> /opt/spark-1.4.1-bin-hadoop2.6
>>
>> On Mon, Sep 7, 2015 at 10:20 PM, Jonathan Coveney 
>> wrote:
>>
>>> How are you building and running it?
>>>
>>>
>>> El lunes, 7 de septiembre de 2015, Gheorghe Postelnicu <
>>> gheorghe.posteln...@gmail.com> escribió:
>>>
 Interesting idea. Tried that, didn't work. Here is my new SBT file:

 name := """testMain"""

 scalaVersion := "2.11.6"

 libraryDependencies ++= Seq(
   "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
   "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
   "org.scala-lang" % "scala-reflect" % "2.11.6"
 )


 On Mon, Sep 7, 2015 at 9:55 PM, Jonathan Coveney 
 wrote:

> Try adding the following to your build.sbt
>
> libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.6"
>
>
> I believe that Spark shades the Scala library, and it looks like you need
> this library in an unshaded way.
>
>
> 2015-09-07 16:48 GMT-04:00 Gheorghe Postelnicu <
> gheorghe.posteln...@gmail.com>:
>
>> Hi,
>>
>> The following code fails when compiled from SBT:
>>
>> package main.scala
>>
>> import org.apache.spark.SparkContext
>> import org.apache.spark.sql.SQLContext
>>
>> object TestMain {
>>   def main(args: Array[String]): Unit = {
>>     implicit val sparkContext = new SparkContext()
>>     val sqlContext = new SQLContext(sparkContext)
>>     import sqlContext.implicits._
>>     sparkContext.parallelize(1 to 10)
>>       .map(i => (i, i.toString))
>>       .toDF("intCol", "strCol")
>>   }
>> }
>>
>> with the following error:
>>
>> 15/09/07 21:39:21 INFO BlockManagerMaster: Registered BlockManager
>> Exception in thread "main" java.lang.NoSuchMethodError:
>> scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
>> at main.scala.Bof$.main(Bof.scala:14)
>> at main.scala.Bof.main(Bof.scala)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:497)
>> at
>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
>> at
>> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> 15/09/07 21:39:22 INFO SparkContext: Invoking stop() from shutdown
>> hook
>>
>> whereas the code above works in a spark shell.
>>
>> The code is compiled using Scala 2.11.6 and precompiled Spark 1.4.1
>>
>> Any suggestion on how to fix this would be much appreciated.
>>
>> Best,
>> Gheorghe
>>
>>
>

>>
>


Re: Spark 1.4 RDD to DF fails with toDF()

2015-09-08 Thread Gheorghe Postelnicu
Compiling from source with Scala 2.11 support fixed this issue. Thanks
again for the help!
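
A quick way to verify a rebuilt distribution is to ask the REPL for its Scala
version from inside spark-shell (illustrative output):

scala> scala.util.Properties.versionString
res0: String = version 2.11.6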



On Tue, Sep 8, 2015 at 7:33 AM, Gheorghe Postelnicu <
gheorghe.posteln...@gmail.com> wrote:

> Good point. It is a pre-compiled Spark version. Based on the text on the
> downloads page, the answer to your question is no, so I will download the
> sources and recompile.
>
> Thanks!
>
> On Tue, Sep 8, 2015 at 5:17 AM, Koert Kuipers  wrote:
>
>> is /opt/spark-1.4.1-bin-hadoop2.6 a spark version compiled with scala
>> 2.11?
>>
>> On Mon, Sep 7, 2015 at 5:29 PM, Gheorghe Postelnicu <
>> gheorghe.posteln...@gmail.com> wrote:
>>
>>> sbt assembly; $SPARK_HOME/bin/spark-submit --class main.scala.TestMain
>>> --master "local[4]" target/scala-2.11/bof-assembly-0.1-SNAPSHOT.jar
>>>
>>> using Spark:
>>>
>>> /opt/spark-1.4.1-bin-hadoop2.6
>>>
>>> On Mon, Sep 7, 2015 at 10:20 PM, Jonathan Coveney 
>>> wrote:
>>>
 How are you building and running it?


 El lunes, 7 de septiembre de 2015, Gheorghe Postelnicu <
 gheorghe.posteln...@gmail.com> escribió:

> Interesting idea. Tried that, didn't work. Here is my new SBT file:
>
> name := """testMain"""
>
> scalaVersion := "2.11.6"
>
> libraryDependencies ++= Seq(
>   "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
>   "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
>   "org.scala-lang" % "scala-reflect" % "2.11.6"
> )
>
>
> On Mon, Sep 7, 2015 at 9:55 PM, Jonathan Coveney 
> wrote:
>
>> Try adding the following to your build.sbt
>>
>> libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.6"
>>
>>
>> I believe that Spark shades the Scala library, and it looks like you need
>> this library in an unshaded way.
>>
>>
>> 2015-09-07 16:48 GMT-04:00 Gheorghe Postelnicu <
>> gheorghe.posteln...@gmail.com>:
>>
>>> Hi,
>>>
>>> The following code fails when compiled from SBT:
>>>
>>> package main.scala
>>>
>>> import org.apache.spark.SparkContext
>>> import org.apache.spark.sql.SQLContext
>>>
>>> object TestMain {
>>>   def main(args: Array[String]): Unit = {
>>>     implicit val sparkContext = new SparkContext()
>>>     val sqlContext = new SQLContext(sparkContext)
>>>     import sqlContext.implicits._
>>>     sparkContext.parallelize(1 to 10)
>>>       .map(i => (i, i.toString))
>>>       .toDF("intCol", "strCol")
>>>   }
>>> }
>>>
>>> with the following error:
>>>
>>> 15/09/07 21:39:21 INFO BlockManagerMaster: Registered BlockManager
>>> Exception in thread "main" java.lang.NoSuchMethodError:
>>> scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
>>> at main.scala.Bof$.main(Bof.scala:14)
>>> at main.scala.Bof.main(Bof.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:497)
>>> at
>>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
>>> at
>>> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> 15/09/07 21:39:22 INFO SparkContext: Invoking stop() from shutdown
>>> hook
>>>
>>> whereas the code above works in a spark shell.
>>>
>>> The code is compiled using Scala 2.11.6 and precompiled Spark 1.4.1
>>>
>>> Any suggestion on how to fix this would be much appreciated.
>>>
>>> Best,
>>> Gheorghe
>>>
>>>
>>
>
>>>
>>
>


Re: Spark 1.4 RDD to DF fails with toDF()

2015-09-07 Thread Jonathan Coveney
Try adding the following to your build.sbt

libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.6"


I believe that Spark shades the Scala library, and it looks like you need
this library in an unshaded way.
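
As a side note on why the versions must line up: %% makes sbt append the
project's Scala binary version to the artifact name, so with
scalaVersion := "2.11.6" the first line below resolves to the second, and the
resulting jar then has to run on a 2.11 build of Spark:

"org.apache.spark" %% "spark-core" % "1.4.1"
"org.apache.spark" % "spark-core_2.11" % "1.4.1"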


2015-09-07 16:48 GMT-04:00 Gheorghe Postelnicu <
gheorghe.posteln...@gmail.com>:

> Hi,
>
> The following code fails when compiled from SBT:
>
> package main.scala
>
> import org.apache.spark.SparkContext
> import org.apache.spark.sql.SQLContext
>
> object TestMain {
>   def main(args: Array[String]): Unit = {
>     implicit val sparkContext = new SparkContext()
>     val sqlContext = new SQLContext(sparkContext)
>     import sqlContext.implicits._
>     sparkContext.parallelize(1 to 10)
>       .map(i => (i, i.toString))
>       .toDF("intCol", "strCol")
>   }
> }
>
> with the following error:
>
> 15/09/07 21:39:21 INFO BlockManagerMaster: Registered BlockManager
> Exception in thread "main" java.lang.NoSuchMethodError:
> scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
> at main.scala.Bof$.main(Bof.scala:14)
> at main.scala.Bof.main(Bof.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 15/09/07 21:39:22 INFO SparkContext: Invoking stop() from shutdown hook
>
> whereas the code above works in a spark shell.
>
> The code is compiled using Scala 2.11.6 and precompiled Spark 1.4.1
>
> Any suggestion on how to fix this would be much appreciated.
>
> Best,
> Gheorghe
>
>


Re: Spark 1.4 RDD to DF fails with toDF()

2015-09-07 Thread Jonathan Coveney
How are you building and running it?

El lunes, 7 de septiembre de 2015, Gheorghe Postelnicu <
gheorghe.posteln...@gmail.com> escribió:

> Interesting idea. Tried that, didn't work. Here is my new SBT file:
>
> name := """testMain"""
>
> scalaVersion := "2.11.6"
>
> libraryDependencies ++= Seq(
>   "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
>   "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
>   "org.scala-lang" % "scala-reflect" % "2.11.6"
> )
>
>
> On Mon, Sep 7, 2015 at 9:55 PM, Jonathan Coveney  > wrote:
>
>> Try adding the following to your build.sbt
>>
>> libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.6"
>>
>>
>> I believe that Spark shades the Scala library, and it looks like you need
>> this library in an unshaded way.
>>
>>
>> 2015-09-07 16:48 GMT-04:00 Gheorghe Postelnicu <
>> gheorghe.posteln...@gmail.com
>> >:
>>
>>> Hi,
>>>
>>> The following code fails when compiled from SBT:
>>>
>>> package main.scala
>>>
>>> import org.apache.spark.SparkContext
>>> import org.apache.spark.sql.SQLContext
>>>
>>> object TestMain {
>>>   def main(args: Array[String]): Unit = {
>>>     implicit val sparkContext = new SparkContext()
>>>     val sqlContext = new SQLContext(sparkContext)
>>>     import sqlContext.implicits._
>>>     sparkContext.parallelize(1 to 10)
>>>       .map(i => (i, i.toString))
>>>       .toDF("intCol", "strCol")
>>>   }
>>> }
>>>
>>> with the following error:
>>>
>>> 15/09/07 21:39:21 INFO BlockManagerMaster: Registered BlockManager
>>> Exception in thread "main" java.lang.NoSuchMethodError:
>>> scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
>>> at main.scala.Bof$.main(Bof.scala:14)
>>> at main.scala.Bof.main(Bof.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:497)
>>> at
>>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
>>> at
>>> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> 15/09/07 21:39:22 INFO SparkContext: Invoking stop() from shutdown hook
>>>
>>> whereas the code above works in a spark shell.
>>>
>>> The code is compiled using Scala 2.11.6 and precompiled Spark 1.4.1
>>>
>>> Any suggestion on how to fix this would be much appreciated.
>>>
>>> Best,
>>> Gheorghe
>>>
>>>
>>
>


Re: Spark 1.4 RDD to DF fails with toDF()

2015-09-07 Thread Gheorghe Postelnicu
sbt assembly; $SPARK_HOME/bin/spark-submit --class main.scala.TestMain
--master "local[4]" target/scala-2.11/bof-assembly-0.1-SNAPSHOT.jar

using Spark:

/opt/spark-1.4.1-bin-hadoop2.6
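
Note the target/scala-2.11/ path in the spark-submit command above: sbt
writes assemblies under target/scala-<binary version>, so the jar was built
for Scala 2.11, while the prebuilt spark-1.4.1-bin-hadoop2.6 distribution
ships with Scala 2.10. That mismatch is the root cause discussed in this
thread.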

On Mon, Sep 7, 2015 at 10:20 PM, Jonathan Coveney 
wrote:

> How are you building and running it?
>
>
> El lunes, 7 de septiembre de 2015, Gheorghe Postelnicu <
> gheorghe.posteln...@gmail.com> escribió:
>
>> Interesting idea. Tried that, didn't work. Here is my new SBT file:
>>
>> name := """testMain"""
>>
>> scalaVersion := "2.11.6"
>>
>> libraryDependencies ++= Seq(
>>   "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
>>   "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
>>   "org.scala-lang" % "scala-reflect" % "2.11.6"
>> )
>>
>>
>> On Mon, Sep 7, 2015 at 9:55 PM, Jonathan Coveney 
>> wrote:
>>
>>> Try adding the following to your build.sbt
>>>
>>> libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.6"
>>>
>>>
>>> I believe that Spark shades the Scala library, and it looks like you need
>>> this library in an unshaded way.
>>>
>>>
>>> 2015-09-07 16:48 GMT-04:00 Gheorghe Postelnicu <
>>> gheorghe.posteln...@gmail.com>:
>>>
 Hi,

 The following code fails when compiled from SBT:

 package main.scala

 import org.apache.spark.SparkContext
 import org.apache.spark.sql.SQLContext

 object TestMain {
   def main(args: Array[String]): Unit = {
     implicit val sparkContext = new SparkContext()
     val sqlContext = new SQLContext(sparkContext)
     import sqlContext.implicits._
     sparkContext.parallelize(1 to 10)
       .map(i => (i, i.toString))
       .toDF("intCol", "strCol")
   }
 }

 with the following error:

 15/09/07 21:39:21 INFO BlockManagerMaster: Registered BlockManager
 Exception in thread "main" java.lang.NoSuchMethodError:
 scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
 at main.scala.Bof$.main(Bof.scala:14)
 at main.scala.Bof.main(Bof.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:497)
 at
 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
 at
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
 at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 15/09/07 21:39:22 INFO SparkContext: Invoking stop() from shutdown hook

 whereas the code above works in a spark shell.

 The code is compiled using Scala 2.11.6 and precompiled Spark 1.4.1

 Any suggestion on how to fix this would be much appreciated.

 Best,
 Gheorghe


>>>
>>


Re: Spark 1.4 RDD to DF fails with toDF()

2015-06-29 Thread Srikanth
My error was related to the Scala version. Upon further reading, I realized
that it takes some effort to get Spark working with Scala 2.11.
I've reverted to using 2.10 and moved past that error. Now I hit the issue
you mentioned. Waiting for 1.4.1.

Srikanth
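
For reference, reverting a build.sbt like the one quoted below to Scala 2.10
mostly means changing scalaVersion and any artifact with an explicit Scala
suffix. A sketch (versions as in the original file):

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.4.0",
  "com.databricks" % "spark-csv_2.10" % "1.0.3"  // was spark-csv_2.11
)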

On Fri, Jun 26, 2015 at 9:10 AM, Roberto Coluccio 
roberto.coluc...@gmail.com wrote:

 I got a similar issue. Might yours be related to this as well:
 https://issues.apache.org/jira/browse/SPARK-8368 ?

 On Fri, Jun 26, 2015 at 2:00 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 Are those provided Spark libraries compatible with Scala 2.11?

 Thanks
 Best Regards

 On Fri, Jun 26, 2015 at 4:48 PM, Srikanth srikanth...@gmail.com wrote:

 Thanks Akhil for checking this out. Here is my build.sbt.

 name := "Weblog Analysis"

 version := "1.0"

 scalaVersion := "2.11.5"

 javacOptions ++= Seq("-source", "1.7", "-target", "1.7")

 libraryDependencies ++= Seq(
   "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
   "org.apache.spark" %% "spark-sql" % "1.4.0",
   "org.apache.spark" %% "spark-streaming" % "1.4.0",
   "org.apache.spark" %% "spark-streaming-kafka" % "1.4.0",
   "org.apache.spark" %% "spark-mllib" % "1.4.0",
   "org.apache.commons" % "commons-lang3" % "3.0",
   "org.eclipse.jetty" % "jetty-client" % "8.1.14.v20131031",
   "org.scalatest" %% "scalatest" % "2.2.1" % "test",
   "com.databricks" % "spark-csv_2.11" % "1.0.3",
   "joda-time" % "joda-time" % "2.8.1",
   "org.joda" % "joda-convert" % "1.7"
 )

 resolvers ++= Seq(
   "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
   "Sonatype public" at "http://oss.sonatype.org/content/groups/public/",
   "Sonatype" at "http://nexus.scala-tools.org/content/repositories/public",
   "Scala Tools" at "http://scala-tools.org/repo-snapshots/",
   "Typesafe" at "http://repo.typesafe.com/typesafe/releases/",
   "Akka" at "http://akka.io/repository/",
   "JBoss" at "http://repository.jboss.org/nexus/content/groups/public/",
   "GuiceyFruit" at "http://guiceyfruit.googlecode.com/svn/repo/releases/"
 )

 On Fri, Jun 26, 2015 at 4:13 AM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 It's a Scala version conflict; can you paste your build.sbt file?

 Thanks
 Best Regards

 On Fri, Jun 26, 2015 at 7:05 AM, stati srikanth...@gmail.com wrote:

 Hello,

 When I run a Spark job with spark-submit, it fails with the exception below
 for this line of code:

   val webLogDF = webLogRec.toDF().select("ip", "date", "name")

 I had a similar issue running from spark-shell, then realized that I
 needed sqlContext.implicits._
 Now my code has the following imports:

   import org.apache.spark._
   import org.apache.spark.sql._
   import org.apache.spark.sql.functions._
   val sqlContext = new SQLContext(sc)
   import sqlContext.implicits._

 Code works fine from spark-shell REPL. It also runs fine when run in
 local
 mode from Eclipse. I get this
 error only when I submit to cluster using spark-submit.
 bin/spark-submit /local/weblog-analysis_2.11-1.0.jar --class
 WebLogAnalysis
 --master spark://machu:7077

 I'm testing with spark 1.4. My code was built using scala 2.11 and
 spark+sparkSQL 1.4.0 as dependency in build.sbt

 Exception in thread "main" java.lang.NoSuchMethodError:

 scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
 at WebLogAnalysis$.readWebLogFiles(WebLogAnalysis.scala:38)
 at WebLogAnalysis$.main(WebLogAnalysis.scala:62)
 at WebLogAnalysis.main(WebLogAnalysis.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at

 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at

 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at

 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
 at
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
 at
 org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
 at
 org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 I can provide more code or log if that will help. Let me know.

 Srikanth











Re: Spark 1.4 RDD to DF fails with toDF()

2015-06-26 Thread Akhil Das
It's a Scala version conflict; can you paste your build.sbt file?

Thanks
Best Regards

On Fri, Jun 26, 2015 at 7:05 AM, stati srikanth...@gmail.com wrote:

 Hello,

 When I run a Spark job with spark-submit, it fails with the exception below
 for this line of code:

   val webLogDF = webLogRec.toDF().select("ip", "date", "name")

 I had a similar issue running from spark-shell, then realized that I needed
 sqlContext.implicits._
 Now my code has the following imports:

   import org.apache.spark._
   import org.apache.spark.sql._
   import org.apache.spark.sql.functions._
   val sqlContext = new SQLContext(sc)
   import sqlContext.implicits._

 Code works fine from spark-shell REPL. It also runs fine when run in local
 mode from Eclipse. I get this
 error only when I submit to cluster using spark-submit.
 bin/spark-submit /local/weblog-analysis_2.11-1.0.jar --class WebLogAnalysis
 --master spark://machu:7077

 I'm testing with spark 1.4. My code was built using scala 2.11 and
 spark+sparkSQL 1.4.0 as dependency in build.sbt

 Exception in thread "main" java.lang.NoSuchMethodError:

 scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
 at WebLogAnalysis$.readWebLogFiles(WebLogAnalysis.scala:38)
 at WebLogAnalysis$.main(WebLogAnalysis.scala:62)
 at WebLogAnalysis.main(WebLogAnalysis.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at

 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at

 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at

 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
 at
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
 at
 org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 I can provide more code or log if that will help. Let me know.

 Srikanth







Re: Spark 1.4 RDD to DF fails with toDF()

2015-06-26 Thread Akhil Das
Are those provided Spark libraries compatible with Scala 2.11?

Thanks
Best Regards

On Fri, Jun 26, 2015 at 4:48 PM, Srikanth srikanth...@gmail.com wrote:

 Thanks Akhil for checking this out. Here is my build.sbt.

 name := "Weblog Analysis"

 version := "1.0"

 scalaVersion := "2.11.5"

 javacOptions ++= Seq("-source", "1.7", "-target", "1.7")

 libraryDependencies ++= Seq(
   "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
   "org.apache.spark" %% "spark-sql" % "1.4.0",
   "org.apache.spark" %% "spark-streaming" % "1.4.0",
   "org.apache.spark" %% "spark-streaming-kafka" % "1.4.0",
   "org.apache.spark" %% "spark-mllib" % "1.4.0",
   "org.apache.commons" % "commons-lang3" % "3.0",
   "org.eclipse.jetty" % "jetty-client" % "8.1.14.v20131031",
   "org.scalatest" %% "scalatest" % "2.2.1" % "test",
   "com.databricks" % "spark-csv_2.11" % "1.0.3",
   "joda-time" % "joda-time" % "2.8.1",
   "org.joda" % "joda-convert" % "1.7"
 )

 resolvers ++= Seq(
   "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
   "Sonatype public" at "http://oss.sonatype.org/content/groups/public/",
   "Sonatype" at "http://nexus.scala-tools.org/content/repositories/public",
   "Scala Tools" at "http://scala-tools.org/repo-snapshots/",
   "Typesafe" at "http://repo.typesafe.com/typesafe/releases/",
   "Akka" at "http://akka.io/repository/",
   "JBoss" at "http://repository.jboss.org/nexus/content/groups/public/",
   "GuiceyFruit" at "http://guiceyfruit.googlecode.com/svn/repo/releases/"
 )

 On Fri, Jun 26, 2015 at 4:13 AM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 It's a Scala version conflict; can you paste your build.sbt file?

 Thanks
 Best Regards

 On Fri, Jun 26, 2015 at 7:05 AM, stati srikanth...@gmail.com wrote:

 Hello,

 When I run a Spark job with spark-submit, it fails with the exception below
 for this line of code:

   val webLogDF = webLogRec.toDF().select("ip", "date", "name")

 I had a similar issue running from spark-shell, then realized that I needed
 sqlContext.implicits._
 Now my code has the following imports:

   import org.apache.spark._
   import org.apache.spark.sql._
   import org.apache.spark.sql.functions._
   val sqlContext = new SQLContext(sc)
   import sqlContext.implicits._

 Code works fine from spark-shell REPL. It also runs fine when run in
 local
 mode from Eclipse. I get this
 error only when I submit to cluster using spark-submit.
 bin/spark-submit /local/weblog-analysis_2.11-1.0.jar --class
 WebLogAnalysis
 --master spark://machu:7077

 I'm testing with spark 1.4. My code was built using scala 2.11 and
 spark+sparkSQL 1.4.0 as dependency in build.sbt

 Exception in thread "main" java.lang.NoSuchMethodError:

 scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
 at WebLogAnalysis$.readWebLogFiles(WebLogAnalysis.scala:38)
 at WebLogAnalysis$.main(WebLogAnalysis.scala:62)
 at WebLogAnalysis.main(WebLogAnalysis.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at

 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at

 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at

 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
 at
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
 at
 org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
 at
 org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 I can provide more code or log if that will help. Let me know.

 Srikanth









Re: Spark 1.4 RDD to DF fails with toDF()

2015-06-26 Thread Srikanth
Thanks Akhil for checking this out. Here is my build.sbt.

name := "Weblog Analysis"

version := "1.0"

scalaVersion := "2.11.5"

javacOptions ++= Seq("-source", "1.7", "-target", "1.7")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.4.0",
  "org.apache.spark" %% "spark-streaming" % "1.4.0",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.4.0",
  "org.apache.spark" %% "spark-mllib" % "1.4.0",
  "org.apache.commons" % "commons-lang3" % "3.0",
  "org.eclipse.jetty" % "jetty-client" % "8.1.14.v20131031",
  "org.scalatest" %% "scalatest" % "2.2.1" % "test",
  "com.databricks" % "spark-csv_2.11" % "1.0.3",
  "joda-time" % "joda-time" % "2.8.1",
  "org.joda" % "joda-convert" % "1.7"
)

resolvers ++= Seq(
  "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
  "Sonatype public" at "http://oss.sonatype.org/content/groups/public/",
  "Sonatype" at "http://nexus.scala-tools.org/content/repositories/public",
  "Scala Tools" at "http://scala-tools.org/repo-snapshots/",
  "Typesafe" at "http://repo.typesafe.com/typesafe/releases/",
  "Akka" at "http://akka.io/repository/",
  "JBoss" at "http://repository.jboss.org/nexus/content/groups/public/",
  "GuiceyFruit" at "http://guiceyfruit.googlecode.com/svn/repo/releases/"
)

On Fri, Jun 26, 2015 at 4:13 AM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 It's a Scala version conflict; can you paste your build.sbt file?

 Thanks
 Best Regards

 On Fri, Jun 26, 2015 at 7:05 AM, stati srikanth...@gmail.com wrote:

 Hello,

 When I run a Spark job with spark-submit, it fails with the exception below
 for this line of code:

   val webLogDF = webLogRec.toDF().select("ip", "date", "name")

 I had a similar issue running from spark-shell, then realized that I needed
 sqlContext.implicits._
 Now my code has the following imports:

   import org.apache.spark._
   import org.apache.spark.sql._
   import org.apache.spark.sql.functions._
   val sqlContext = new SQLContext(sc)
   import sqlContext.implicits._

 Code works fine from spark-shell REPL. It also runs fine when run in local
 mode from Eclipse. I get this
 error only when I submit to cluster using spark-submit.
 bin/spark-submit /local/weblog-analysis_2.11-1.0.jar --class
 WebLogAnalysis
 --master spark://machu:7077

 I'm testing with spark 1.4. My code was built using scala 2.11 and
 spark+sparkSQL 1.4.0 as dependency in build.sbt

 Exception in thread "main" java.lang.NoSuchMethodError:

 scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
 at WebLogAnalysis$.readWebLogFiles(WebLogAnalysis.scala:38)
 at WebLogAnalysis$.main(WebLogAnalysis.scala:62)
 at WebLogAnalysis.main(WebLogAnalysis.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at

 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at

 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at

 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
 at
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
 at
 org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
 at
 org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 I can provide more code or log if that will help. Let me know.

 Srikanth








Re: Spark 1.4 RDD to DF fails with toDF()

2015-06-26 Thread Roberto Coluccio
I got a similar issue. Might yours be related to this as well:
https://issues.apache.org/jira/browse/SPARK-8368 ?

On Fri, Jun 26, 2015 at 2:00 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 Are those provided Spark libraries compatible with Scala 2.11?

 Thanks
 Best Regards

 On Fri, Jun 26, 2015 at 4:48 PM, Srikanth srikanth...@gmail.com wrote:

 Thanks Akhil for checking this out. Here is my build.sbt.

 name := "Weblog Analysis"

 version := "1.0"

 scalaVersion := "2.11.5"

 javacOptions ++= Seq("-source", "1.7", "-target", "1.7")

 libraryDependencies ++= Seq(
   "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
   "org.apache.spark" %% "spark-sql" % "1.4.0",
   "org.apache.spark" %% "spark-streaming" % "1.4.0",
   "org.apache.spark" %% "spark-streaming-kafka" % "1.4.0",
   "org.apache.spark" %% "spark-mllib" % "1.4.0",
   "org.apache.commons" % "commons-lang3" % "3.0",
   "org.eclipse.jetty" % "jetty-client" % "8.1.14.v20131031",
   "org.scalatest" %% "scalatest" % "2.2.1" % "test",
   "com.databricks" % "spark-csv_2.11" % "1.0.3",
   "joda-time" % "joda-time" % "2.8.1",
   "org.joda" % "joda-convert" % "1.7"
 )

 resolvers ++= Seq(
   "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
   "Sonatype public" at "http://oss.sonatype.org/content/groups/public/",
   "Sonatype" at "http://nexus.scala-tools.org/content/repositories/public",
   "Scala Tools" at "http://scala-tools.org/repo-snapshots/",
   "Typesafe" at "http://repo.typesafe.com/typesafe/releases/",
   "Akka" at "http://akka.io/repository/",
   "JBoss" at "http://repository.jboss.org/nexus/content/groups/public/",
   "GuiceyFruit" at "http://guiceyfruit.googlecode.com/svn/repo/releases/"
 )

 On Fri, Jun 26, 2015 at 4:13 AM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 It's a Scala version conflict; can you paste your build.sbt file?

 Thanks
 Best Regards

 On Fri, Jun 26, 2015 at 7:05 AM, stati srikanth...@gmail.com wrote:

 Hello,

 When I run a Spark job with spark-submit, it fails with the exception below
 for this line of code:

   val webLogDF = webLogRec.toDF().select("ip", "date", "name")

 I had a similar issue running from spark-shell, then realized that I needed
 sqlContext.implicits._
 Now my code has the following imports:

   import org.apache.spark._
   import org.apache.spark.sql._
   import org.apache.spark.sql.functions._
   val sqlContext = new SQLContext(sc)
   import sqlContext.implicits._

 Code works fine from spark-shell REPL. It also runs fine when run in
 local
 mode from Eclipse. I get this
 error only when I submit to cluster using spark-submit.
 bin/spark-submit /local/weblog-analysis_2.11-1.0.jar --class
 WebLogAnalysis
 --master spark://machu:7077

 I'm testing with spark 1.4. My code was built using scala 2.11 and
 spark+sparkSQL 1.4.0 as dependency in build.sbt

 Exception in thread "main" java.lang.NoSuchMethodError:

 scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
 at WebLogAnalysis$.readWebLogFiles(WebLogAnalysis.scala:38)
 at WebLogAnalysis$.main(WebLogAnalysis.scala:62)
 at WebLogAnalysis.main(WebLogAnalysis.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at

 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at

 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at

 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
 at
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
 at
 org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
 at
 org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 I can provide more code or log if that will help. Let me know.

 Srikanth


