Re: Is it possible to use json4s 3.2.11 with Spark 1.3.0?

2015-03-29 Thread Alexey Zinoviev
I figured out that Logging is a DeveloperApi and it should not be used outside 
Spark code, so everything is fine now. Thanks again, Marcelo.
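
For reference, a minimal sketch of the workaround (assuming slf4j-api is on the
classpath, which Spark already provides; the logger name and message are just
illustrative): log through SLF4J directly instead of extending the DeveloperApi
trait.

import org.slf4j.LoggerFactory

object App1 {
  // Plain SLF4J logger; avoids mixing in org.apache.spark.Logging, whose
  // scala.Function0 signatures triggered the loader-constraint error under
  // userClassPathFirst.
  private val log = LoggerFactory.getLogger("App1")

  def main(args: Array[String]): Unit = {
    log.info("App1 starting")
  }
}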

 On 24 Mar 2015, at 20:06, Marcelo Vanzin van...@cloudera.com wrote:
 
 From the exception it seems like your app is also repackaging Scala
 classes somehow. Can you double check that and remove the Scala
 classes from your app if they're there?
 
 On Mon, Mar 23, 2015 at 10:07 PM, Alexey Zinoviev
 alexey.zinov...@gmail.com wrote:
 Thanks Marcelo, these options solved the problem (I'm using 1.3.0), but it
 works only if I remove "extends Logging" from the object; with "extends
 Logging" it returns:
 
 Exception in thread main java.lang.LinkageError: loader constraint
 violation in interface itable initialization: when resolving method
 App1$.logInfo(Lscala/Function0;Ljava/lang/Throwable;)V the class loader
 (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current
 class, App1$, and the class loader (instance of
 sun/misc/Launcher$AppClassLoader) for interface org/apache/spark/Logging
 have different Class objects for the type scala/Function0 used in the
 signature
at App1.main(App1.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at
 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at
 org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 
 Do you have any idea what's wrong with Logging?
 
 PS: I'm running it with spark-1.3.0/bin/spark-submit --class App1 --conf
 spark.driver.userClassPathFirst=true --conf
 spark.executor.userClassPathFirst=true
 $HOME/projects/sparkapp/target/scala-2.10/sparkapp-assembly-1.0.jar
 
 Thanks,
 Alexey
 
 
 On Tue, Mar 24, 2015 at 5:03 AM, Marcelo Vanzin van...@cloudera.com wrote:
 
 You could build a fat jar for your application containing both your
 code and the json4s library, and then run Spark with these two
 options:
 
  spark.driver.userClassPathFirst=true
  spark.executor.userClassPathFirst=true
 
 Both only work in 1.3. (1.2 has spark.files.userClassPathFirst, but
 that only works for executors.)
 
 
 On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev
 alexey.zinov...@gmail.com wrote:
 Spark has a dependency on json4s 3.2.10, but this version has several
 bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency
 to build.sbt and everything compiled fine. But when I spark-submit my JAR,
 it still picks up 3.2.10.
 
 
 build.sbt
 
 import sbt.Keys._
 
 name := "sparkapp"
 
 version := "1.0"
 
 scalaVersion := "2.10.4"
 
 libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
 
 libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"
 
 
 plugins.sbt
 
 logLevel := Level.Warn
 
 resolvers += Resolver.url("artifactory",
   url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
 
 addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
 
 
 App1.scala
 
 import org.apache.spark.rdd.RDD
 import org.apache.spark.{Logging, SparkConf, SparkContext}
 import org.apache.spark.SparkContext._
 
 object App1 extends Logging {
   def main(args: Array[String]) = {
     val conf = new SparkConf().setAppName("App1")
     val sc = new SparkContext(conf)
     println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
   }
 }
 
 
 
 sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4
 
 Is it possible to force 3.2.11 version usage?
 
 Thanks,
 Alexey
 
 
 
 --
 Marcelo
 
 
 
 
 
 -- 
 Marcelo





Re: Is it possible to use json4s 3.2.11 with Spark 1.3.0?

2015-03-24 Thread Marcelo Vanzin
From the exception it seems like your app is also repackaging Scala
classes somehow. Can you double check that and remove the Scala
classes from your app if they're there?
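
If the extra Scala classes are coming from the assembly itself, one possible
fix (a sketch against sbt-assembly 0.13.x, untested here; the keys come from
the plugin's auto-import) is to keep the Scala library out of the fat jar by
adding this to build.sbt:

// Do not bundle the Scala library; spark-submit provides Scala and Spark at
// runtime, so only the application classes and json4s end up in the assembly.
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)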

On Mon, Mar 23, 2015 at 10:07 PM, Alexey Zinoviev
alexey.zinov...@gmail.com wrote:
 Thanks Marcelo, these options solved the problem (I'm using 1.3.0), but it
 works only if I remove "extends Logging" from the object; with "extends
 Logging" it returns:

 Exception in thread main java.lang.LinkageError: loader constraint
 violation in interface itable initialization: when resolving method
 App1$.logInfo(Lscala/Function0;Ljava/lang/Throwable;)V the class loader
 (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current
 class, App1$, and the class loader (instance of
 sun/misc/Launcher$AppClassLoader) for interface org/apache/spark/Logging
 have different Class objects for the type scala/Function0 used in the
 signature
 at App1.main(App1.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:497)
 at
 org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
 at
 org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
 at
 org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 Do you have any idea what's wrong with Logging?

 PS: I'm running it with spark-1.3.0/bin/spark-submit --class App1 --conf
 spark.driver.userClassPathFirst=true --conf
 spark.executor.userClassPathFirst=true
 $HOME/projects/sparkapp/target/scala-2.10/sparkapp-assembly-1.0.jar

 Thanks,
 Alexey


 On Tue, Mar 24, 2015 at 5:03 AM, Marcelo Vanzin van...@cloudera.com wrote:

 You could build a fat jar for your application containing both your
 code and the json4s library, and then run Spark with these two
 options:

   spark.driver.userClassPathFirst=true
   spark.executor.userClassPathFirst=true

 Both only work in 1.3. (1.2 has spark.files.userClassPathFirst, but
 that only works for executors.)


 On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev
 alexey.zinov...@gmail.com wrote:
  Spark has a dependency on json4s 3.2.10, but this version has several
  bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency
  to build.sbt and everything compiled fine. But when I spark-submit my JAR,
  it still picks up 3.2.10.
 
 
  build.sbt
 
  import sbt.Keys._
  
  name := "sparkapp"
  
  version := "1.0"
  
  scalaVersion := "2.10.4"
  
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
  
  libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"
 
 
  plugins.sbt
 
  logLevel := Level.Warn
  
  resolvers += Resolver.url("artifactory",
    url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
  
  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
 
 
  App1.scala
 
  import org.apache.spark.rdd.RDD
  import org.apache.spark.{Logging, SparkConf, SparkContext}
  import org.apache.spark.SparkContext._
  
  object App1 extends Logging {
    def main(args: Array[String]) = {
      val conf = new SparkConf().setAppName("App1")
      val sc = new SparkContext(conf)
      println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
    }
  }
 
 
 
  sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4
 
  Is it possible to force 3.2.11 version usage?
 
  Thanks,
  Alexey



 --
 Marcelo





-- 
Marcelo




Re: Is it possible to use json4s 3.2.11 with Spark 1.3.0?

2015-03-23 Thread Alexey Zinoviev
Thanks Ted, I'll try; I hope there are no transitive dependencies on 3.2.10.
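
In case something does drag 3.2.10 in transitively, a small build.sbt sketch I
may try (assuming sbt 0.13's dependencyOverrides; this only pins the version in
my own assembly and does not change the json4s bundled inside Spark's own jars):

// Pin json4s so no transitive dependency pulls 3.2.10 back into the fat jar.
dependencyOverrides += "org.json4s" %% "json4s-native"  % "3.2.11"
dependencyOverrides += "org.json4s" %% "json4s-jackson" % "3.2.11"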

On Tue, Mar 24, 2015 at 4:21 AM, Ted Yu yuzhih...@gmail.com wrote:

 Looking at core/pom.xml:
 <dependency>
   <groupId>org.json4s</groupId>
   <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
   <version>3.2.10</version>
 </dependency>
 
 The version is hard-coded.

 You can rebuild Spark 1.3.0 with json4s 3.2.11

 Cheers

 On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev 
 alexey.zinov...@gmail.com wrote:

 Spark has a dependency on json4s 3.2.10, but this version has several
 bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to
 build.sbt and everything compiled fine. But when I spark-submit my JAR, it
 still picks up 3.2.10.


 build.sbt

 import sbt.Keys._
 
 name := "sparkapp"
 
 version := "1.0"
 
 scalaVersion := "2.10.4"
 
 libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
 
 libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"


 plugins.sbt

 logLevel := Level.Warn
 
 resolvers += Resolver.url("artifactory", url(
   "http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"
 ))(Resolver.ivyStylePatterns)
 
 addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")


 App1.scala

 import org.apache.spark.rdd.RDD
 import org.apache.spark.{Logging, SparkConf, SparkContext}
 import org.apache.spark.SparkContext._
 
 object App1 extends Logging {
   def main(args: Array[String]) = {
     val conf = new SparkConf().setAppName("App1")
     val sc = new SparkContext(conf)
     println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
   }
 }



 sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4

 Is it possible to force 3.2.11 version usage?

 Thanks,
 Alexey





Re: Is it possible to use json4s 3.2.11 with Spark 1.3.0?

2015-03-23 Thread Alexey Zinoviev
Thanks Marcelo, these options solved the problem (I'm using 1.3.0), but it
works only if I remove "extends Logging" from the object; with "extends
Logging" it returns:

Exception in thread main java.lang.LinkageError: loader constraint
violation in interface itable initialization: when resolving method
App1$.logInfo(Lscala/Function0;Ljava/lang/Throwable;)V the class loader
(instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current
class, App1$, and the class loader (instance of
sun/misc/Launcher$AppClassLoader) for interface org/apache/spark/Logging
have different Class objects for the type scala/Function0 used in the
signature
at App1.main(App1.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Do you have any idea what's wrong with Logging?

PS: I'm running it with spark-1.3.0/bin/spark-submit --class App1 --conf
spark.driver.userClassPathFirst=true --conf
spark.executor.userClassPathFirst=true
$HOME/projects/sparkapp/target/scala-2.10/sparkapp-assembly-1.0.jar

Thanks,
Alexey


On Tue, Mar 24, 2015 at 5:03 AM, Marcelo Vanzin van...@cloudera.com wrote:

 You could build a fat jar for your application containing both your
 code and the json4s library, and then run Spark with these two
 options:

   spark.driver.userClassPathFirst=true
   spark.executor.userClassPathFirst=true

 Both only work in 1.3. (1.2 has spark.files.userClassPathFirst, but
 that only works for executors.)


 On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev
 alexey.zinov...@gmail.com wrote:
  Spark has a dependency on json4s 3.2.10, but this version has several
  bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency
  to build.sbt and everything compiled fine. But when I spark-submit my JAR,
  it still picks up 3.2.10.
 
 
  build.sbt
 
  import sbt.Keys._
  
  name := "sparkapp"
  
  version := "1.0"
  
  scalaVersion := "2.10.4"
  
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
  
  libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"
 
 
  plugins.sbt
 
  logLevel := Level.Warn
  
  resolvers += Resolver.url("artifactory",
    url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
  
  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
 
 
  App1.scala
 
  import org.apache.spark.rdd.RDD
  import org.apache.spark.{Logging, SparkConf, SparkContext}
  import org.apache.spark.SparkContext._
  
  object App1 extends Logging {
    def main(args: Array[String]) = {
      val conf = new SparkConf().setAppName("App1")
      val sc = new SparkContext(conf)
      println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
    }
  }
 
 
 
  sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4
 
  Is it possible to force 3.2.11 version usage?
 
  Thanks,
  Alexey



 --
 Marcelo



Re: Is it possible to use json4s 3.2.11 with Spark 1.3.0?

2015-03-23 Thread Ted Yu
Looking at core/pom.xml:
<dependency>
  <groupId>org.json4s</groupId>
  <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
  <version>3.2.10</version>
</dependency>

The version is hard-coded.

You can rebuild Spark 1.3.0 with json4s 3.2.11

Cheers

On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev alexey.zinov...@gmail.com
wrote:

 Spark has a dependency on json4s 3.2.10, but this version has several bugs
 and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to
 build.sbt and everything compiled fine. But when I spark-submit my JAR, it
 still picks up 3.2.10.


 build.sbt

 import sbt.Keys._
 
 name := "sparkapp"
 
 version := "1.0"
 
 scalaVersion := "2.10.4"
 
 libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
 
 libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"


 plugins.sbt

 logLevel := Level.Warn
 
 resolvers += Resolver.url("artifactory", url(
   "http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"
 ))(Resolver.ivyStylePatterns)
 
 addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")


 App1.scala

 import org.apache.spark.rdd.RDD
 import org.apache.spark.{Logging, SparkConf, SparkContext}
 import org.apache.spark.SparkContext._
 
 object App1 extends Logging {
   def main(args: Array[String]) = {
     val conf = new SparkConf().setAppName("App1")
     val sc = new SparkContext(conf)
     println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
   }
 }



 sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4

 Is it possible to force 3.2.11 version usage?

 Thanks,
 Alexey



Re: Is it possible to use json4s 3.2.11 with Spark 1.3.0?

2015-03-23 Thread Marcelo Vanzin
You could build a fat jar for your application containing both your
code and the json4s library, and then run Spark with these two
options:

  spark.driver.userClassPathFirst=true
  spark.executor.userClassPathFirst=true

Both only work in 1.3. (1.2 has spark.files.userClassPathFirst, but
that only works for executors.)


On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev
alexey.zinov...@gmail.com wrote:
 Spark has a dependency on json4s 3.2.10, but this version has several
 bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to
 build.sbt and everything compiled fine. But when I spark-submit my JAR, it
 still picks up 3.2.10.


 build.sbt

 import sbt.Keys._
 
 name := "sparkapp"
 
 version := "1.0"
 
 scalaVersion := "2.10.4"
 
 libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
 
 libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"


 plugins.sbt

 logLevel := Level.Warn
 
 resolvers += Resolver.url("artifactory",
   url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
 
 addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")


 App1.scala

 import org.apache.spark.rdd.RDD
 import org.apache.spark.{Logging, SparkConf, SparkContext}
 import org.apache.spark.SparkContext._
 
 object App1 extends Logging {
   def main(args: Array[String]) = {
     val conf = new SparkConf().setAppName("App1")
     val sc = new SparkContext(conf)
     println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
   }
 }



 sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4

 Is it possible to force 3.2.11 version usage?

 Thanks,
 Alexey



-- 
Marcelo
