Hi Moon

Thanks! The fixes proposed in that thread resolved my problem.
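
For anyone else who hits this, one quick way to confirm the clash is to check which jar the Jackson classes are actually loaded from, and which version they report (a minimal sketch, run in a %spark paragraph; the jar path it prints is specific to your install):

%spark

// Where was Jackson's ObjectMapper loaded from? If the Zeppelin
// interpreter and Spark disagree on Jackson, the unexpected jar
// shows up here.
println(classOf[com.fasterxml.jackson.databind.ObjectMapper]
  .getProtectionDomain.getCodeSource.getLocation)

// The version Jackson itself reports; Spark 1.6.0 ships Jackson 2.4.x
// (if I read its poms right), so anything else here is a likely culprit.
println(com.fasterxml.jackson.databind.cfg.PackageVersion.VERSION)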

That said, if this is happening to everyone (as I assume it is), maybe it should be addressed a bit more systematically?

Thanks again!

Enzo
e...@smartinsightsfromdata.com



> On 1 Mar 2016, at 19:13, moon soo Lee <m...@apache.org> wrote:
> 
> Hi Enzo,
> 
> It happens when you have multiple versions of the Jackson library in your 
> classpath. Please check the following email thread:
> http://apache-zeppelin-users-incubating-mailing-list.75479.x6.nabble.com/com-fasterxml-jackson-databind-JsonMappingException-td1607.html
> 
> Thanks,
> moon
> 
> On Tue, Mar 1, 2016 at 8:46 AM enzo <e...@smartinsightsfromdata.com> wrote:
> I get the following error in a variety of circumstances.
> 
> I downloaded Zeppelin a couple of days ago. I use Spark 1.6.0.
> 
> 
> For example:
> 
> %spark
> 
> val raw = sc.textFile("/tmp/github.json")  // reading a 25 MB file from /tmp
> 
> This gives the following error. Help, please!
> 
> 
> com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)
>  at [Source: {"id":"0","name":"textFile"}; line: 1, column: 1]
>       at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
>       at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:843)
>       at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.addBeanProps(BeanDeserializerFactory.java:533)
>       at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:220)
>       at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:143)
>       at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:409)
>       at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:358)
>       at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:265)
>       at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:245)
>       at com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:143)
>       at com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:439)
>       at com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3666)
>       at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3558)
>       at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2578)
>       at org.apache.spark.rdd.RDDOperationScope$.fromJson(RDDOperationScope.scala:85)
>       at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
>       at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
>       at scala.Option.map(Option.scala:145)
>       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:136)
>       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
>       at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
>       at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:1011)
>       at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:832)
>       at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:830)
>       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
>       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
>       at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
>       at org.apache.spark.SparkContext.textFile(SparkContext.scala:830)
>       at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
>       at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
>       at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
>       at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:47)
>       at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:49)
>       at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:51)
>       at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:53)
>       at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:55)
>       at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:57)
>       at $iwC$$iwC$$iwC$$iwC.<init>(<console>:59)
>       at $iwC$$iwC$$iwC.<init>(<console>:61)
>       at $iwC$$iwC.<init>(<console>:63)
>       at $iwC.<init>(<console>:65)
>       at <init>(<console>:67)
>       at .<init>(<console>:71)
>       at .<clinit>(<console>)
>       at .<init>(<console>:7)
>       at .<clinit>(<console>)
>       at $print(<console>)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>       at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>       at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>       at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>       at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>       at org.apache.zeppelin.spark.SparkInterpreter.interpretInput(SparkInterpreter.java:780)
>       at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:744)
>       at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:737)
>       at org.apache.zeppelin.interpreter.ClassloaderInterpreter.interpret(ClassloaderInterpreter.java:57)
>       at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
>       at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:331)
>       at org.apache.zeppelin.scheduler.Job.run(Job.java:171)
>       at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>       at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>       at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>       at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
> 
> 
> 
> 
> Enzo
> e...@smartinsightsfromdata.com
> 
> 
> 
