It would be helpful if you included the relevant configuration files from each installation (or noted that you are using the defaults), particularly any changes to class paths.
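
One quick check that may help narrow it down: ask the JVM, from inside a Zeppelin %spark paragraph (or spark-shell), which jar Jackson is actually being loaded from and which jackson-core/jackson-databind versions win on the interpreter's classpath. A sketch, assuming the Jackson classes are reachable:

    // Run in a %spark paragraph or spark-shell to see which Jackson wins.
    val jacksonSource = classOf[com.fasterxml.jackson.databind.ObjectMapper]
      .getProtectionDomain.getCodeSource                 // can be null for bootstrap classes
    println(jacksonSource)                               // jar that supplied ObjectMapper
    println(com.fasterxml.jackson.core.json.PackageVersion.VERSION)    // jackson-core version
    println(com.fasterxml.jackson.databind.cfg.PackageVersion.VERSION) // jackson-databind version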

I upgraded Zeppelin to 0.6.0 both at work and at home without any issue, so it is hard to say more without further details.
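
For what it's worth, Spark 1.6.x builds against Jackson 2.4.4 if I remember right, so if one of your own dependencies drags in a newer jackson-databind, pinning everything back to 2.4.4 usually clears exactly this error. A hedged build.sbt sketch, assuming an sbt build (adjust accordingly for Maven):

    // Hypothetical build.sbt fragment: force the Jackson version Spark 1.6.x expects.
    dependencyOverrides ++= Set(
      "com.fasterxml.jackson.core"   %  "jackson-core"         % "2.4.4",
      "com.fasterxml.jackson.core"   %  "jackson-databind"     % "2.4.4",
      "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.4"
    )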

—
Pedro Rodriguez
PhD Student in Large-Scale Machine Learning | CU Boulder
Systems Oriented Data Scientist
UC Berkeley AMPLab Alumni

pedrorodriguez.io | 909-353-4423
github.com/EntilZha | LinkedIn

On July 9, 2016 at 8:25:30 AM, Mich Talebzadeh (mich.talebza...@gmail.com) 
wrote:

Hi,

I just installed the latest Zeppelin 0.6 as follows:

Source: zeppelin-0.6.0-bin-all

With Spark 1.6.1


Now I am getting this issue with Jackson. Some searching suggested that it is caused by the classpath providing a different version of Jackson than the one Spark expects. However, no luck yet. With Spark 1.5.2 and the previous version of Zeppelin, namely 0.5.6-incubating, it used to work without a problem.

Any ideas would be appreciated.


com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)
 at [Source: {"id":"14","name":"ExecutedCommand"}; line: 1, column: 1]
  at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
  at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:843)
  at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.addBeanProps(BeanDeserializerFactory.java:533)
  at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:220)
  at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:143)
  at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:409)
  at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:358)
  at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:265)
  at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:245)
  at com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:143)
  at com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:439)
  at com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3666)
  at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3558)
  at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2578)
  at org.apache.spark.rdd.RDDOperationScope$.fromJson(RDDOperationScope.scala:85)
  at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
  at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
  at scala.Option.map(Option.scala:145)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:136)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
  at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
  at org.apache.spark.SparkContext.parallelize(SparkContext.scala:728)
  at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
  at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
  at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
  at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
  at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
  at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
  at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
  at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
  at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
  at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
  at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
  at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
  at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
  at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
  at $iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
  at $iwC$$iwC$$iwC.<init>(<console>:47)
  at $iwC$$iwC.<init>(<console>:49)
  at $iwC.<init>(<console>:51)
  at <init>(<console>:53)
  at .<init>(<console>:57)

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

Disclaimer: Use it at your own risk. Any and all responsibility for any loss, 
damage or destruction of data or any other property which may arise from 
relying on this email's technical content is explicitly disclaimed. The author 
will in no case be liable for any monetary damages arising from such loss, 
damage or destruction.
 
