[jira] [Comment Edited] (SPARK-8332) NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
[ https://issues.apache.org/jira/browse/SPARK-8332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14636919#comment-14636919 ]

Earthson Lu edited comment on SPARK-8332 at 7/22/15 1:40 PM:
-

I recompiled Spark with fasterxml.jackson 2.5.3, and it works with play-2.4.x. Is it OK to use 2.5.3 instead of 2.4.4?

was (Author: earthsonlu):
I recompiled Spark with fasterxml.jackson 2.5.3, and it works with play-2.4.x. I want to know: is it OK to use 2.5.3 instead of 2.4.4?

NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
--
Key: SPARK-8332
URL: https://issues.apache.org/jira/browse/SPARK-8332
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 1.4.0
Environment: spark 1.4, hadoop 2.3.0-cdh5.0.0
Reporter: Tao Li
Priority: Critical
Labels: 1.4.0, NoSuchMethodError, com.fasterxml.jackson

I compiled the new Spark 1.4.0 version, but when I run a simple WordCount demo, it throws NoSuchMethodError:
{code}
java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
{code}
I found that the default fasterxml.jackson.version is 2.4.4. Is there anything wrong or conflicting with the Jackson version? Or could some project Maven dependency be pulling in the wrong version of Jackson?
--

This message was sent by Atlassian JIRA (v6.3.4#6332)
-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
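If recompiling Spark is not an option, a common application-side workaround for this class of conflict is to force a single Jackson version across the whole build. A minimal build.sbt sketch, assuming an sbt 0.13-era build; the 2.4.4 version below is illustrative and should match whatever Jackson release your Spark build actually uses:

```scala
// build.sbt (sketch): pin every Jackson artifact to one version so that Spark
// and Play resolve to the same binaries, avoiding NoSuchMethodError at runtime.
// Version numbers here are illustrative, not a recommendation.
dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core"   %  "jackson-core"         % "2.4.4",
  "com.fasterxml.jackson.core"   %  "jackson-databind"     % "2.4.4",
  "com.fasterxml.jackson.core"   %  "jackson-annotations"  % "2.4.4",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.4"
)
```

dependencyOverrides only forces the version chosen during conflict resolution; it does not add direct dependencies, so it is a low-risk way to keep transitive Jackson versions aligned.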
[jira] [Comment Edited] (SPARK-8332) NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
[ https://issues.apache.org/jira/browse/SPARK-8332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14620975#comment-14620975 ]

Kevin Tham edited comment on SPARK-8332 at 7/9/15 6:19 PM:
---

I had no issue having play-json_2.11:2.4.0 (which depends on jackson 2.5.3) in my SBT build with Spark 1.3. After upgrading to Spark 1.4, I can no longer do an sc.parallelize(someCollection) where the collection is a Seq[(TreeMap[String, Double], Double)], and I can reproduce the error Jonathan Kelly saw:
{code}
Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)
{code}
I'll help look into this more when I have time; I'm interested to see what commit caused this and the actual cause of this error. This is causing us to stick with Spark 1.3 for now. I hope we can prioritize this JIRA for the next 1.4.x release.

was (Author: ktham):
I had no issue having play-json_2.11:2.4.0 (which depends on jackson 2.5.3) in my SBT build with Spark 1.3. After upgrading to Spark 1.4, I can no longer do an sc.parallelize(someCollection) where the collection is a Seq[(TreeMap[String, Double], Double)], and I can reproduce the error Jonathan Kelly saw (Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)). I would help when I have time to figure out why, but I'm interested to see what commit caused this and the actual cause of this error. This is causing us to stick with Spark 1.3 for now. I hope we can prioritize this JIRA for the next 1.4.x release.
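When chasing conflicts like the ones in this thread, it helps to see which jar a given class is actually loaded from at runtime. A small diagnostic sketch; JarLocator is a hypothetical helper name, not part of any Spark or Jackson API:

```scala
// Sketch of a classpath diagnostic: report the code-source (jar) location a
// class was loaded from, or None if the class is missing or has no code source
// (JDK bootstrap classes report none). "JarLocator" is a made-up helper name.
object JarLocator {
  def locate(className: String): Option[String] =
    try {
      val cls = Class.forName(className)
      Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation.toString)
    } catch {
      case _: ClassNotFoundException => None
    }

  def main(args: Array[String]): Unit = {
    // On a Spark driver you would pass e.g.
    // "com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer"
    println(JarLocator.locate("scala.Option"))
  }
}
```

Running this on the driver (and inside executors) for com.fasterxml.jackson.databind.ObjectMapper and the jackson-module-scala classes shows immediately whether two different Jackson jars are winning in different places.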
[jira] [Comment Edited] (SPARK-8332) NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
[ https://issues.apache.org/jira/browse/SPARK-8332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14620975#comment-14620975 ]

Kevin Tham edited comment on SPARK-8332 at 7/9/15 6:16 PM:
---

I had no issue having play-json_2.11:2.4.0 (which depends on jackson 2.5.3) in my SBT build with Spark 1.3, and upgrading to Spark 1.4 leads me to not be able to do an sc.parallelize(someCollection) where the collection is a Seq[(TreeMap[String, Double], Double)]; I am able to reproduce the error Jonathan Kelly saw (Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)). I would help when I have time to figure out why, but I'm interested to see what commit caused this and the actual cause of this error. This is causing us to stick with Spark 1.3 for now. I hope we can prioritize this JIRA for the next 1.4.x release.

was (Author: ktham):
I had no issue having play-json_2.11:2.4.0 (which depends on jackson 2.5.3) in my SBT build with Spark 1.3, and upgrading to Spark 1.4 leads me to not be able to do an sc.parallelize(someCollection) where the collection is a Seq[(TreeMap[String, Double], Double)]; I am able to reproduce the error Jonathan Kelly saw (Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)). I would help when I have time to figure out why, but I'm interested to see what commit caused this and the actual cause of this error. This is causing us to stick with Spark 1.3 for now.
[jira] [Comment Edited] (SPARK-8332) NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
[ https://issues.apache.org/jira/browse/SPARK-8332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14603751#comment-14603751 ]

Jonathan Kelly edited comment on SPARK-8332 at 6/26/15 11:15 PM:
-

I was running into the same issue, so I made sure to get rid of all Jackson 2.2 jars from my Hadoop classpath, but now I've run into a different issue:
{code}
Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)
 at [Source: {id:0,name:parallelize}; line: 1, column: 1]
	at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
	at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:843)
	at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.addBeanProps(BeanDeserializerFactory.java:533)
	at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:220)
	at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:143)
	at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:409)
	at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:358)
	at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:265)
	at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:245)
	at com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:143)
	at com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:439)
	at com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3666)
	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3558)
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2578)
	at org.apache.spark.rdd.RDDOperationScope$.fromJson(RDDOperationScope.scala:82)
	at org.apache.spark.rdd.RDD$$anonfun$34.apply(RDD.scala:1486)
	at org.apache.spark.rdd.RDD$$anonfun$34.apply(RDD.scala:1486)
	at scala.Option.map(Option.scala:145)
	at org.apache.spark.rdd.RDD.<init>(RDD.scala:1486)
	at org.apache.spark.rdd.ParallelCollectionRDD.<init>(ParallelCollectionRDD.scala:85)
	at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:697)
	at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:695)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
	at org.apache.spark.SparkContext.withScope(SparkContext.scala:681)
	at org.apache.spark.SparkContext.parallelize(SparkContext.scala:695)
	at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
	at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{code}
I'm just running:
{code}
MASTER=yarn-client /usr/lib/spark/bin/run-example SparkPi 10
{code}
I should also mention that I'm using Jackson 2.5.3 for both Hadoop and Spark, rather than 2.4.4, though I wouldn't think that would cause this problem.

was (Author: jonathak):
I was running into the same issue, so I made sure to get rid of all Jackson 2.2 jars from my Hadoop classpath, but now I've run into a different issue:
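Besides aligning versions, the other common way out of the "two Jacksons on one classpath" situation described above is to shade Jackson inside the application's fat jar so it cannot collide with the copy Hadoop or Spark ships. A sketch using sbt-assembly's shade rules; this assumes sbt-assembly 0.14+, and the shaded.jackson prefix is an arbitrary choice:

```scala
// build.sbt (sketch): rename the com.fasterxml.jackson packages inside the
// assembled jar so the application's Jackson classes cannot clash with the
// cluster's copy. Requires the sbt-assembly plugin.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.fasterxml.jackson.**" -> "shaded.jackson.@1").inAll
)
```

Shading keeps the application self-contained at the cost of a larger jar, and it only helps for Jackson use in application code, not for Spark's own internal Jackson calls such as the RDDOperationScope deserialization in the trace above.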
[jira] [Comment Edited] (SPARK-8332) NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
[ https://issues.apache.org/jira/browse/SPARK-8332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14589903#comment-14589903 ]

Olivier Girardot edited comment on SPARK-8332 at 6/17/15 3:12 PM:
--

I have the same issue on a CDH5 cluster trying to use spark-submit.

was (Author: ogirardot):
I have the same issue on a CDH5 cluster.