[ https://issues.apache.org/jira/browse/SPARK-2103?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14068263#comment-14068263 ]

Apache Spark commented on SPARK-2103:
-------------------------------------

User 'jerryshao' has created a pull request for this issue:
https://github.com/apache/spark/pull/1508

> Java + Kafka + Spark Streaming NoSuchMethodError in java.lang.Object.<init>
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-2103
>                 URL: https://issues.apache.org/jira/browse/SPARK-2103
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.0.0
>            Reporter: Sean Owen
>
> This has come up a few times, from user venki-kratos:
> http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-td2209.html
> and I ran into it a few weeks ago:
> http://mail-archives.apache.org/mod_mbox/spark-dev/201405.mbox/%3ccamassdlzs6ihctxepusphryxxa-wp26zgbxx83sm6niro0q...@mail.gmail.com%3E
> and yesterday user mpieck:
> {quote}
> When I use the createStream method from the example class like
> this:
> KafkaUtils.createStream(jssc, "zookeeper:port", "test", topicMap);
> everything is working fine, but when I explicitly specify the message decoder
> classes used in this method with another overloaded createStream method:
> KafkaUtils.createStream(jssc, String.class, String.class,
> StringDecoder.class, StringDecoder.class, props, topicMap,
> StorageLevels.MEMORY_AND_DISK_2);
> the application stops with an error:
> 14/06/10 22:28:06 ERROR kafka.KafkaReceiver: Error receiving data
> java.lang.NoSuchMethodException: java.lang.Object.<init>(kafka.utils.VerifiableProperties)
>         at java.lang.Class.getConstructor0(Unknown Source)
>         at java.lang.Class.getConstructor(Unknown Source)
>         at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:108)
>         at org.apache.spark.streaming.dstream.NetworkReceiver.start(NetworkInputDStream.scala:126)
> {quote}
> Something is making it try to instantiate java.lang.Object as if it's a 
> Decoder class.
> I suspect that the problem is to do with
> https://github.com/apache/spark/blob/master/external/kafka/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala#L148
> {code}
>     implicit val keyCmd: Manifest[U] =
> implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[U]]
>     implicit val valueCmd: Manifest[T] =
> implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[T]]
> {code}
> ... where U and T are key/value Decoder types. I don't know enough Scala to 
> fully understand this, but is it possible this causes the reflective call 
> later to lose the type and try to instantiate Object? The AnyRef made me 
> wonder.
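> To illustrate the suspicion, here is a minimal, self-contained sketch (VerifiableProps and MyDecoder below are hypothetical stand-ins, not the real kafka.utils.VerifiableProperties or the actual Spark code) of how a manifest obtained by casting Manifest[AnyRef] makes a reflective constructor lookup, like the one the stack trace shows in KafkaReceiver.onStart, land on java.lang.Object:
> {code}
> class VerifiableProps                    // stand-in for kafka.utils.VerifiableProperties
> class MyDecoder(props: VerifiableProps)  // stand-in for a key/value Decoder class
>
> object ManifestErasureDemo extends App {
>   // Mimics the reflective construction suggested by the stack trace: look up a
>   // constructor taking VerifiableProps on the manifest's runtime class.
>   def reflectDecoder[U](m: Manifest[U]): Any =
>     m.runtimeClass.getConstructor(classOf[VerifiableProps]).newInstance(new VerifiableProps)
>
>   // With the real manifest, reflection finds MyDecoder(VerifiableProps) and succeeds.
>   println(reflectDecoder(manifest[MyDecoder]))
>
>   // With a manifest built the way the suspect KafkaUtils lines build keyCmd/valueCmd,
>   // runtimeClass is java.lang.Object, so this throws
>   // java.lang.NoSuchMethodException: java.lang.Object.<init>(VerifiableProps)
>   val lost = implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[MyDecoder]]
>   println(reflectDecoder(lost))
> }
> {code}
> If that is what is happening, a fix would presumably need to carry the caller's real decoder classes through to the reflective call instead of erasing them to AnyRef.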
> I am sorry to say I don't have a PR to suggest at this point.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
