I had this same problem.  I ended up just adding the necessary code to
KafkaUtils and compiling my own Spark jar.  Something like this for the
"raw" stream:

  def createRawStream(
      jssc: JavaStreamingContext,
      kafkaParams: JMap[String, String],
      topics: JMap[String, JInt]
  ): JavaPairDStream[Array[Byte], Array[Byte]] = {
    new KafkaInputDStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder](
      jssc.ssc,
      kafkaParams.toMap,
      Map(topics.mapValues(_.intValue()).toSeq: _*),
      StorageLevel.MEMORY_AND_DISK_SER_2)
  }
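For what it's worth, my guess (not verified against the Spark internals) about the
NoSuchMethodException below is that the Manifest for the decoder type parameter
resolves to Object when the overload is called from Java, so KafkaReceiver
reflectively asks Object, rather than StringDecoder, for a constructor taking
VerifiableProperties.  A minimal plain-JDK sketch of why that reflective lookup
fails (using String as a stand-in for kafka.utils.VerifiableProperties):

```java
import java.lang.reflect.Constructor;

public class DecoderReflectionDemo {
    public static void main(String[] args) {
        try {
            // If the type parameter erases to Object, Spark effectively does
            // classOf[Object].getConstructor(classOf[VerifiableProperties]):
            Constructor<?> c = Object.class.getConstructor(String.class);
            System.out.println("found: " + c);
        } catch (NoSuchMethodException e) {
            // Object has no such constructor, so the lookup throws, e.g.
            // "java.lang.Object.<init>(java.lang.String)"
            System.out.println("NoSuchMethodException: " + e.getMessage());
        }
    }
}
```

That matches the `java.lang.Object.<init>(kafka.utils.VerifiableProperties)` line
in the stack trace, which is why hard-coding the decoder types in a custom
createRawStream sidesteps the problem.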


On Tue, Jun 10, 2014 at 2:15 PM, mpieck <mpi...@gazeta.pl> wrote:

> Hi,
>
> I have the same problem when running a Kafka to Spark Streaming pipeline from
> Java with explicitly specified message decoders. I had thought that it was
> related to the Eclipse environment, as suggested here, but that's not the
> case. I have coded an example based on this class:
>
>
> https://github.com/apache/spark/blob/branch-0.9/examples/src/main/java/org/apache/spark/streaming/examples/JavaKafkaWordCount.java
>
> and built a shaded uber jar with all the deps and tried to run it from the
> command line. When I use the createStream method from the example class like
> this:
>
> KafkaUtils.createStream(jssc, "zookeeper:port", "test", topicMap);
>
> everything works fine, but when I explicitly specify the message decoder
> classes with the overloaded createStream method:
>
> KafkaUtils.createStream(jssc, String.class, String.class,
> StringDecoder.class, StringDecoder.class, props, topicMap,
> StorageLevels.MEMORY_AND_DISK_2);
>
> the application stops with an error:
>
> 14/06/10 22:28:06 ERROR kafka.KafkaReceiver: Error receiving data
> java.lang.NoSuchMethodException:
> java.lang.Object.<init>(kafka.utils.VerifiableProperties)
>         at java.lang.Class.getConstructor0(Unknown Source)
>         at java.lang.Class.getConstructor(Unknown Source)
>         at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:108)
>         at org.apache.spark.streaming.dstream.NetworkReceiver.start(NetworkInputDStream.scala:126)
>
> I have tried Spark versions 0.9.0-incubating, 0.9.0 and 0.9.1, but the error
> occurs in all of them. The Kafka StringDecoder class has a constructor with a
> VerifiableProperties parameter, and all required classes are in the same uber
> jar, so it is strange that Scala/Java cannot find it via the reflection API.
> Maybe there is a problem with the Manifest/ClassTag usage in the KafkaUtils
> or KafkaInputDStream classes, but I'm not a Scala expert and cannot be sure.
> The problematic code is the same from version 0.9 to the current one, so it's
> still there. The unit test from the Spark project works fine with every
> KafkaUtils method, because the test does not try to register the Kafka
> stream, only checks the interface.
>
> Currently it is possible to use a Kafka to Spark Streaming pipeline from Java
> only with the default String message decoders, which makes this tool almost
> useless (unless you are a great JSON fan).
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-tp2209p7347.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
