Also, I should mention that the `stream` DStream is defined as:

JavaInputDStream<ConsumerRecord<String, byte[]>> stream =
        KafkaUtils.createDirectStream(
                ssc,
                LocationStrategies.PreferConsistent(),
                ConsumerStrategies.<String, byte[]>Subscribe(TOPIC, kafkaParams)
        );
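The compile error in the quoted question comes from the declared result type: in the Spark API, `map` on a DStream returns a `JavaDStream<R>`, never a `JavaInputDStream<R>`, so `stream.map(...)` cannot be assigned to a `JavaInputDStream<String>` variable. Here is a minimal self-contained sketch of the same generics situation, using hypothetical stand-in classes rather than the real Spark ones:

```java
import java.util.function.Function;

public class DStreamTypes {
    // Stand-in for JavaDStream<T>: map() returns the base type, parameterized by R.
    static class BaseStream<T> {
        <R> BaseStream<R> map(Function<T, R> f) {
            return new BaseStream<>();
        }
    }

    // Stand-in for JavaInputDStream<T>: a subtype of the base stream type.
    static class InputStreamD<T> extends BaseStream<T> {}

    public static void main(String[] args) {
        InputStreamD<byte[]> stream = new InputStreamD<>();

        // Compiles: the result of map() is declared as the base type.
        BaseStream<String> ok = stream.map(bytes -> "mapped");

        // Does not compile: map() returns BaseStream<R>, which is a supertype,
        // not an InputStreamD<R>:
        // InputStreamD<String> bad = stream.map(bytes -> "mapped");

        System.out.println(ok.getClass().getSimpleName());
    }
}
```

The same reasoning applies to the real types: declaring `results` as `JavaDStream<String>` instead of `JavaInputDStream<String>` should resolve the error.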


On Thu, Dec 14, 2017 at 10:30 AM, Soheil Pourbafrani <soheil.i...@gmail.com>
wrote:

> The following code is in SparkStreaming :
>
> JavaInputDStream<String> results = stream.map(record -> 
> SparkTest.getTime(record.value()) + ":"
>         + Long.toString(System.currentTimeMillis()) + ":"
>         + Arrays.deepToString(SparkTest.finall(record.value()))
>         + ":" + Long.toString(System.currentTimeMillis()))
>         .map(record -> record + ":"
>                 + Long.toString(Long.parseLong(record.split(":")[3])
>                 - Long.parseLong(record.split(":")[1])));
>
> The `stream` DStream's value type is `byte[]`, `getTime` returns `String`,
> and `finall` returns `String[][][]`. But this code fails with the error:
>
> Error:(52, 21) java: incompatible types: no instance(s) of type variable(s) R exist so that org.apache.spark.streaming.api.java.JavaDStream<R> conforms to org.apache.spark.streaming.api.java.JavaInputDStream<java.lang.String>
>
>
> Why? All of the output types are `String`!
>

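For reference, the elapsed-time bookkeeping done by the two chained `map` calls in the question can be reproduced in plain Java, with hypothetical stub values standing in for `SparkTest.getTime(...)` and `SparkTest.finall(...)`:

```java
import java.util.Arrays;

public class ElapsedDemo {
    public static void main(String[] args) {
        // Hypothetical stand-ins for SparkTest.getTime(record.value())
        // and SparkTest.finall(record.value()).
        String time = "t0";
        String[][][] parsed = {{{"a"}}};

        // First map: build "time:start:payload:end".
        long start = System.currentTimeMillis();
        long end = System.currentTimeMillis();
        String record = time + ":" + Long.toString(start) + ":"
                + Arrays.deepToString(parsed)
                + ":" + Long.toString(end);

        // Second map: append the difference between fields [3] and [1].
        // Note this indexing only works if neither the time string nor the
        // payload itself contains a ':' character.
        String[] parts = record.split(":");
        String withDelta = record + ":" + Long.toString(
                Long.parseLong(parts[3]) - Long.parseLong(parts[1]));

        System.out.println(withDelta.endsWith(":" + (end - start)));
    }
}
```

The string logic itself is fine; the compile error is purely about the declared type of the variable receiving the result of `map`.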