varun senthilnathan created SPARK-31413:
-------------------------------------------

             Summary: Accessing the sequence number and partition id for 
records in Kinesis adapter
                 Key: SPARK-31413
                 URL: https://issues.apache.org/jira/browse/SPARK-31413
             Project: Spark
          Issue Type: Question
          Components: DStreams
    Affects Versions: 2.4.3
         Environment: We are using Spark 2.4.3 with Java. We would like to log 
the partition key and the sequence number of every event. The overloaded 
createStream function in KinesisUtils always produces a compilation error.

 

{code:java}
Function<Record, Record> printSeq = s -> s;

KinesisUtils.createStream(jssc, appName, streamName, endPointUrl,
    regionName, InitialPositionInStream.TRIM_HORIZON, kinesisCheckpointInterval,
    StorageLevel.MEMORY_AND_DISK_SER(), printSeq, Record.class);
{code}

The compilation error is as follows:
{quote}no suitable method found for 
createStream(org.apache.spark.streaming.api.java.JavaStreamingContext,java.lang.String,java.lang.String,java.lang.String,java.lang.String,com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream,org.apache.spark.streaming.Duration,org.apache.spark.storage.StorageLevel,java.util.function.Function,java.lang.Class)
{quote}
JAVA DOCS: 
[https://spark.apache.org/docs/latest/api/java/org/apache/spark/streaming/kinesis/KinesisUtils.html#createStream-org.apache.spark.streaming.api.java.JavaStreamingContext-java.lang.String-java.lang.String-java.lang.String-java.lang.String-com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream-org.apache.spark.streaming.Duration-org.apache.spark.storage.StorageLevel-org.apache.spark.api.java.function.Function-java.lang.Class-]

Is there a way out?
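
Based on the error message, the handler seems to be resolving to java.util.function.Function, while the createStream overload in the Java docs takes org.apache.spark.api.java.function.Function. A minimal sketch of what we would expect to compile, assuming jssc, appName, streamName, endPointUrl, regionName, and kinesisCheckpointInterval are defined as above (the logging call is only illustrative):

{code:java}
// Key point: import Spark's Function interface, not java.util.function.Function,
// so the createStream overload with a message handler resolves.
import org.apache.spark.api.java.function.Function;
import org.apache.spark.storage.StorageLevel;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.kinesis.KinesisUtils;
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream;
import com.amazonaws.services.kinesis.model.Record;

// Identity-style handler that logs the metadata we are after;
// Record exposes getSequenceNumber() and getPartitionKey().
Function<Record, Record> printSeq = r -> {
    System.out.println(r.getSequenceNumber() + " / " + r.getPartitionKey());
    return r;
};

JavaReceiverInputDStream<Record> stream = KinesisUtils.createStream(
    jssc, appName, streamName, endPointUrl, regionName,
    InitialPositionInStream.TRIM_HORIZON, kinesisCheckpointInterval,
    StorageLevel.MEMORY_AND_DISK_SER(), printSeq, Record.class);
{code}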
            Reporter: varun senthilnathan


We 


