[ https://issues.apache.org/jira/browse/SPARK-27833?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16848840#comment-16848840 ]
Gabor Somogyi commented on SPARK-27833:
---------------------------------------

Maybe you could just copy the working sink under a different name and change things step by step until it breaks. At first glance this doesn't look like a Spark problem.

> java.lang.AssertionError: assertion failed: No plan for EventTimeWatermark
> --------------------------------------------------------------------------
>
>                 Key: SPARK-27833
>                 URL: https://issues.apache.org/jira/browse/SPARK-27833
>             Project: Spark
>          Issue Type: Bug
>          Components: Structured Streaming
>    Affects Versions: 2.3.0
>         Environment: spark 2.3.0
>                      java 1.8
>                      kafka version 0.10
>            Reporter: Raviteja
>            Priority: Minor
>              Labels: spark-streaming-kafka
>         Attachments: kafka_consumer_code.java, kafka_custom_sink.java, kafka_error_log.txt
>
> Hi,
> We have a requirement to read data from Kafka, apply some transformations, and store the data in a database. For this we are implementing the watermarking feature together with an aggregate function, and for storing we are writing our own sink (Structured Streaming). We are using Spark 2.3.0, Java 1.8, and Kafka 0.10.
> We are getting the error below:
> "*java.lang.AssertionError: assertion failed: No plan for EventTimeWatermark timestamp#39: timestamp, interval 2 minutes*"
>
> Everything works fine when we use the Console sink instead of our custom sink. To debug the issue, our custom sink performs only "dataframe.show()" and nothing else.
> Please find the error log and the code in the attachments. Please look into this issue; it is a blocker for us, and we have not found any alternative, since we need the watermarking feature.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
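The attached sink code is not shown here, but a minimal sketch of the likely pattern may help frame the comparison with the Console sink. Assuming the custom sink implements Spark's internal `org.apache.spark.sql.execution.streaming.Sink` interface (the usual way to write a custom Structured Streaming sink in Spark 2.3), the class and method names below are illustrative. Note that in Spark 2.3 the built-in console sink does not call `show()` directly on the DataFrame handed to `addBatch`; it first collects the rows and rebuilds a plain batch DataFrame, since the incoming DataFrame is backed by a streaming (incremental) logical plan.

```java
import java.util.List;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.execution.streaming.Sink;

// Illustrative sketch only; "DatabaseSink" and its wiring are assumptions,
// not the reporter's actual attached code.
public class DatabaseSink implements Sink {
  @Override
  public void addBatch(long batchId, Dataset<Row> data) {
    // Collect the micro-batch, then re-create a batch-local DataFrame,
    // as the built-in console sink does. Calling data.show() directly
    // re-plans the streaming plan outside the streaming query and can
    // fail with "No plan for EventTimeWatermark".
    List<Row> rows = data.collectAsList();
    Dataset<Row> batch =
        data.sparkSession().createDataFrame(rows, data.schema());
    batch.show();
    // ... write `batch` to the database here ...
  }
}
```

This sketch needs a Spark 2.3 runtime on the classpath to compile and run; it is meant only to illustrate the collect-then-recreate pattern worth trying while narrowing the problem down step by step.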