Hi,
can you please take a screenshot and show us the number of records that
the streaming program is reading from the source? If I am not mistaken, it
should be writing records out to the output location every 5 mins.
It writes data every 5 seconds.

Also, it may be of help to check whether you have permissions to write
under the checkpoint directory gs://testdata/raw_chk. What do you see
there?
You should have four entries under the checkpoint directory, for example:

/mnt/gs/prices/chkpt> ls -ltr
total 1
-rw-r--rwx. 1 hduser hadoop   45 May  4 07:38 metadata
drwxr-xrwx. 3 hduser hadoop 4096 May  4 07:38
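For reference, a minimal sketch of a streaming write with an explicit checkpoint location and trigger interval (the output path gs://testdata/raw_out and the DataFrame df are placeholders, not from the thread; only the checkpoint path gs://testdata/raw_chk comes from it):

```scala
import org.apache.spark.sql.streaming.Trigger

// df is an existing streaming DataFrame (placeholder).
val query = df.writeStream
  .format("orc")
  .option("path", "gs://testdata/raw_out")               // output location (placeholder)
  .option("checkpointLocation", "gs://testdata/raw_chk") // checkpoint dir from the thread
  .trigger(Trigger.ProcessingTime("5 seconds"))          // micro-batch every 5 seconds
  .start()

query.awaitTermination()
```

If the job lacks write permission on the checkpoint path, start() typically fails when the query tries to create the metadata entries shown in the listing above.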
Hello all,
I’m just trying to build a pipeline that reads data from a streaming
source and writes it out as ORC files, but I don’t see any files written
to the file system, nor any exceptions.
Here is an example:

val df = spark.readStream.format("...")
  .option(
    "Topic",
    "Some