I know these methods, but I need to create the events using the timestamps in
the data tuples; that is, every new tuple should be emitted according to the
timestamp in the CSV file. This will be useful to simulate the data rate
over time, just like real sensor data.
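The timestamp-driven replay described above can be sketched in plain Python, outside of any Storm/Spark machinery. This is only an illustration under assumptions not stated in the thread: the timestamp is taken to be the first CSV column in `%Y-%m-%d %H:%M:%S` format, and a `speedup` factor is added to fast-forward the replay. In a real Spout or Receiver, the yielded row would be emitted to the topology instead of collected.

```python
import csv
import io
import time
from datetime import datetime

def replay(csv_lines, ts_column=0, ts_format="%Y-%m-%d %H:%M:%S", speedup=1.0):
    """Yield rows from a CSV source, sleeping between consecutive rows so
    that the inter-event gap matches the original timestamps (divided by
    `speedup` to fast-forward the simulation)."""
    reader = csv.reader(csv_lines)
    prev_ts = None
    for row in reader:
        ts = datetime.strptime(row[ts_column], ts_format)
        if prev_ts is not None:
            delay = (ts - prev_ts).total_seconds() / speedup
            if delay > 0:
                time.sleep(delay)
        prev_ts = ts
        yield row

# Hypothetical sample: three events one second apart, replayed 1000x faster.
sample = io.StringIO(
    "2013-01-01 00:00:00,taxi-1,41.0\n"
    "2013-01-01 00:00:01,taxi-2,42.5\n"
    "2013-01-01 00:00:02,taxi-1,39.9\n"
)
events = list(replay(sample, speedup=1000.0))
```

With `speedup=1.0` the generator reproduces the original data rate; the same loop body could drive a Storm Spout's `nextTuple()` or a custom Spark Streaming receiver.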
On Fri, May 1, 2015 at 2:52 PM, Juan wrote:
I have the real DEBS taxi data in a CSV file. In order to operate over it,
how can I simulate a Spout-like event generator that uses the
timestamps in the CSV file?
--
Thanks and regards,
Anshu Shukla
Hi,
Maybe you could use streamingContext.fileStream, as in the example from
https://spark.apache.org/docs/latest/streaming-programming-guide.html#input-dstreams-and-receivers;
you can read from files on any file system compatible with the HDFS API
(that is, HDFS, S3, NFS, etc.). You could split
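Once each line arrives as a string, the splitting step mentioned above might look like the following sketch. The column layout here is an assumption for illustration only (the DEBS taxi schema is not shown in this thread): the first field is treated as the event timestamp and the rest as the payload.

```python
def parse_line(line):
    """Split one CSV line into a (timestamp_string, payload_fields) tuple.
    Assumed layout (not confirmed by the thread): timestamp first,
    remaining comma-separated fields are the payload."""
    fields = line.strip().split(",")
    return fields[0], fields[1:]

# Hypothetical record in the assumed layout.
ts, payload = parse_line("2013-01-01 00:00:00,taxi-1,41.0\n")
```

Each parsed tuple could then be mapped over the DStream produced by fileStream.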
I have the real DEBS taxi data in a CSV file. In order to operate over it,
how can I simulate a Spout-like event generator that uses the
timestamps in the CSV file?
--
SERC-IISC
Thanks and regards,
Anshu Shukla