This works fine with Kafka and Flink. However, when I do it with Spark, the newline feed gets removed.
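For context on why losing the separator makes every record land on one line: Spark's saveAsTextFile writes each RDD element as its own line, so if records show up collapsed, the separator was lost before the write. A plain-Java stand-in illustrating the difference (the names here are illustrative, not Spark API):

```java
import java.util.Arrays;
import java.util.List;

public class NewlineDemo {
    // Stand-in for writing records out: joining with "" collapses all
    // records onto one line, while joining with "\n" keeps one record
    // per line (which is what saveAsTextFile does per RDD element).
    public static String join(List<String> records, String sep) {
        return String.join(sep, records);
    }

    public static void main(String[] args) {
        List<String> records = Arrays.asList("rec1", "rec2", "rec3");
        System.out.println(join(records, ""));    // collapsed: rec1rec2rec3
        System.out.println(join(records, "\n"));  // one record per line
    }
}
```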
On Tue, Feb 16, 2016 at 4:29 PM, UMESH CHAUDHARY wrote:
Try printing the RDD before writing, to validate that you are getting '\n' from Kafka.
On Tue, Feb 16, 2016 at 4:19 PM, Ashutosh Kumar wrote:
Hi Chandeep,
Thanks for the response. The issue is that the newline feed is lost: all records appear on one line only.
Thanks
Ashutosh
On Tue, Feb 16, 2016 at 3:26 PM, Chandeep Singh wrote:
> !rdd.isEmpty() should work but an alternative could be rdd.take(1) != 0
>
> On Feb 16, 2016, at
!rdd.isEmpty() should work, but an alternative could be rdd.take(1).size() != 0 (take(1) returns a List, so compare its size, not the list itself, against 0).
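A quick note on the alternative: on a JavaRDD, take(1) returns a java.util.List with at most one element, so the non-empty check has to look at the list's size (or its own isEmpty). A plain-Java stand-in, where the list plays the role of rdd.take(1):

```java
import java.util.Collections;
import java.util.List;

public class EmptyCheck {
    // Stand-in for the rdd.take(1) idiom: take(1) returns a List with at
    // most one element, so the batch is non-empty iff that list is.
    public static boolean nonEmpty(List<String> firstElement) {
        return firstElement.size() != 0; // equivalently: !firstElement.isEmpty()
    }

    public static void main(String[] args) {
        System.out.println(nonEmpty(Collections.emptyList()));          // false
        System.out.println(nonEmpty(Collections.singletonList("rec"))); // true
    }
}
```

Note that rdd.isEmpty() is generally the cheaper check, since it only needs to find a single element rather than materialize a list.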
On Feb 16, 2016, at 9:33 AM, Ashutosh Kumar wrote:
I am getting multiple empty files for streaming output for each interval.
To avoid this I tried:

kStream.foreachRDD(new VoidFunction2() {
    public void call(JavaRDD rdd, Time time) throws Exception {
        if (!rdd.isEmpty()) {
Could someone provide some pointers on this?
Thanks
On Mon, Feb 15, 2016 at 3:39 PM, Ashutosh Kumar wrote:
I am getting multiple empty files for streaming output for each interval.
To avoid this I tried:

kStream.foreachRDD(new VoidFunction2() {
    public void call(JavaRDD rdd, Time time) throws Exception {
        if (!rdd.isEmpty()) {