Re: Spark Streaming not working

2020-04-14 Thread Gerard Maas
Hi,

Could you share the code that you're using to configure the connection to
the Kafka broker?
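For reference, this is roughly what that configuration looks like with the
spark-streaming-kafka-0-10 direct stream (a minimal sketch only; the broker
addresses, group id and topic below are placeholders, not values taken from
your setup):

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaIngestSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-ingest-sketch")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // Placeholder connection settings -- substitute your real brokers, group and topic.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "broker-1:9092,broker-2:9092",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "service-spark-ingestion",
      "auto.offset.reset"  -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Seq("dice-ingestion"), kafkaParams)
    )

    // Count records per batch so a silent, empty stream is visible in the logs.
    stream.foreachRDD(rdd => println(s"Batch record count: ${rdd.count()}"))

    ssc.start()
    ssc.awaitTermination()
  }
}

Sharing something in this shape (with your real values) makes it much easier
to spot configuration problems.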

This is a bread-and-butter feature. My first thought is that there's
something in your particular setup that prevents this from working.

kind regards, Gerard.

On Fri, Apr 10, 2020 at 7:34 PM Debabrata Ghosh 
wrote:

> Hi,
> I have a spark streaming application where Kafka is producing
> records but unfortunately spark streaming isn't able to consume those.
>
> I am hitting the following error:
>
> 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID 24)
> java.lang.AssertionError: assertion failed: Failed to get records for 
> spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling for 
> 12
>   at scala.Predef$.assert(Predef.scala:170)
>   at 
> org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
>   at 
> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:223)
>   at 
> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:189)
>
>
> Would you please be able to help with a resolution.
>
> Thanks,
> Debu
>


Re: Spark Streaming not working

2020-04-14 Thread Gabor Somogyi
Sorry, I hit send accidentally...

The symptom is simple: the broker is not responding within 120 seconds.
That's why the broker configuration was requested earlier in the thread.

What I can suggest is to check the earlier printout which logs the Kafka
consumer settings.
With those settings you can start a console consumer on the exact same host
where the executor ran.
If that works, you can open a Spark JIRA with the driver and executor logs;
otherwise, fix the connection issue.
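If the Kafka console-consumer scripts aren't available on that host, the same
check can be done with a tiny standalone poll using the plain Kafka consumer
API. This is only a sketch; the broker, group id and topic below are
placeholders, and the settings should be replaced with the exact ones printed
in the executor log:

import java.time.Duration
import java.util.{Arrays, Properties}
import org.apache.kafka.clients.consumer.KafkaConsumer

object PollCheck {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    // Placeholders -- copy the real settings from the executor's consumer-config printout.
    props.put("bootstrap.servers", "broker-1:9092")
    props.put("group.id", "connectivity-check")
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("auto.offset.reset", "earliest")

    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(Arrays.asList("dice-ingestion"))

    // One poll: if this returns (with or without records) instead of timing out,
    // basic broker connectivity from this host is fine.
    // On pre-2.0 Kafka clients, use poll(30000L) instead of poll(Duration).
    val records = consumer.poll(Duration.ofSeconds(30))
    println(s"Fetched ${records.count()} records")
    consumer.close()
  }
}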

BR,
G


On Tue, Apr 14, 2020 at 1:32 PM Gabor Somogyi 
wrote:

> The symptom is simple, the broker is not responding in 120 seconds.
> That's the reason why Debabrata asked the broker config.
>
> What I can suggest is to check the previous printout which logs the Kafka
> consumer settings.
> With
>
>
> On Tue, Apr 14, 2020 at 11:44 AM ZHANG Wei  wrote:
>
>> Here is the assertion error message format:
>>
>>s"Failed to get records for $groupId $topic $partition $offset after
>> polling for $timeout")
>>
>> You might have to check the kafka service with the error log:
>>
>> > 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0
>> (TID 24)
>> > java.lang.AssertionError: assertion failed: Failed to get records for
>> spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling
>> for 12
>>
>> Cheers,
>> -z
>>
>> ________
>> From: Debabrata Ghosh 
>> Sent: Saturday, April 11, 2020 2:25
>> To: user
>> Subject: Re: Spark Streaming not working
>>
>> Any solution please ?
>>
>> On Fri, Apr 10, 2020 at 11:04 PM Debabrata Ghosh wrote:
>> Hi,
>> I have a spark streaming application where Kafka is producing
>> records but unfortunately spark streaming isn't able to consume those.
>>
>> I am hitting the following error:
>>
>> 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID
>> 24)
>> java.lang.AssertionError: assertion failed: Failed to get records for
>> spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling
>> for 12
>> at scala.Predef$.assert(Predef.scala:170)
>> at
>> org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
>> at
>> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:223)
>> at
>> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:189)
>>
>> Would you please be able to help with a resolution.
>>
>> Thanks,
>> Debu
>>
>>
>>


Re: Spark Streaming not working

2020-04-14 Thread Gabor Somogyi
The symptom is simple, the broker is not responding in 120 seconds.
That's the reason why Debabrata asked the broker config.

What I can suggest is to check the previous printout which logs the Kafka
consumer settings.
With


On Tue, Apr 14, 2020 at 11:44 AM ZHANG Wei  wrote:

> Here is the assertion error message format:
>
>s"Failed to get records for $groupId $topic $partition $offset after
> polling for $timeout")
>
> You might have to check the kafka service with the error log:
>
> > 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0
> (TID 24)
> > java.lang.AssertionError: assertion failed: Failed to get records for
> spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling
> for 12
>
> Cheers,
> -z
>
> 
> From: Debabrata Ghosh 
> Sent: Saturday, April 11, 2020 2:25
> To: user
> Subject: Re: Spark Streaming not working
>
> Any solution please ?
>
> On Fri, Apr 10, 2020 at 11:04 PM Debabrata Ghosh wrote:
> Hi,
> I have a spark streaming application where Kafka is producing
> records but unfortunately spark streaming isn't able to consume those.
>
> I am hitting the following error:
>
> 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID
> 24)
> java.lang.AssertionError: assertion failed: Failed to get records for
> spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling
> for 12
> at scala.Predef$.assert(Predef.scala:170)
> at
> org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
> at
> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:223)
> at
> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:189)
>
> Would you please be able to help with a resolution.
>
> Thanks,
> Debu
>
>
>


Re: Spark Streaming not working

2020-04-14 Thread ZHANG Wei
Here is the assertion error message format:

   s"Failed to get records for $groupId $topic $partition $offset after polling 
for $timeout")

You might have to check the Kafka service against the error log:

> 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID 24)
> java.lang.AssertionError: assertion failed: Failed to get records for 
> spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling for 
> 12
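The $timeout in that message is the executor-side poll timeout. If the brokers
turn out to be reachable but just slow, one common mitigation is raising that
timeout; a minimal sketch (spark-shell style) follows. The config key
spark.streaming.kafka.consumer.poll.ms, and its fallback to
spark.network.timeout, are how the kafka010 integration resolves it in the
versions I'm aware of -- please verify both keys and values against your Spark
release, and note that no timeout will help if the broker isn't reachable from
the executor host at all:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Illustrative values only: raise the per-poll timeout and the network timeout.
val conf = new SparkConf()
  .setAppName("service-spark-ingestion")
  .set("spark.streaming.kafka.consumer.poll.ms", "180000") // default: spark.network.timeout
  .set("spark.network.timeout", "300s")
val ssc = new StreamingContext(conf, Seconds(10))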

Cheers,
-z


From: Debabrata Ghosh 
Sent: Saturday, April 11, 2020 2:25
To: user
Subject: Re: Spark Streaming not working

Any solution please ?

On Fri, Apr 10, 2020 at 11:04 PM Debabrata Ghosh wrote:
Hi,
I have a spark streaming application where Kafka is producing records 
but unfortunately spark streaming isn't able to consume those.

I am hitting the following error:

20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID 24)
java.lang.AssertionError: assertion failed: Failed to get records for 
spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling for 
12
at scala.Predef$.assert(Predef.scala:170)
at 
org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
at 
org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:223)
at 
org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:189)

Would you please be able to help with a resolution.

Thanks,
Debu




Re: Spark Streaming not working

2020-04-10 Thread Debabrata Ghosh
Any solution, please?

On Fri, Apr 10, 2020 at 11:04 PM Debabrata Ghosh 
wrote:

> Hi,
> I have a spark streaming application where Kafka is producing
> records but unfortunately spark streaming isn't able to consume those.
>
> I am hitting the following error:
>
> 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID 24)
> java.lang.AssertionError: assertion failed: Failed to get records for 
> spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling for 
> 12
>   at scala.Predef$.assert(Predef.scala:170)
>   at 
> org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
>   at 
> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:223)
>   at 
> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:189)
>
>
> Would you please be able to help with a resolution.
>
> Thanks,
> Debu
>


Re: Spark Streaming not working

2020-04-10 Thread Chenguang He
unsubscribe


Re: Spark Streaming not working

2020-04-10 Thread Debabrata Ghosh
Yes, the Kafka producer is producing records from the same host. I rechecked
the Kafka connection and it is there. I came across this URL but am unable to
understand it:

https://stackoverflow.com/questions/42264669/spark-streaming-assertion-failed-failed-to-get-records-for-spark-executor-a-gro

On Fri, Apr 10, 2020 at 11:14 PM Srinivas V  wrote:

> Check if your broker details are correct, verify if you have network
> connectivity to your client box and Kafka broker server host.
>
> On Fri, Apr 10, 2020 at 11:04 PM Debabrata Ghosh 
> wrote:
>
>> Hi,
>> I have a spark streaming application where Kafka is producing
>> records but unfortunately spark streaming isn't able to consume those.
>>
>> I am hitting the following error:
>>
>> 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID 24)
>> java.lang.AssertionError: assertion failed: Failed to get records for 
>> spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling for 
>> 12
>>  at scala.Predef$.assert(Predef.scala:170)
>>  at 
>> org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
>>  at 
>> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:223)
>>  at 
>> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:189)
>>
>>
>> Would you please be able to help with a resolution.
>>
>> Thanks,
>> Debu
>>
>


Re: Spark Streaming not working

2020-04-10 Thread Srinivas V
Check if your broker details are correct, verify if you have network
connectivity to your client box and Kafka broker server host.
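Before digging into Kafka settings, a quick way to verify plain TCP
reachability from the client (or executor) host is a tiny connect test.
A sketch only -- the host and port below are placeholders:

import java.net.{InetSocketAddress, Socket}

object BrokerReachability {
  def main(args: Array[String]): Unit = {
    // Placeholder broker endpoint -- substitute your real host:port.
    val host = "broker-1"
    val port = 9092
    val socket = new Socket()
    try {
      // 5-second connect timeout: fails fast if the broker port is unreachable.
      socket.connect(new InetSocketAddress(host, port), 5000)
      println(s"TCP connection to $host:$port succeeded")
    } catch {
      case e: Exception => println(s"Could not reach $host:$port: ${e.getMessage}")
    } finally {
      socket.close()
    }
  }
}

This only proves the port is open; a consumer-level check with the same
settings (e.g. a console consumer) is still needed to confirm the broker
actually serves records.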

On Fri, Apr 10, 2020 at 11:04 PM Debabrata Ghosh 
wrote:

> Hi,
> I have a spark streaming application where Kafka is producing
> records but unfortunately spark streaming isn't able to consume those.
>
> I am hitting the following error:
>
> 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID 24)
> java.lang.AssertionError: assertion failed: Failed to get records for 
> spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling for 
> 12
>   at scala.Predef$.assert(Predef.scala:170)
>   at 
> org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
>   at 
> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:223)
>   at 
> org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:189)
>
>
> Would you please be able to help with a resolution.
>
> Thanks,
> Debu
>


Spark Streaming not working

2020-04-10 Thread Debabrata Ghosh
Hi,
I have a Spark Streaming application where Kafka is producing
records, but unfortunately Spark Streaming isn't able to consume them.

I am hitting the following error:

20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID 24)
java.lang.AssertionError: assertion failed: Failed to get records for
spark-executor-service-spark-ingestion dice-ingestion 11 0 after
polling for 12
at scala.Predef$.assert(Predef.scala:170)
at 
org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
at 
org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:223)
at 
org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:189)


Would you please be able to help with a resolution?

Thanks,
Debu


Re: Spark Streaming not working in YARN mode

2014-11-20 Thread Akhil Das
Cool
On 20 Nov 2014 22:01, kam lee cloudher...@gmail.com wrote:

 Yes, fixed by setting --executor-cores to 2 or higher.

 Thanks a lot! Really appreciate it!
 cloud

 On Wed, Nov 19, 2014 at 10:48 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

  Make sure the executor cores are set to a value which is >= 2 while
  submitting the job.

 Thanks
 Best Regards

 On Thu, Nov 20, 2014 at 10:36 AM, kam lee cloudher...@gmail.com wrote:

 I created a simple Spark Streaming program - it received numbers and
 computed averages and sent the results to Kafka.

 It worked perfectly in local mode as well as standalone master/slave
 mode across a two-node cluster.

 It did not work however in yarn-client or yarn-cluster mode.

 The job was accepted and running on a node but did not produce any
 outputs...

 Any suggestions?

 Thanks!
 cloud






Spark Streaming not working in YARN mode

2014-11-19 Thread kam lee
I created a simple Spark Streaming program - it received numbers and
computed averages and sent the results to Kafka.

It worked perfectly in local mode as well as in standalone master/slave mode
across a two-node cluster.

However, it did not work in yarn-client or yarn-cluster mode.

The job was accepted and running on a node but did not produce any
outputs...

Any suggestions?

Thanks!
cloud


Re: Spark Streaming not working in YARN mode

2014-11-19 Thread Akhil Das
Make sure the executor cores are set to a value which is >= 2 while
submitting the job.
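
For context: a receiver-based streaming job needs at least one core for the
receiver itself plus at least one for batch processing, so a single-core
executor (the YARN default) can accept the job yet never produce output. A
sketch of the usual ways to guarantee this, with illustrative values only:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Option 1: request the cores when submitting, e.g.
//   spark-submit --executor-cores 2 ...
// Option 2 (equivalent, spark-shell style): set it on the SparkConf.
val conf = new SparkConf()
  .setAppName("streaming-averages")
  .set("spark.executor.cores", "2")
val ssc = new StreamingContext(conf, Seconds(5))

// For a quick local test the master also needs at least two threads, e.g.
//   new SparkConf().setMaster("local[2]").setAppName("streaming-averages")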

Thanks
Best Regards

On Thu, Nov 20, 2014 at 10:36 AM, kam lee cloudher...@gmail.com wrote:

 I created a simple Spark Streaming program - it received numbers and
 computed averages and sent the results to Kafka.

 It worked perfectly in local mode as well as standalone master/slave mode
 across a two-node cluster.

 It did not work however in yarn-client or yarn-cluster mode.

 The job was accepted and running on a node but did not produce any
 outputs...

 Any suggestions?

 Thanks!
 cloud