Re: Issue with Spark Streaming UI

2016-05-16 Thread Mich Talebzadeh
Have you checked the Streaming tab in the Spark GUI? HTH Dr Mich Talebzadeh LinkedIn * https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw *

Issue with Spark Streaming UI

2016-05-14 Thread Sachin Janani
Hi, I'm trying to run a simple Spark Streaming application with file streaming, and it's working properly, but when I try to monitor the number of events in the Streaming UI it shows 0. Is this an issue, and are there any plans to fix it? Regards, SJ

broadcast variable and accumulators issue while spark streaming checkpoint recovery

2015-07-29 Thread Shushant Arora
Hi, I am using Spark Streaming 1.3 with checkpointing, but the job is failing to recover from the checkpoint on restart. For the broadcast variable it says: 1. WARN TaskSetManager: Lost task 15.0 in stage 7.0 (TID 1269, hostIP): java.lang.ClassCastException: [B cannot be cast to

Re: broadcast variable and accumulators issue while spark streaming checkpoint recovery

2015-07-29 Thread Tathagata Das
Rather than using the accumulator directly, what you can do is something like this to lazily create an accumulator and use it (it will get lazily recreated if the driver restarts from a checkpoint): dstream.transform { rdd => val accum = SingletonObject.getOrCreateAccumulator() // singleton object method to
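A minimal sketch of the singleton pattern described above, in plain Java so it is self-contained. The accumulator is modeled here as an `AtomicLong`; in a real Spark job the commented line would create a Spark accumulator instead, and the class and method names (`AccumulatorHolder`, `getOrCreate`) are illustrative, not from the original post:

```java
import java.util.concurrent.atomic.AtomicLong;

// Holds the "accumulator" in a static field so it is re-created lazily
// after a driver restart, rather than being captured in the checkpoint.
class AccumulatorHolder {
    private static volatile AtomicLong instance = null;

    static AtomicLong getOrCreate() {
        if (instance == null) {
            synchronized (AccumulatorHolder.class) {
                if (instance == null) {
                    // In a real Spark job this would be something like:
                    // instance = sparkContext.accumulator(0L);
                    instance = new AtomicLong(0L);
                }
            }
        }
        return instance;
    }
}

public class LazyAccumulatorDemo {
    public static void main(String[] args) {
        // Inside dstream.transform { rdd => ... } you would fetch it the same way:
        AccumulatorHolder.getOrCreate().addAndGet(5);
        AccumulatorHolder.getOrCreate().addAndGet(3);
        System.out.println(AccumulatorHolder.getOrCreate().get()); // prints 8
    }
}
```

Because the accumulator lives in a static field and is fetched by `getOrCreate()` on every batch, a restarted driver simply builds a fresh one instead of trying to deserialize the old instance from the checkpoint.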

Re: broadcast variable and accumulators issue while spark streaming checkpoint recovery

2015-07-29 Thread Shushant Arora
1. How do I do this in Java? 2. Can broadcast objects also be created the same way after checkpointing? 3. Is it safe if I disable checkpointing, write offsets at the end of each batch to HDFS in my code, and somehow specify in my job to use those offsets for creating the Kafka stream the first time? How can I

Re: broadcast variable and accumulators issue while spark streaming checkpoint recovery

2015-07-29 Thread Tathagata Das
1. Same way, using static fields in a class. 2. Yes, same way. 3. Yes, you can do that. To differentiate first run vs. continuation, you have to build your own semantics. For example, if the location in HDFS where you are supposed to store the offsets does not have any data, that means it's probably
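A hedged sketch of point 3: persisting the last processed offset yourself and treating "nothing stored at the expected location" as a first run. For brevity this uses the local filesystem via `java.nio.file`; a real job would use the Hadoop `FileSystem` API against HDFS, and the path layout and plain-text format are assumptions for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class OffsetStore {
    private final Path offsetFile;

    public OffsetStore(String location) {
        this.offsetFile = Paths.get(location);
    }

    // First run vs. continuation: if no offsets were ever stored,
    // this is a fresh start and the stream begins from the default position.
    public boolean isFirstRun() {
        return !Files.exists(offsetFile);
    }

    // Called at the end of each batch with the highest offset processed.
    public void saveOffset(long offset) throws IOException {
        Files.write(offsetFile, Long.toString(offset).getBytes());
    }

    // Used on restart to resume the Kafka stream from where we left off.
    public long loadOffset() throws IOException {
        return Long.parseLong(new String(Files.readAllBytes(offsetFile)).trim());
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempDirectory("offsets");
        OffsetStore store = new OffsetStore(tmp.resolve("topic-0").toString());
        System.out.println(store.isFirstRun());  // true: no offsets stored yet
        store.saveOffset(42L);
        System.out.println(store.isFirstRun());  // false: resume from stored offset
        System.out.println(store.loadOffset());  // 42
    }
}
```

On restart, the job would check `isFirstRun()` before building the Kafka stream, and pass the loaded offset when resuming; note this only covers driver-side bookkeeping, not exactly-once delivery.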

Checkpoint issue in spark streaming

2015-07-28 Thread Sadaf
? Thanks :)

Re: Some Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2015-01-01 Thread zgm
I am also seeing this error in a YARN Spark Streaming (1.2.0) application. Tim Smith wrote: Similar issue (Spark 1.0.0). Streaming app runs for a few seconds before these errors start to pop up all over the driver logs: 14/09/12 17:30:23 WARN TaskSetManager: Loss was due to java.lang.Exception

Re: Some Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2015-01-01 Thread Sean Owen
used for further processing. The custom Receiver works fine under normal load with no issues. But when I tested it with a huge backlog of messages from Kafka (50 million+ messages), I see a couple of major issues in Spark Streaming. Wanted to get some opinion on this. I am using

Issue in Spark Streaming

2014-11-04 Thread Suman S Patil
I am trying to run the Spark Streaming program as given in the Spark Streaming programming guide (https://spark.apache.org/docs/latest/streaming-programming-guide.html) in the interactive shell. I am getting an error as shown here (file:///C:\Users\10609685\Desktop\stream-spark.png) as an

Re: Issue in Spark Streaming

2014-11-04 Thread Akhil Das
Which error are you referring to here? Can you paste the error logs? Thanks Best Regards On Wed, Nov 5, 2014 at 11:04 AM, Suman S Patil suman.pa...@lntinfotech.com wrote: I am trying to run the Spark Streaming program as given in the Spark Streaming programming guide

Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2014-09-18 Thread Rafeeq S
Hi, I am testing a Kafka-Spark Streaming application which throws the error below after a few seconds; the configuration below is used for the Spark Streaming test environment. kafka version - 0.8.1 spark version - 1.0.1 SPARK_MASTER_MEMORY=1G SPARK_DRIVER_MEMORY=1G SPARK_WORKER_INSTANCES=1

RE: Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2014-09-18 Thread Shao, Saisai
Subject: Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed.. Hi, I am testing a Kafka-Spark Streaming application which throws the error below after a few seconds; the configuration below is used for the Spark Streaming test environment. kafka version - 0.8.1 spark version - 1.0.1

Re: Some Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2014-09-12 Thread Dibyendu Bhattacharya
Union stream which is used for further processing. The custom Receiver works fine under normal load with no issues. But when I tested it with a huge backlog of messages from Kafka (50 million+ messages), I see a couple of major issues in Spark Streaming. Wanted to get some opinion

Re: Some Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2014-09-12 Thread Jeoffrey Lim
working fine under normal load with no issues. But when I tested it with a huge backlog of messages from Kafka (50 million+ messages), I see a couple of major issues in Spark Streaming. Wanted to get some opinion on this. I am using the latest Spark 1.1, taken from source and built. Running

Re: Some Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2014-09-12 Thread Dibyendu Bhattacharya
in normal load with no issues. But when I tested it with a huge backlog of messages from Kafka (50 million+ messages), I see a couple of major issues in Spark Streaming. Wanted to get some opinion on this. I am using the latest Spark 1.1, taken from source and built. Running

Re: Some Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2014-09-12 Thread Tim Smith
Similar issue (Spark 1.0.0). Streaming app runs for a few seconds before these errors start to pop up all over the driver logs: 14/09/12 17:30:23 WARN TaskSetManager: Loss was due to java.lang.Exception java.lang.Exception: Could not compute split, block input-4-1410542878200 not found

Re: Some Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2014-09-11 Thread Nan Zhu
+ messages), I see a couple of major issues in Spark Streaming. Wanted to get some opinion on this. I am using the latest Spark 1.1, taken from source and built. Running in Amazon EMR, a 3-node m1.xlarge Spark cluster in standalone mode. Below are the two main questions I have: 1. What I

Re: Some Serious Issue with Spark Streaming ? Blocks Getting Removed and Jobs have Failed..

2014-09-11 Thread Nan Zhu
Receiver works fine under normal load with no issues. But when I tested it with a huge backlog of messages from Kafka (50 million+ messages), I see a couple of major issues in Spark Streaming. Wanted to get some opinion on this. I am using the latest Spark 1.1, taken from source

Issue during Spark streaming with ZeroMQ source

2014-04-29 Thread Francis . Hu
Hi all, I installed spark-0.9.1 and zeromq 4.0.1, and then ran the examples below: ./bin/run-example org.apache.spark.streaming.examples.SimpleZeroMQPublisher tcp://127.0.1.1:1234 foo.bar ./bin/run-example org.apache.spark.streaming.examples.ZeroMQWordCount local[2] tcp://127.0.1.1:1234 foo

Re: Issue during Spark streaming with ZeroMQ source

2014-04-29 Thread Prashant Sharma
Unfortunately zeromq 4.0.1 is not supported. https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/streaming/examples/ZeroMQWordCount.scala#L63 says which version is required. You will need that version of zeromq to see it work. Basically I have seen it working nicely with

Re: Re: Issue during Spark streaming with ZeroMQ source

2014-04-29 Thread Prashant Sharma
with a newer zeromq that would be better for us. Francis. *From:* Prashant Sharma [mailto:scrapco...@gmail.com] *Sent:* Tuesday, April 29, 2014 15:53 *To:* user@spark.apache.org *Subject:* Re: Issue during Spark streaming with ZeroMQ source Unfortunately zeromq 4.0.1 is not supported. https