Hi,
I’m running into a new problem trying to get streaming going. I have a test
class that sets up my pipeline and runs it fine. The actual production
implementation sets up the pipeline from within an actor. At first, I ran into
a bunch of issues relating to the serialization of closures.
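
As a purely hypothetical illustration (not from the original post) of how
building a pipeline inside an actor can trip up closure serialization:

import akka.actor.Actor
import org.apache.spark.streaming.dstream.DStream

// Hypothetical illustration only. Referencing an actor field inside a Spark
// closure captures `this` (the whole Actor), which is not serializable, so
// the task closure fails to serialize once the job runs.
class PipelineActor(stream: DStream[String]) extends Actor {
  val prefix = "event:"   // field on the actor

  def receive = {
    case "build" =>
      // captures `this` because `prefix` is a field -> NotSerializableException
      stream.map(line => prefix + line).print()

      // copying the field to a local val keeps the actor out of the closure
      val p = prefix
      stream.map(line => p + line).print()
  }
}
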
bump. any ideas?
On Jul 24, 2014, at 3:09 AM, Alan Ngai wrote:
> It looks like when you configure SparkConf to use the KryoSerializer in
> combination with an ActorReceiver, bad things happen. I modified the
> ActorWordCount example program from
>
> val sparkConf = new SparkConf().setAppName("ActorWordCount")
It looks like when you configure SparkConf to use the KryoSerializer in
combination with an ActorReceiver, bad things happen. I modified the
ActorWordCount example program from
val sparkConf = new SparkConf().setAppName("ActorWordCount")
to
val sparkConf = new SparkConf()
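
The modified line is presumably continued with the Kryo setting; a minimal
sketch of that kind of change (the exact chained setters are an assumption,
spark.serializer is the standard property):

import org.apache.spark.SparkConf

// Sketch only: assumes the change enables Kryo via the standard
// spark.serializer property on the example's SparkConf.
val sparkConf = new SparkConf()
  .setAppName("ActorWordCount")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
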
as minor? It
seems that anyone who uses the windowing functionality would run into this bug.
I imagine this would include anyone who wants to use Spark Streaming to
aggregate data in fixed-time batches, which seems like a fairly common use case.
Alan
On Jul 22, 2014, at 11:30 PM, Alan Ngai wrote:
> It could be related to this bug that is currently open.
> https://issues.apache.org/jira/browse/SPARK-1312
>
> Here is a workaround. Can you add an inputStream.foreachRDD(rdd => { }) and
> try these combos again?
>
> TD
>
>
> On Tue, Jul 22, 2014 at 6:01 PM, Alan Ngai wrote:
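
A sketch of the suggested workaround, with inputStream standing in for
whatever DStream the window operations are applied to (the name is a
placeholder):

// foreachRDD registers a no-op output operation, which forces every batch of
// the input DStream to be generated even when only the windowed stream is
// otherwise consumed. `inputStream` is a placeholder for the DStream that
// the window is applied to.
inputStream.foreachRDD(rdd => { })

// The windowed aggregation itself stays unchanged, e.g.:
// inputStream.window(Seconds(25), Seconds(20)).count().print()
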
I have a sample application pumping out records at 1 per second. The batch
interval is set to 5 seconds. Here’s a list of “observed window intervals” vs
what was actually set:
window=25, slide=25 : observed-window=25, overlapped-batches=0
window=25, slide=20 : observed-window=20, overlapped-batche
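
For reference, a minimal sketch of the kind of setup being described (the app
name and input source are placeholders; batch interval 5 seconds, with the
window/slide values from the first line above):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Sketch of the reported setup: records arrive about once per second, the
// batch interval is 5 seconds, and a windowed count is printed each slide.
val conf = new SparkConf().setAppName("WindowTest")     // placeholder app name
val ssc = new StreamingContext(conf, Seconds(5))        // batch interval = 5s

val records = ssc.socketTextStream("localhost", 9999)   // placeholder source
records.window(Seconds(25), Seconds(20))                // window=25s, slide=20s
  .count()
  .print()

ssc.start()
ssc.awaitTermination()
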