If local[2] is expected, then the streaming doc is actually misleading, as the given example is:
import org.apache.spark.api.java.function._
import org.apache.spark.streaming._
import org.apache.spark.streaming.api._
// Create a StreamingContext with a local master
val ssc = new StreamingContext
Yeah - Spark streaming needs at least two threads to run. I actually
thought we warned the user if they only use one (@tdas?) but the
warning might not be working correctly - or I'm misremembering.
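To make the two-thread requirement concrete, here is a minimal sketch (plain Scala, not actual Spark internals) of how a local master URL maps to a worker-thread count, and why a receiver-based stream started with plain "local" never gets to process any batches. The object and method names are illustrative assumptions, not Spark APIs:

```scala
// Sketch: "local" gives 1 thread, "local[n]" gives n threads.
// A receiver (e.g. the Kafka receiver) permanently occupies one
// thread, so at least one more is needed to process batches.
object LocalMaster {
  private val LocalN = """local\[(\d+)\]""".r

  // Hypothetical helper: thread count implied by a local master URL
  def threadCount(master: String): Int = master match {
    case "local"   => 1
    case LocalN(n) => n.toInt
    case _ => throw new IllegalArgumentException(s"not a local master: $master")
  }

  // With fewer than 2 threads, the receiver starves the processing side
  def canProcess(master: String): Boolean = threadCount(master) >= 2
}
```

Under this convention, "local" leaves zero threads for batch processing once the receiver is running, which matches the symptom of data flowing in but output stalling, while "local[2]" leaves one thread free.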
On Fri, May 30, 2014 at 6:38 AM, Sean Owen wrote:
Thanks Nan, that does appear to fix it. I was using "local". Can
anyone say whether that's to be expected or whether it could be a bug
somewhere?
On Fri, May 30, 2014 at 2:42 PM, Nan Zhu wrote:
Hi, Sean
I ran into the same problem,
but when I changed MASTER="local" to MASTER="local[2]"
everything went back to normal.
Haven't had a chance to ask here yet.
Best,
Best,
--
Nan Zhu
On Friday, May 30, 2014 at 9:09 AM, Sean Owen wrote:
Guys, I'm struggling to debug some strange behavior in a simple
Streaming + Java + Kafka example -- in fact, a simplified version of
JavaKafkaWordCount, that is just calling print() on a sequence of
messages.
Data is flowing, but it only appears to work for a few periods --
sometimes 0 -- before ce