Hi all,
I am new to Spark and seem to have hit a common newbie obstacle.
I have a pretty simple setup, but I am unable to get past this error
when executing a job:
TaskSchedulerImpl: Initial job has not accepted any resources; check your
cluster UI to ensure that workers are registered
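For reference, this is roughly what my setup looks like; the master URL and
the resource values are stand-ins for my real ones, not a claimed fix:

import org.apache.spark.{SparkConf, SparkContext}

object SimpleJob {
  def main(args: Array[String]): Unit = {
    // Master URL and resource settings are placeholders for my real values.
    val conf = new SparkConf()
      .setAppName("SimpleJob")
      .setMaster("spark://master-host:7077")
      // Keeping the requested resources below what the workers advertise;
      // asking for more memory or cores than the cluster offers is one
      // known cause of the "Initial job has not accepted any resources"
      // message.
      .set("spark.executor.memory", "512m")
      .set("spark.cores.max", "2")

    val sc = new SparkContext(conf)
    // Trivial action just to force a job to run.
    val count = sc.parallelize(1 to 1000).count()
    println(s"count = $count")
    sc.stop()
  }
}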
Hi,
I plan to have logstash send log events (as key-value pairs) to Spark
Streaming, using Spark on Cassandra.
Being completely fresh to Spark, I have a couple of questions:
- is that a good idea at all, or would it be better to put e.g. Kafka in
between to handle traffic peaks? (a rough sketch of the Kafka variant
follows below)
(IOW: how and
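To make the question concrete, here is the rough Kafka-in-between sketch I
have in mind. It assumes the spark-streaming-kafka receiver and the DataStax
spark-cassandra-connector; all hosts, topic, keyspace, and table names are
placeholders:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils
import com.datastax.spark.connector._
import com.datastax.spark.connector.streaming._

object LogIngest {
  def main(args: Array[String]): Unit = {
    // Hosts, topic, keyspace, and table names below are placeholders.
    val conf = new SparkConf()
      .setAppName("LogIngest")
      .set("spark.cassandra.connection.host", "cassandra-host")

    val ssc = new StreamingContext(conf, Seconds(5))

    // logstash would publish events to Kafka; Spark Streaming consumes
    // them as (key, value) string pairs, so Kafka absorbs traffic peaks.
    val events = KafkaUtils.createStream(
      ssc, "zk-host:2181", "log-consumers", Map("logs" -> 1))

    // Write each micro-batch into a Cassandra table with matching columns.
    events.saveToCassandra("logs_ks", "events", SomeColumns("key", "value"))

    ssc.start()
    ssc.awaitTermination()
  }
}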