Great, thanks guys, that helped a lot and I've got a sample working.
As a follow-up, when do workers/masters become a necessity?
--
Hi,
Deploying Spark with Flume is pretty simple. What you'd need to do is:
1. Start your Spark Flume DStream receiver on some machine using one of
the FlumeUtils.createStream methods, where you need to specify the
hostname and port of the worker node on which you want the Spark
executor to run.
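For reference, a minimal sketch of that first step, assuming Spark Streaming with the spark-streaming-flume artifact on the classpath; the hostname "worker-host" and port 4141 are placeholders for your worker node's address:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object FlumeReceiverSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("FlumeReceiverSketch")
    val ssc = new StreamingContext(conf, Seconds(10))

    // The receiver listens on this host:port; Flume's avro sink must point at
    // the same address. Replace with your worker node's hostname and a free port.
    val events = FlumeUtils.createStream(ssc, "worker-host", 4141)

    // Each record is a SparkFlumeEvent wrapping the original Flume event body.
    events.map(e => new String(e.event.getBody.array())).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

The executor running the receiver has to be reachable on that host:port from the Flume agent, so the address you pass to createStream is the same one you put in the Flume sink configuration.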
Hari, can you help?
TD
On Tue, Jul 29, 2014 at 12:13 PM, dapooley wrote:
> Hi,
>
> I am trying to integrate Spark onto a Flume log sink and avro source. The
> sink is on one machine (the application), and the source is on another. Log
> events are being sent from the application server to the avro source.
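On the Flume side, that setup would mean the agent on the application machine pushing events to whatever address the Spark receiver listens on. A minimal sketch of such an agent config, assuming a hypothetical agent name "agent1", a tailed log file as the source, and the same worker-host:4141 address used in the createStream example above:

agent1.sources = appLogs
agent1.channels = mem
agent1.sinks = toSpark

# Hypothetical source: tail the application's log file.
agent1.sources.appLogs.type = exec
agent1.sources.appLogs.command = tail -F /var/log/app/app.log
agent1.sources.appLogs.channels = mem

agent1.channels.mem.type = memory
agent1.channels.mem.capacity = 10000

# Avro sink pointing at the Spark Flume receiver's hostname and port.
agent1.sinks.toSpark.type = avro
agent1.sinks.toSpark.hostname = worker-host
agent1.sinks.toSpark.port = 4141
agent1.sinks.toSpark.channel = mem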