I am trying to connect Spark Streaming to Flume in pull mode. I have three machines, named master, slave1, and slave2, and each one runs Spark and a Flume agent at the same time. I have configured the Flume sink to bind to slave1 on port 6689, and `telnet slave1 6689` from the other two machines works fine.
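For reference, this is roughly how the sink is configured; a sketch only, since the agent and channel names (`a1`, `c1`, `spark`) are placeholders, not my actual config. Pull mode requires the `SparkSink` class from the `spark-streaming-flume-sink` artifact on the Flume classpath:

```properties
# Hypothetical Flume agent config for the pull-mode sink
# (agent/channel/sink names are placeholders)
a1.sinks.spark.type = org.apache.spark.streaming.flume.sink.SparkSink
a1.sinks.spark.hostname = slave1
a1.sinks.spark.port = 6689
a1.sinks.spark.channel = c1
```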
In my code I call `FlumeUtils.createStream(ssc, "slave1", 6689)` and submit the job on the master machine with `--master local[2]`. It then throws:

    ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 -
    org.jboss.netty.channel.ChannelException: Failed to bind to: slave1/10.25.*.*:6689
    Caused by: java.net.BindException: Cannot assign requested address

I have read this thread https://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/Spark-Streaming-Fails-on-Cluster-mode-Flume-as-source/m-p/25577/highlight/true#M621 , but I am sure no other process is using that port, since `netstat -anp | grep 6689` shows nothing. Also, the IP address 10.25.*.* is not a routable address. Can someone help me solve this?

Regards,
Junfeng Chen
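For completeness, here is a minimal Scala sketch of the driver as described above; the app name and batch interval are placeholders I made up, and `createStream` is the call from my actual code:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object FlumeTest {
  def main(args: Array[String]): Unit = {
    // Submitted with: spark-submit --master local[2] ...
    val conf = new SparkConf().setAppName("FlumeTest")
    val ssc = new StreamingContext(conf, Seconds(10)) // batch interval is a placeholder

    // This starts a receiver that tries to bind a listening socket on
    // the given host/port -- this is the call that fails with the
    // BindException above.
    val stream = FlumeUtils.createStream(ssc, "slave1", 6689)
    stream.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```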