Hi, I am learning Spark Streaming and am trying out the JavaNetworkCount example.
#1 - This is the code I wrote:

    JavaStreamingContext sctx = new JavaStreamingContext("local", appName, new Duration(5000));
    JavaReceiverInputDStream<String> lines = sctx.socketTextStream("127.0.0.1", 9999);
    JavaDStream<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
        @Override
        public Iterable<String> call(String arg0) throws Exception {
            System.out.println("Print text:" + arg0);
            return Arrays.asList(arg0.split(" "));
        }
    });

#2 - This is the socket server code I am using:

    import java.io.BufferedReader;
    import java.io.DataOutputStream;
    import java.io.InputStreamReader;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class TestTcpServer {
        public static void main(String argv[]) throws Exception {
            ServerSocket welcomeSocket = new ServerSocket(9999);
            int i = 0;
            while (true) {
                Socket connectionSocket = welcomeSocket.accept();
                BufferedReader inFromClient = new BufferedReader(
                        new InputStreamReader(connectionSocket.getInputStream()));
                DataOutputStream outToClient = new DataOutputStream(connectionSocket.getOutputStream());
                while (true) {
                    String sendingStr = "Sending... data... " + i;
                    outToClient.writeBytes(sendingStr);
                    System.out.println(sendingStr);
                    i++;
                    Thread.sleep(3000);
                }
            }
        }
    }

What I am trying to do is to get the JavaNetworkCount code in #1 to print all the text it receives, but so far I have failed to achieve that. I have been using Hercules Setup <http://www.hw-group.com/products/hercules/details_en.html> to simulate a TCP server, as well as the simple ServerSocket program in #2, but I am not seeing any text printed on the console.

Is public Iterable<String> call(String arg0) throws Exception being called every 5 seconds?

The console log is at http://pastebin.com/THzdzGhg
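For reference, here is a minimal, self-contained sketch of the complete program I think I should be running, modelled on the JavaNetworkWordCount example. The class name TestSocketStream and the app name are placeholders I made up, and the "local[2]" master is an assumption on my part (my understanding is that the socket receiver ties up one local thread, so a second one is needed to process the batches):

    import java.util.Arrays;

    import org.apache.spark.api.java.function.FlatMapFunction;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class TestSocketStream {
        public static void main(String[] args) throws Exception {
            // "local[2]": the socket receiver occupies one thread, so at least
            // one more local thread is needed to process the batches (assumption).
            JavaStreamingContext sctx =
                    new JavaStreamingContext("local[2]", "TestSocketStream", new Duration(5000));

            JavaReceiverInputDStream<String> lines = sctx.socketTextStream("127.0.0.1", 9999);

            JavaDStream<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
                @Override
                public Iterable<String> call(String line) throws Exception {
                    System.out.println("Print text:" + line);
                    return Arrays.asList(line.split(" "));
                }
            });

            // Without an output operation nothing forces the batches to be computed,
            // so print the first elements of every 5-second batch.
            words.print();

            // The receiver and the processing only begin after start().
            sctx.start();
            sctx.awaitTermination();
        }
    }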
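On the server side in #2, I also suspect the messages need to be newline-terminated, since socketTextStream appears to treat the incoming bytes as newline-delimited text. This is only a guess, but this is the change I plan to try:

    // Assumption: terminate each message with "\n" so socketTextStream can
    // split the byte stream into lines, and flush so it is sent immediately.
    outToClient.writeBytes(sendingStr + "\n");
    outToClient.flush();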