Hi scorpio,

Thanks for your reply.
I don't understand your approach. Is it possible to receive data from
different clients through the same port in Spark?

I may well be confused, and I'd appreciate your opinion.

Regarding the word count example from the Spark Streaming documentation, Spark
acts as a client that connects to a remote server in order to receive data:

// Create a DStream that will connect to hostname:port, like localhost:9999
JavaReceiverInputDStream<String> lines =
    jssc.socketTextStream("localhost", 9999);

Then, you create a dummy server using nc to accept the connection request from
Spark and to send data:

nc -lk 9999

So, in this implementation, since Spark plays the role of a TCP client,
you'd need to manage the join of the external sensor streams (which, by the
way, all share the same schema) in your own server.
How would you make Spark act as a "sink" that can receive streams from
different sources through the same port?
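To make the "manage the join in your own server" idea concrete, here is a minimal plain-Java sketch (not Spark API; the port 9999, the client count, and the class name MergingServer are all placeholders I chose for illustration). It shows the intermediate-server pattern: several sensor clients connect to one TCP port, and their lines are merged into a single stream. A real implementation would forward the merged lines to the socket that Spark's socketTextStream connects to, instead of keeping them in a list:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Hypothetical intermediate server: accepts several sensor clients on ONE
// port and merges their lines into a single in-memory stream.
public class MergingServer {
    // Merged lines from all connected sensors (thread-safe for the demo).
    public static final List<String> merged = new CopyOnWriteArrayList<>();

    public static void serve(int port, int expectedClients) throws IOException {
        try (ServerSocket server = new ServerSocket(port)) {
            for (int i = 0; i < expectedClients; i++) {
                Socket client = server.accept();   // one connection per sensor
                Thread reader = new Thread(() -> {
                    try (BufferedReader in = new BufferedReader(
                            new InputStreamReader(client.getInputStream()))) {
                        String line;
                        while ((line = in.readLine()) != null) {
                            merged.add(line);      // all sensors feed one sink
                        }
                    } catch (IOException ignored) {
                        // client disconnected; nothing to do in this sketch
                    }
                });
                reader.setDaemon(true);
                reader.start();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        serve(9999, 2);   // port and client count are arbitrary for the demo
    }
}
```

With a server like this in front, Spark stays a plain TCP client on a single port, and the fan-in from the different sensors happens before Spark ever sees the data.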

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Join-streams-Apache-Spark-tp28603p28670.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
