Thanks for your reply! In my use case, the data would stream from only one stdin.
Also, I'm working with Scala.
It would be great if you could cover the multi-stdin case as well! Thanks.
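
For the single-stdin case, here's a rough sketch of the kind of custom receiver I have in mind (untested; CommandReceiver and the command path are my own placeholder names, not existing Spark API). It launches the utility on whichever worker hosts the receiver and stores each stdout line as one record:

import java.io.{BufferedReader, InputStreamReader}

import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Sketch only: launches the command-line utility on the worker that
// hosts the receiver and stores each line of its stdout as one record.
class CommandReceiver(command: Seq[String])
    extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  def onStart(): Unit = {
    // Read on a background thread so onStart() returns immediately,
    // as the Receiver contract requires.
    new Thread("command-receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  def onStop(): Unit = {}  // the loop below exits once isStopped() is true

  private def receive(): Unit = {
    val proc = new ProcessBuilder(command: _*).start()
    val reader = new BufferedReader(new InputStreamReader(proc.getInputStream))
    try {
      var line = reader.readLine()
      while (!isStopped() && line != null) {
        store(line)  // push one record into the DStream
        line = reader.readLine()
      }
    } finally {
      reader.close()
      proc.destroy()
    }
  }
}

Wiring it up would then be something like val lines = ssc.receiverStream(new CommandReceiver(Seq("/path/to/utility"))), where the path is of course a placeholder.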

From: Tathagata Das <t...@databricks.com>
Date: Thursday, June 11, 2015 at 8:11 PM
To: Heath Guo <heath...@fb.com>
Cc: user <user@spark.apache.org>
Subject: Re: Spark Streaming reads from stdin or output from command line utility

Are you going to receive data from one stdin on one machine, or many stdins
on many machines?


On Thu, Jun 11, 2015 at 7:25 PM, foobar <heath...@fb.com> wrote:
Hi, I'm new to Spark Streaming, and I want to create an application where
Spark Streaming creates a DStream from stdin. Basically, I have a command-line
utility that generates stream data, and I'd like to pipe that data into a
DStream. What's the best way to do that? I thought rdd.pipe() could help,
but it seems that requires an RDD in the first place, which does not apply here.
Thanks!
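
One workaround I'm considering, to avoid writing a custom receiver, is piping the utility's output to a TCP socket and reading it with the built-in socketTextStream. A minimal sketch, assuming netcat forwards the stream on port 9999 (the utility name, host, and port are all placeholders):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StdinPipeExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("StdinPipeExample").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Assumes the utility's stdout is forwarded to a TCP socket, e.g.
    // by running "my_utility | nc -lk 9999" on this host (hypothetical
    // utility name).
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}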


