I would suggest that you don't pipe the sensor data into HDFS directly.
Instead, you can run a program (in Java, Python, etc.) on the server itself
to process the incoming sensor data and write it to a text/binary file (I
don't know the data format you are currently receiving). You can then put
that data file on HDFS, or alternatively process the data directly and save
it to your HBase tables managed on HDFS.
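A minimal sketch of that receive-and-stage approach in Java, assuming
line-oriented text data; the port 9999, the local path /tmp/sensor.txt and
the HDFS path /data/sensors/ below are placeholders, not anything from your
setup:

    import java.io.*;
    import java.net.*;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class SensorReceiver {
        public static void main(String[] args) throws Exception {
            // accept one sensor connection and stage its stream locally;
            // a real server would loop and use a thread per connection
            try (ServerSocket server = new ServerSocket(9999);
                 Socket sensor = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(sensor.getInputStream()));
                 PrintWriter out = new PrintWriter(
                         new FileWriter("/tmp/sensor.txt"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    out.println(line);
                }
            }
            // once the local file is complete, copy it into HDFS
            // (or issue HBase Puts here instead of writing a file)
            FileSystem fs = FileSystem.get(new Configuration());
            fs.copyFromLocalFile(new Path("/tmp/sensor.txt"),
                                 new Path("/data/sensors/sensor.txt"));
        }
    }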

If your sensor data is log data, then you can use Flume to load that data
into HDFS directly.
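For that case, a minimal Flume NG agent configuration along these lines
would do it (the agent name a1, the port 5140 and the HDFS path are
placeholder values, not from your setup):

    # one source, one memory channel, one HDFS sink
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    # listen for the sensors' TCP stream directly
    a1.sources.r1.type = syslogtcp
    a1.sources.r1.host = 0.0.0.0
    a1.sources.r1.port = 5140
    a1.sources.r1.channels = c1

    # buffer events in memory between source and sink
    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 10000

    # write events into HDFS as plain text
    a1.sinks.k1.type = hdfs
    a1.sinks.k1.channel = c1
    a1.sinks.k1.hdfs.path = /data/sensors
    a1.sinks.k1.hdfs.fileType = DataStream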

Thanks

::::::::::::::::::::::::::::::::::::::::
Raj K Singh
http://in.linkedin.com/in/rajkrrsingh
http://www.rajkrrsingh.blogspot.com
Mobile  Tel: +91 (0)9899821370


On Wed, May 7, 2014 at 8:18 AM, Alex Lee <eliy...@hotmail.com> wrote:

> Sensors may send TCP/IP data to the server. Each sensor may send TCP/IP
> data like a stream to the server; the quantity of sensors and the data
> rate are high.
>
> Firstly, how can the data from TCP/IP be put into Hadoop? It needs some
> processing and then storage in HBase. Does it have to be saved to data
> files and then put into Hadoop, or can it be done in some direct way from
> TCP/IP? Is there any software module that can take care of this? Searching
> suggested that Ganglia, Nagios and Flume may do it, but when looking into
> the details, Ganglia and Nagios are more for monitoring the Hadoop cluster
> itself, and Flume is for log files.
>
> Secondly, if the total network traffic from the sensors is over the limit
> of one LAN port, how can the load be shared? Is there any component in
> Hadoop to make this happen automatically?
>
> Any suggestions, thanks.
>
