Hi, could you please guide me on how to collect logs from my remote system? My requirements are as follows:
I have a single-node Hadoop 1.2.1 cluster and Flume 1.4.1 running on an Ubuntu system. My requirement is to collect the logs of a particular application running on my remote machine (Windows). For this I have written a .conf file as follows:

    # Name the components on this agent
    a2.sources = r1
    a2.sinks = k1
    a2.channels = c1

    # Describe/configure the source
    a2.sources.r1.type = avro
    # should this be the remote machine's IP address? (is it correct?)
    a2.sources.r1.bind = localhost
    # port number
    a2.sources.r1.port = 4444

    # Describe the sink
    a2.sinks.k1.type = hdfs
    a2.sinks.k1.hdfs.path = hdfs://localhost:54310/exec

    # Use a channel which buffers events in memory
    a2.channels.c1.type = memory
    a2.channels.c1.capacity = 1000
    a2.channels.c1.transactionCapacity = 100

    # Bind the source and sink to the channel
    a2.sources.r1.channels = c1
    a2.sinks.k1.channel = c1

Apart from that, what else needs to be taken care of here? For example, where do I declare the client agent and the Flume agent? Do I need to write any Java program additionally (I am not good at Java programming)?

I would be very thankful for any help.

--
Thanks & regards,
Nagarjuna.S
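For reference, here is a rough sketch of what I imagine the client-side agent on the Windows machine might look like, assuming a spooling-directory source watching the application's log folder and an Avro sink pointing at the Avro source of the collector agent on the Ubuntu box. The agent name, host name, and directory below are just placeholders I made up, so please correct me if this is not the right approach:

    # Client-side agent on the Windows machine (sketch; names and paths are placeholders)
    a1.sources = s1
    a1.sinks = k1
    a1.channels = c1

    # Watch a directory where the application writes its log files
    a1.sources.s1.type = spooldir
    a1.sources.s1.spoolDir = C:/app/logs

    # Send events to the Avro source of the collector agent (a2) on the Ubuntu box
    a1.sinks.k1.type = avro
    a1.sinks.k1.hostname = ubuntu-collector-host
    a1.sinks.k1.port = 4444

    # Simple in-memory channel
    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 1000

    # Bind the source and sink to the channel
    a1.sources.s1.channels = c1
    a1.sinks.k1.channel = c1

I assume each agent would then be started with the flume-ng command, something like: flume-ng agent --conf conf --conf-file a2.conf --name a2 -Dflume.root.logger=INFO,console on the collector side, but I am not sure whether that removes the need to write any Java code.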
