The big question is how the log file needs to be parsed / formatted. I'd
be inclined to write a UDF that takes a line of text and returns a
tuple of the values you'd be storing in HBase.
Then you could do other operations on the bag of tuples that gets passed
back.
Alternatively, you …
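A UDF along these lines could be prototyped in Python (Pig can register Jython UDFs). This is only a sketch of the parsing logic such a UDF would wrap; the thread never specifies the log format, so the Apache common log format and the field names below are assumptions:

```python
import re

# Assumed format: Apache common log (an assumption -- the actual log
# format was never given in this thread), e.g.:
# 127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /a.gif HTTP/1.0" 200 2326
LOG_PATTERN = re.compile(
    r'^(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)$'
)

def parse_line(line):
    """Turn one log line into the tuple of values to store in HBase.

    Returns None for lines that do not match, so malformed lines
    can be filtered out of the relation.
    """
    m = LOG_PATTERN.match(line.strip())
    if m is None:
        return None
    return (m.group('ip'), m.group('user'), m.group('ts'),
            m.group('request'), int(m.group('status')), m.group('size'))
```

Wrapped with Pig's `@outputSchema` decorator and registered with `REGISTER ... USING jython`, the same function can be called with `FOREACH ... GENERATE FLATTEN(...)`, and the parsed relation stored with `org.apache.pig.backend.hadoop.hbase.HBaseStorage`.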
Could you please let us know how exactly you want to parse your logs?
Warm Regards,
Tariq
cloudfront.blogspot.com
On Wed, Feb 26, 2014 at 6:25 PM, David McNelis dmcne...@gmail.com wrote:
If you want to load logs into HBase, why not directly write a MapReduce
job? In Pig, you need to write your own customized load function. However,
if you write a MapReduce job, you can use the HBase API directly.
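Going the MapReduce route, a Java job would build HBase `Put` objects keyed by row. As a language-neutral sketch of that step, here is how each parsed log tuple could map onto HBase cells (row key plus `family:qualifier` → value); the row-key scheme and the `log` column family are assumptions, not something fixed by the thread:

```python
def tuple_to_cells(ip, user, ts, request, status, size):
    """Map one parsed log tuple onto HBase-style cells.

    Row key = ip + '_' + timestamp (an assumed scheme; choose one that
    fits your query patterns). Each cell is a
    (row_key, 'family:qualifier', value) triple, mirroring what a
    MapReduce job would add to a Put before writing it to the table.
    """
    row_key = '%s_%s' % (ip, ts)
    return [
        (row_key, 'log:user', user),
        (row_key, 'log:request', request),
        (row_key, 'log:status', str(status)),
        (row_key, 'log:size', str(size)),
    ]
```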
On Wed, Feb 26, 2014 at 2:15 PM, Mohammad Tariq donta...@gmail.com wrote:
Hi,
I have a log file in HDFS which needs to be parsed and put into an HBase table.
I want to do this using Pig.
How can I go about it? The Pig script should parse the logs and then put them into HBase.
Regards,
Chhaya Vishwakarma