Re: Bulk load in hbase using pig

2014-02-26 Thread David McNelis
The big question is how the log file needs to be parsed / formatted. I'd be inclined to write a UDF that would take a line of text and return a tuple of the values you'd be storing in HBase. Then you could do other operations on the bag of tuples that gets passed back. Alternatively, you
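A minimal sketch of the kind of UDF described here, assuming a simple whitespace-delimited log line with three fields (the class name ParseLogLine and the field layout are illustrative, not from the thread):

import java.io.IOException;
import java.util.Arrays;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;
import org.apache.pig.data.TupleFactory;

// Takes one line of log text and returns a tuple of the values to store in HBase.
public class ParseLogLine extends EvalFunc<Tuple> {
    private static final TupleFactory TUPLE_FACTORY = TupleFactory.getInstance();

    @Override
    public Tuple exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;  // empty record; can be filtered out in the script
        }
        String line = (String) input.get(0);
        // Assumed format: timestamp level message (split into at most 3 fields)
        String[] fields = line.split("\\s+", 3);
        return TUPLE_FACTORY.newTuple(Arrays.asList(fields));
    }
}

In the script you would then REGISTER the jar containing the UDF, apply it in a FOREACH ... GENERATE over the loaded lines, and store the result with org.apache.pig.backend.hadoop.hbase.HBaseStorage, which maps the tuple's fields to HBase columns (the first field becoming the row key).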

Re: Bulk load in hbase using pig

2014-02-26 Thread Mohammad Tariq
Could you please let us know how exactly you want to parse your logs? Warm Regards, Tariq cloudfront.blogspot.com On Wed, Feb 26, 2014 at 6:25 PM, David McNelis dmcne...@gmail.com wrote: The big question is how the log file needs to be parsed / formatted. I'd be inclined to write a UDF

Re: Bulk load in hbase using pig

2014-02-26 Thread yonghu
If you want to load the HBase log, why not write a MapReduce job directly? In Pig, you need to write your own custom load function. However, if you write a MapReduce job, you can directly use the HBase API. On Wed, Feb 26, 2014 at 2:15 PM, Mohammad Tariq donta...@gmail.com wrote: Could you please
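For comparison, a minimal map-only sketch of that approach, writing Puts straight to a table via TableOutputFormat (the table name "logs", column family "cf", and the log layout are assumptions, not from the thread):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class LogToHBase {

    // Parses each log line and emits a Put for the target HBase table.
    static class LogMapper
            extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
        private static final byte[] CF = Bytes.toBytes("cf");

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Assumed format: timestamp level message
            String[] fields = value.toString().split("\\s+", 3);
            if (fields.length < 3) {
                return;  // skip malformed lines
            }
            Put put = new Put(Bytes.toBytes(fields[0]));  // row key = timestamp
            // put.add(...) in HBase releases of that era; renamed addColumn in 1.0+
            put.add(CF, Bytes.toBytes("level"), Bytes.toBytes(fields[1]));
            put.add(CF, Bytes.toBytes("message"), Bytes.toBytes(fields[2]));
            context.write(new ImmutableBytesWritable(put.getRow()), put);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "log-to-hbase");
        job.setJarByClass(LogToHBase.class);
        job.setMapperClass(LogMapper.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));  // log file in HDFS
        // Writes directly to the "logs" table; map-only, so no reducer is set.
        TableMapReduceUtil.initTableReducerJob("logs", null, job);
        job.setNumReduceTasks(0);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

The trade-off here is the one yonghu points to: plain MapReduce gives you the HBase client API directly, while Pig keeps the parsing logic in a script plus a small UDF.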

Bulk load in hbase using pig

2014-02-25 Thread Chhaya Vishwakarma
Hi, I have a log file in HDFS which needs to be parsed and put into an HBase table. I want to do this using Pig. How can I go about it? The Pig script should parse the logs and then put them in HBase. Regards, Chhaya Vishwakarma