Unless there are some unstated requirements, a simple Java program can do what
you want.
pseudo code:

open csv file (e.g. with a java.io.BufferedReader)
open table in HBase: HTable table = new HTable(conf, mytablename)
while (more records in csv file) {
    read line from csv, split into columns
    lockid = table.startUpdate(rowname)
    for (each column-name, column-value pair) {
        table.put(lockid, column-name, column-value)
    }
    table.commit(lockid)
}
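A minimal sketch of that loop in plain Java follows. The HBase calls (HTable, startUpdate, put, commit) are left as comments because they require the HBase client jar on the classpath; the table name, column names ("info:name", "info:age"), and the use of the first CSV column as the row key are assumptions for illustration. Note that String.split does not handle quoted fields containing commas, so use a real CSV parser for those.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class CsvToHBase {

    // Split one CSV line into columns. The -1 limit keeps trailing
    // empty columns instead of dropping them.
    static String[] parseLine(String line) {
        return line.split(",", -1);
    }

    public static void main(String[] args) throws IOException {
        // With no file argument, just demonstrate the parsing on a sample line.
        if (args.length == 0) {
            String[] cols = parseLine("row1,alice,30");
            System.out.println("row key = " + cols[0]
                    + ", columns = " + (cols.length - 1));
            return;
        }

        // Hypothetical column names, matching the CSV column order
        // after the row key; adjust to your table's schema.
        String[] columnNames = {"info:name", "info:age"};

        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        // HTable table = new HTable(conf, new Text("mytablename"));
        String line;
        while ((line = in.readLine()) != null) {
            String[] cols = parseLine(line);
            String rowKey = cols[0]; // first column used as the row key
            // long lockid = table.startUpdate(new Text(rowKey));
            for (int i = 1; i < cols.length; i++) {
                // table.put(lockid, new Text(columnNames[i - 1]),
                //           cols[i].getBytes());
            }
            // table.commit(lockid);
        }
        in.close();
    }
}
```

Run it as `java CsvToHBase mydump.csv` once the commented HBase calls are filled in against your cluster's configuration.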
---
Jim Kellerman, Senior Engineer; Powerset
> -----Original Message-----
> From: Ved Prakash [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, March 11, 2008 2:41 AM
> To: [EMAIL PROTECTED]
> Subject: loading data into hbase table
>
> Hi friends,
>
> I have a table dump in CSV format, and I wanted to load this
> data into my HBase table instead of typing it in as inserts. I
> did a web search and also looked in the HBase documentation,
> but couldn't find anything.
>
> Can someone tell me how to load a file from local disk into an
> HBase table on HDFS?
>
> Thanks
>