Check out HBase's importtsv.
(http://hbase.apache.org/book/ops_mgt.html#importtsv)
-- Lars
----- Original Message -----
From: iwannaplay games funnlearnfork...@gmail.com
To: hdfs-user@hadoop.apache.org
Cc:
Sent: Thursday, July 19, 2012 3:33 AM
Subject: Re: Loading data in hdfs
Thanks Tariq
Hi,
I am unable to use Sqoop and want to load data into HDFS for testing.
Is there any way by which I can load my CSV or text file into the Hadoop
file system directly, without writing Java code?
Regards
Prabhjot
Hi Prabhjot
Yes, just use the filesystem commands:
hadoop fs -copyFromLocal <source local path> <destination HDFS path>
Regards
Bejoy KS
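Bejoy's suggestion is the standard approach; a minimal sketch of the commands (directory and file names here are hypothetical examples, not from the thread):

```shell
# Create a target directory in HDFS and copy the local CSV into it.
# All paths below are placeholders -- substitute your own.
hadoop fs -mkdir -p /user/prabhjot/input
hadoop fs -copyFromLocal /home/prabhjot/ipinfo.csv /user/prabhjot/input/
# Verify the file landed in HDFS.
hadoop fs -ls /user/prabhjot/input
```

No Java code is needed for this; the `hadoop fs` shell handles the transfer directly.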
On Thu, Jul 19, 2012 at 3:49 PM, iwannaplay games
funnlearnfork...@gmail.com wrote:
Hi,
I am unable to use Sqoop and want to load data into HDFS for testing.
Is there any way by which I can load my CSV or text file into the Hadoop
file system directly, without writing Java code?
By looking at the CSV, I would suggest creating an HBase table, say
'IPINFO', with one column family, say 'cf', having 3 columns for
'startip', 'endip' and 'countryname' respectively.
Regards,
Mohammad Tariq
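For the layout Tariq describes, the HBase shell commands would look roughly like this (table and family names are taken from the thread; this is an untested sketch):

```shell
# From the HBase shell: one table, one column family 'cf'.
# The three columns startip/endip/countryname become qualifiers under 'cf'
# when the data is loaded, so only the family is declared here.
hbase shell
> create 'IPINFO', 'cf'
> describe 'IPINFO'
```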
On Thu, Jul 19, 2012 at 4:44 PM, Mohammad Tariq donta...@gmail.com wrote:
There is absolutely no need to be sorry. And once you have the data
inside your HDFS you can use importtsv to import the data like this:
$ bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv
-Dimporttsv.columns=a,b,c tablename hdfs-inputdir
Regards,
Mohammad Tariq
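One detail worth noting: ImportTsv expects tab-separated input, and the -Dimporttsv.columns list must include HBASE_ROW_KEY to mark the row-key column. Since the source file here is a CSV, it needs converting first; a minimal sketch in Python (the column layout and the choice of startip as row key are assumptions, not something the thread specifies):

```python
import csv

def csv_to_tsv_rows(csv_lines):
    """Turn CSV rows (startip, endip, countryname) into tab-separated
    rows whose first field is the row key, as ImportTsv expects."""
    out = []
    for startip, endip, country in csv.reader(csv_lines):
        # Assumption: the start IP is unique enough to serve as the row key.
        out.append("\t".join([startip, startip, endip, country]))
    return out

# Example with one in-memory row:
rows = csv_to_tsv_rows(["1.0.0.0,1.0.0.255,Australia"])
print(rows[0])
```

The matching ImportTsv invocation would then name the columns as, e.g., -Dimporttsv.columns=HBASE_ROW_KEY,cf:startip,cf:endip,cf:countryname IPINFO <hdfs input dir>.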
On Thu, Jul 19, 2012 at