Hi,
I have a Cloudera CDH3 pseudo-distributed machine for development. I
finished bulk loading data into HBase 0.89 using a MapReduce program,
and HBase worked fine after the load completed. But after I restart the
machine, I can no longer log in to the HBase database. HBase is not
working; it fails with a MasterNotRunning exception.
The table has close to 24 million rows.
The Hadoop version is 0.20.
Regards
Jason
On Thu, Feb 17, 2011 at 10:56 PM, Stack wrote:
> On Thu, Feb 17, 2011 at 2:15 AM, praba karan wrote:
> > Hi all
Hi all,
I've been trying to load a huge amount of data into HBase using a
MapReduce program. The HBase table contains 16 columns, and row IDs are
generated from UUIDs. When I try to load, it takes a long time and throws
the exception discussed in the following link:
http://web.a
>
> On Feb 15, 2011, at 10:26 AM, Ryan Rawson wrote:
>
> > Or the natural business key?
> > On Feb 15, 2011 10:00 AM, "Jean-Daniel Cryans"
> wrote:
> >> Try UUIDs.
> >>
> >> J-D
> >>
> >> On Tue, Feb 15, 2011
Hi,
I have a MapReduce program for uploading bulk data into HBase 0.89 from
the HDFS file system. I need a unique row ID for every row (millions of
rows) so that overwriting rows in the HBase table is avoided. Is there any
solution to the row-ID problem that avoids overwriting in the HBase table?
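For reference, the UUID approach suggested elsewhere in this thread can be
sketched with just the JDK. A random (type 4) UUID per record makes key
collisions vanishingly unlikely, so each Put lands in its own row instead of
overwriting an existing one. The RowKeys class name here is only an
illustration, not code from the thread:

```java
import java.util.UUID;

public class RowKeys {
    // One random UUID per input record; 122 random bits make
    // collisions (and therefore silent row overwrites) negligible.
    public static String newRowKey() {
        return UUID.randomUUID().toString();
    }

    public static void main(String[] args) {
        String a = newRowKey();
        String b = newRowKey();
        // 36 characters: 32 hex digits plus 4 hyphens
        System.out.println(a.length() + " " + a.equals(b));
    }
}
```

One trade-off to keep in mind: random keys spread writes evenly across
regions, but they destroy scan locality, which is why a natural business key
(as mentioned in this thread) can be the better choice when you need ordered
scans.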
Guys,
I got past this error! It seems to be only a permissions issue with
$HBASE_HOME/conf/hbase-site.xml. Just copying hbase-site.xml into the
$HADOOP_HOME/conf/ directory and restarting the cluster resolves the error.
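For reference, the fix amounts to something like the following (a sketch
assuming the usual $HBASE_HOME/$HADOop_HOME layout and the stock
start/stop scripts of a pseudo-distributed CDH3 install; adjust paths for
your setup):

```shell
# Make HBase's client configuration visible to Hadoop/MapReduce jobs
cp "$HBASE_HOME/conf/hbase-site.xml" "$HADOOP_HOME/conf/"

# Restart the pseudo-distributed cluster so the new config is picked up
"$HADOOP_HOME/bin/stop-all.sh"
"$HADOOP_HOME/bin/start-all.sh"
```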
..Jason
On Sat, Feb 5, 2011 at 1:57 PM, praba karan wrote:
> St
f data into it.
>
> St.Ack
>
> On Fri, Feb 4, 2011 at 4:44 AM, praba karan wrote:
> > Hi,
> >
> > I am having the Hadoop-CDH3 pseudo distributed environment. I am running
> the
> > Map Reduce program to do the Bulk load of d
Hi,
I have a Hadoop CDH3 pseudo-distributed environment. When I run the
MapReduce program to do the bulk load of data into HBase 0.89, I get the
following exception:
org.apache.hadoop.hbase.client.NoServerForRegionException: Timed out trying
to locate root region
>        at
Hey Ryan,
I just finished uploading a small sample of data into HBase 0.89. I will
post the MapReduce code after completing the test; first I need to get rid
of the exception I am facing now. When I run the MapReduce program on my
machine, I get the following error:
> not enough published
> examples, and I have started accumulating mine here,
> http://hadoopinpractice.com/code.html
>
> Cheers,
> Mark
>
> On Wed, Feb 2, 2011 at 9:56 AM, praba karan wrote:
>
> > Yeah, I had seen this. I developed t
erstand it without complexity
I don't want to use the command-line tools such as "completebulkload" or
"importtsv".
Thank you Mark
Regards
Prabakaran
On Wed, Feb 2, 2011 at 9:18 PM, Stack wrote:
> See http://hbase.apache.org/bulk-loads.html
> St.Ack
>
> On Wed, Feb 2, 201
Thanks Mark,
But this is not what I need.
I am trying to upload bulk data from the HDFS file system into HBase 0.89,
and I need a MapReduce program for that.
Regards
Jason
On Wed, Feb 2, 2011 at 7:46 PM, Mark Kerzner wrote:
> Jason,
>
> attached is RowCounter.java from 0.90 test code -