On 02/07/12 10:34, syed kather wrote:
> java.lang.RuntimeException: org.xml.sax.SAXParseException: Content is not
> allowed in prolog.
> at
> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1237)
> at
> org.apache.hadoop.conf.Configuration.loadResources(Configuration.j
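"Content is not allowed in prolog" from Configuration.loadResource almost always means there are stray bytes (most often a UTF-8 BOM, or text/whitespace pasted before the XML declaration) ahead of `<?xml` in one of the `*-site.xml` files on the classpath. A small self-contained helper (name and layout are mine, not from the thread) can locate the offending prefix:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Scans a config file and returns whatever bytes appear before the
// "<?xml" declaration; an empty array means the prolog is clean.
public class PrologCheck {
    public static byte[] bytesBeforeProlog(Path xmlFile) throws IOException {
        byte[] data = Files.readAllBytes(xmlFile);
        byte[] marker = "<?xml".getBytes(StandardCharsets.US_ASCII);
        for (int i = 0; i + marker.length <= data.length; i++) {
            boolean match = true;
            for (int j = 0; j < marker.length; j++) {
                if (data[i + j] != marker[j]) { match = false; break; }
            }
            if (match) {
                // Everything before the declaration is the problem content.
                return java.util.Arrays.copyOfRange(data, 0, i);
            }
        }
        return data; // no XML declaration found at all
    }
}
```

If the returned prefix is `EF BB BF`, the file was saved with a BOM; re-saving it as plain UTF-8 without a BOM fixes the parse error.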
On 21/06/12 14:33, Michael Segel wrote:
> I think the version issue is the killer factor here.
> Usually performing a simple get() where you are getting the latest version of
> the data on the row/cell occurs in some constant time k. This is constant
> regardless of the size of the cluster and s
On 19/06/12 19:31, Jean-Daniel Cryans wrote:
> This is a common but hard problem. I do not have a good answer.
Thanks for your write-up. You've given a few suggestions that I will
surely follow.
But what is bothering me is my use of timestamps. As mentioned before,
my column family has 214748364
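For anyone reading along: the number of versions kept per cell is a column-family property, so rather than encoding history in timestamps it can be pinned down at table-creation time. A minimal sketch, assuming the 0.9x-era client API (table and family names are hypothetical):

```java
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;

// Creates a table whose family keeps only the most recent version per cell,
// instead of relying on the default version count or timestamp tricks.
public class CreateTableWithOneVersion {
    public static void main(String[] args) throws Exception {
        HTableDescriptor tableDesc = new HTableDescriptor("mytable");
        HColumnDescriptor colDesc = new HColumnDescriptor("cf");
        colDesc.setMaxVersions(1); // keep just the latest version
        tableDesc.addFamily(colDesc);
        HBaseAdmin admin = new HBaseAdmin(HBaseConfiguration.create());
        admin.createTable(tableDesc);
        admin.close();
    }
}
```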
Hi
I've run into some performance issues with my Hadoop MapReduce job.
Basically what I'm doing with it is:
- read data from an HDFS file
- the output also goes to an HDFS file (multiple ones in my scenario)
- in my mapper I process each line and enrich it with some data read
from an HBase table (I do Get
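The per-record lookup described above can be sketched roughly as follows (class, table, and column names are hypothetical; 0.9x-era HBase client API assumed). Opening the HTable once in setup() rather than per record, and asking for only the latest version, are the two easy wins:

```java
import java.io.IOException;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: reads lines from HDFS and enriches each one with a
// value fetched from an HBase table via one Get per record.
public class EnrichMapper extends Mapper<LongWritable, Text, Text, Text> {
    private HTable table;

    @Override
    protected void setup(Context context) throws IOException {
        // Open the table once per task, not once per record.
        table = new HTable(HBaseConfiguration.create(), "lookup_table");
    }

    @Override
    protected void map(LongWritable key, Text line, Context context)
            throws IOException, InterruptedException {
        String rowKey = line.toString().split("\t")[0];
        Get get = new Get(Bytes.toBytes(rowKey));
        get.setMaxVersions(1); // only the latest version is needed
        Result result = table.get(get);
        byte[] extra = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("q"));
        context.write(new Text(rowKey),
                new Text(line + "\t" + (extra == null ? "" : Bytes.toString(extra))));
    }

    @Override
    protected void cleanup(Context context) throws IOException {
        table.close();
    }
}
```

Even so, one random-access RPC per input line is often the real bottleneck; buffering a batch of keys and issuing them together with `table.get(List<Get>)` cuts the round trips considerably.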
On 17/04/12 18:45, Alex Baranau wrote:
> I don't think that your error is related to CPs stuff. What lib versions do
> you use? Can you compare with those of the HBaseHUT pom?
Ok, I've managed to track down the source of my error. If I do normal
Put modifications in my prePut/postPut method everyt
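For context, a "normal Put modification" in a prePut hook looks roughly like this minimal sketch (observer name, family, and qualifier are hypothetical; the hook signature is from the 0.92/0.94-era RegionObserver API):

```java
import java.io.IOException;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.coprocessor.BaseRegionObserver;
import org.apache.hadoop.hbase.coprocessor.ObserverContext;
import org.apache.hadoop.hbase.coprocessor.RegionCoprocessorEnvironment;
import org.apache.hadoop.hbase.regionserver.wal.WALEdit;
import org.apache.hadoop.hbase.util.Bytes;

// Hypothetical observer: tags every incoming Put with an extra qualifier
// before it is applied to the region.
public class AuditObserver extends BaseRegionObserver {
    @Override
    public void prePut(ObserverContext<RegionCoprocessorEnvironment> e,
                       Put put, WALEdit edit, boolean writeToWAL)
            throws IOException {
        // Mutating the Put in place here is applied with the write itself.
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("audited"),
                Bytes.toBytes("true"));
    }
}
```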
On 16/04/12 16:49, Alex Baranau wrote:
> Here's some code that worked for me [1]. You may also find useful to look
> at the pom's dependencies [2].
Thanks, your cluster initialization is certainly more elegant than what
I had. However, it still gives me the same error as I reported. Moreover,
I've
On 17/04/12 15:15, Alex Baranau wrote:
Hi
> Some sanity checks:
> 1) make sure you don't have 127.0.1.1 in your /etc/hosts (only 127.0.0.1)
I've removed this entry and it worked right away :) Could you explain
why it made such a big difference?
Now the test from HBaseHUT works fine, but my code is
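For what it's worth, the usual explanation (assuming the Debian/Ubuntu-style default hosts file) is that the region server resolves its own hostname to 127.0.1.1 and advertises that loopback address in ZooKeeper, so clients end up trying to connect to an address the server never actually bound. The typical offending entry looks like:

```
127.0.0.1   localhost
127.0.1.1   myhostname    # <- makes HBase advertise a loopback address
```

Removing or commenting out the 127.0.1.1 line (so the hostname resolves to a real interface, or at worst to 127.0.0.1) is the standard fix.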
Hi
I'm trying to write a unit test for an HBase coprocessor. However, it
seems I'm doing something horribly wrong. The code I'm using to test my
coprocessor class is in the attachment.
As you can see, I'm using HBaseTestingUtility, and running a
mini-cluster with it. The error I keep getting is:
201
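For reference, the usual skeleton for such a test (class and table names are hypothetical; the HBaseTestingUtility API of the 0.9x line assumed) is roughly:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseTestingUtility;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.util.Bytes;

// Hypothetical test skeleton: configure the coprocessor before the
// mini-cluster starts, exercise the table, then shut everything down.
public class CoprocessorTestSkeleton {
    public static void main(String[] args) throws Exception {
        HBaseTestingUtility util = new HBaseTestingUtility();
        Configuration conf = util.getConfiguration();
        // Load the observer on every region of every table in the cluster.
        conf.set("hbase.coprocessor.region.classes", "my.pkg.AuditObserver");
        util.startMiniCluster();
        try {
            HTable table = util.createTable(Bytes.toBytes("t"),
                                            Bytes.toBytes("cf"));
            // ... issue Puts/Gets against 'table' and assert on results ...
            table.close();
        } finally {
            util.shutdownMiniCluster();
        }
    }
}
```

The important detail is setting the coprocessor configuration before `startMiniCluster()`, since the region servers read it at startup.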