Hi,
Thanks Harsh. :)
I tried this configuration, but it still doesn't work.
The problem looks like the ClassLoader cannot find the mapper class, which has
already been packed into the same JAR as the other classes.
Why can't the ClassLoader find a class in the same JAR? And this JAR should
have been
Hello,
On Thu, Oct 28, 2010 at 9:02 AM, exception wrote:
> Hi forks,
/me branches self into two and cheers ;-)
>
> had...@master:~/eqin/joblauncher$ hadoop jar Greper.jar org.taomee.Greper
> -libjars Greper.jar jobconf/jobconfig.xml
In your job setup, do you do a JobConf.setJarByClass (or JobC
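For anyone following along, a minimal driver that cooperates with -libjars looks roughly like the sketch below: it uses the Tool/ToolRunner pattern so that GenericOptionsParser strips the generic options before run() sees the arguments, and it sets the job jar with setJarByClass. "Greper" is only the class name taken from the command line above; everything else here is illustrative, not the poster's actual code.

// Sketch only: a driver run through ToolRunner so that -libjars (and the
// other generic options) are handled by GenericOptionsParser.
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class Greper extends Configured implements Tool {

  @Override
  public int run(String[] args) throws Exception {
    JobConf job = new JobConf(getConf());
    job.setJarByClass(Greper.class);   // ships the JAR that contains this class
    FileInputFormat.setInputPaths(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    // mapper/reducer setup omitted; as written this runs the identity job
    JobClient.runJob(job);
    return 0;
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new Greper(), args));
  }
}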
Hi forks,
I am on Hadoop 0.21.0.
What I am trying to do is load the mapper/reducer class at runtime according
to a config file. I use -libjars to ship the JAR files that the job depends
on.
But it seems this doesn't work. :(
This is the error message:
had...@master:~/eqin/joblauncher$ had
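For what it's worth, resolving a mapper class named in a config file usually comes down to something like the sketch below. The property name "job.mapper.classname" is made up for illustration, and this assumes the class is on the job classpath (for example shipped with -libjars).

// Sketch: resolve a mapper class named in the job configuration at runtime.
// "job.mapper.classname" is a hypothetical property, e.g. read from jobconfig.xml.
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.Mapper;

public class DynamicJobSetup {

  public static void configureMapper(JobConf job) throws ClassNotFoundException {
    String name = job.get("job.mapper.classname");
    // getClassByName resolves through the configuration's classloader, which
    // includes jars added via -libjars / the distributed cache.
    Class<?> clazz = job.getClassByName(name);
    job.setMapperClass(clazz.asSubclass(Mapper.class));
  }
}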
Hi, thank you very much for your reply.
I want to modify 0.20.2
On Wed, Oct 27, 2010 at 7:12 PM, Ted Yu wrote:
> Which hadoop version do you want to modify ?
>
>
> On Wed, Oct 27, 2010 at 2:28 PM, Shen LI wrote:
>
>> Hi,
>>
>> I want to modify the heartbeat message to carry more information fr
Which hadoop version do you want to modify ?
On Wed, Oct 27, 2010 at 2:28 PM, Shen LI wrote:
> Hi,
>
> I want to modify the heartbeat message to carry more information from
> worker nodes to master node, and also want to modify the return message of
> heartbeat. Do you know which file and functi
Hi,
I want to modify the heartbeat message to carry more information from the
worker nodes to the master node, and I also want to modify the return message
of the heartbeat. Do you know which file and function I should modify?
Thanks a lot,
Shen
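For the 0.20.x source, the path is roughly this: TaskTracker.transmitHeartBeat() builds a TaskTrackerStatus and sends it through InterTrackerProtocol.heartbeat(), which JobTracker.heartbeat() answers with a HeartbeatResponse (all in org.apache.hadoop.mapred; names recalled from memory, so please verify against your checkout). Any extra field you add to those payload classes also has to be serialized in their write()/readFields() methods, since they travel over RPC as Writables. A generic illustration of that pattern, using a made-up class:

// Illustration only: an extra field on a heartbeat payload has to be mirrored
// in the Writable serialization. MyExtraStatus is not a real Hadoop class.
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.Writable;

public class MyExtraStatus implements Writable {

  private long freeDiskBytes;     // example extra information to report
  private int runningProcesses;   // another made-up field

  @Override
  public void write(DataOutput out) throws IOException {
    out.writeLong(freeDiskBytes);    // field order must match readFields()
    out.writeInt(runningProcesses);
  }

  @Override
  public void readFields(DataInput in) throws IOException {
    freeDiskBytes = in.readLong();
    runningProcesses = in.readInt();
  }
}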
Using the HBase API in your mapper:
http://hbase.apache.org/docs/current/api/org/apache/hadoop/hbase/client/HTable.html#put(java.util.List)
J-D
On Wed, Oct 27, 2010 at 2:10 PM, Shuja Rehman wrote:
> Actually I am not using reducers; only mappers work for me.
> Secondly, the procedure was: mapper ou
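A rough sketch of that approach, i.e. doing the HBase puts inside the map task while the job's normal OutputFormat keeps writing the file output. The table, column family, and qualifier names are invented, and this assumes a 0.90-era HTable client API with the HBase connection settings (ZooKeeper quorum etc.) available in the job configuration:

// Sketch: write to HBase directly from a map task; the job's OutputFormat
// still receives the normal context.write() output for the file/Sqoop path.
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LogMapper extends Mapper<LongWritable, Text, Text, Text> {

  private HTable table;
  private List<Put> buffer = new ArrayList<Put>();

  @Override
  protected void setup(Context context) throws IOException {
    // "logs" is a hypothetical table name
    table = new HTable(HBaseConfiguration.create(context.getConfiguration()), "logs");
  }

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    // Normal output: goes to files via the job's OutputFormat (for Sqoop/MySQL).
    context.write(new Text("parsed-key"), value);

    // HBase side: buffer Puts and flush them in batches via HTable#put(List).
    Put put = new Put(Bytes.toBytes(key.get()));
    put.add(Bytes.toBytes("cf"), Bytes.toBytes("line"), Bytes.toBytes(value.toString()));
    buffer.add(put);
    if (buffer.size() >= 1000) {
      table.put(buffer);
      buffer.clear();
    }
  }

  @Override
  protected void cleanup(Context context) throws IOException {
    if (!buffer.isEmpty()) {
      table.put(buffer);
    }
    table.close();
  }
}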
Actually I am not using reducers; only mappers work for me.
Secondly, the procedure was: mapper output is saved in files, which are then
transferred to MySQL using Sqoop. So here I need to save the output to files and
also send data to HBase. Suppose I use an output format to save the data into a
file; then how do I send the data to HBase?
Do both insertions in your reducer by either not using the output
formats at all, or using one of them and doing the other insert by hand.
J-D
On Wed, Oct 27, 2010 at 1:44 PM, Shuja Rehman wrote:
> Hi Folks
>
> I am wondering if anyone has the answer to this question. I am processing
> log files using
Hi Folks
I am wondering if anyone has the answer to this question. I am processing
log files using MapReduce and need to put part of the data into MySQL and the
rest into HBase. At the moment, I am running two separate jobs to do this, so
I am reading a single file twice to dump the data. My question is th
pos can be obtained through the InputStream wrapped by the RecordReader. You
can refer to org.apache.hadoop.mapreduce.lib.input.LineRecordReader for
details.
On Wed, Oct 27, 2010 at 5:37 PM, Bibek Paudel wrote:
> [Apologies for cross-posting]
>
> Hi all,
> I am rewriting a hadoop java code for the new (0.
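To make that concrete, here is a sketch of a new-API RecordReader that keeps its own pos counter the way LineRecordReader does: the new RecordReader interface has no getPos(), so the byte offset has to be tracked by hand (or read from the underlying FSDataInputStream, keeping read-ahead buffering in mind). MyRecordReader is a made-up class, and split-boundary handling is omitted:

// Sketch: tracking the byte offset in a new-API (org.apache.hadoop.mapreduce)
// RecordReader, similar to what LineRecordReader does internally.
import java.io.IOException;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.util.LineReader;

public class MyRecordReader extends RecordReader<LongWritable, Text> {

  private FSDataInputStream in;
  private LineReader reader;
  private long start, pos, end;
  private LongWritable key = new LongWritable();
  private Text value = new Text();

  @Override
  public void initialize(InputSplit split, TaskAttemptContext context)
      throws IOException {
    FileSplit fileSplit = (FileSplit) split;
    Path file = fileSplit.getPath();
    FileSystem fs = file.getFileSystem(context.getConfiguration());
    in = fs.open(file);
    start = fileSplit.getStart();
    end = start + fileSplit.getLength();
    in.seek(start);
    pos = start;                  // this plays the role of the old getPos()
    reader = new LineReader(in, context.getConfiguration());
  }

  @Override
  public boolean nextKeyValue() throws IOException {
    if (pos >= end) {
      return false;
    }
    key.set(pos);                 // key = byte offset of the line, as in LineRecordReader
    int bytesRead = reader.readLine(value);
    if (bytesRead == 0) {
      return false;
    }
    pos += bytesRead;             // advance the tracked offset
    return true;
  }

  @Override public LongWritable getCurrentKey() { return key; }
  @Override public Text getCurrentValue() { return value; }

  @Override
  public float getProgress() {
    return end == start ? 0.0f : Math.min(1.0f, (pos - start) / (float) (end - start));
  }

  @Override
  public void close() throws IOException {
    if (reader != null) {
      reader.close();             // also closes the wrapped input stream
    }
  }
}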
[Apologies for cross-posting]
Hi all,
I am rewriting Hadoop Java code for the new (0.20.2) API; the code
was originally written for versions <= 0.19.
1. What is the equivalent of the getPos() method [0] of RecordReader?
I read that in 0.20, the getPos() method is no longer there [1], but what
am