Thanks for your help, Samuel. I was having problems with both writing and
reading. I ran fsck, removed some damaged files, and restarted the DFS.
It seems to be OK now, though I'm not exactly sure what happened.
Thanks,
Htin
-----Original Message-----
From: Samuel Guo [mailto:[EMAIL PROTECTED]
It seems that I encountered a similar problem:
Zlib , lzo installed.
Running ant -Dcompile.native=true gave the following error.
[exec]
/server/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:
In function
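If the native build is simply not finding the LZO headers/libraries, one thing to try (a sketch only, assuming LZO was installed under the /usr/local prefix and that gcc's standard search-path variables apply) is to point the compiler at them explicitly before re-running ant:

    # assumption: LZO headers and libs live under /usr/local
    export C_INCLUDE_PATH=/usr/local/include
    export LIBRARY_PATH=/usr/local/lib
    ant -Dcompile.native=true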
On Oct 9, 2008, at 6:46 PM, Songting Chen wrote:
Thanks, Arun.
Does that mean I have to rebuild the native library?
No, hadoop releases come bundled with pre-built 32/64 bit libhadoop.so
for Linux...
Arun
Also, the LZO installation puts liblzo2.a and liblzo2.la under /usr/
local/lib.
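A quick way to confirm the bundled native library is actually there (a sketch, assuming the usual 0.18 tarball layout with per-platform directories under lib/native) is:

    # directory names are assumptions based on the typical release layout
    ls lib/native/Linux-amd64-64/ lib/native/Linux-i386-32/
    # libhadoop.so (or a versioned .so) should be listed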
Hi,
To get a number of reduce_output_records, I was write code as:
long rows = rJob.getCounters().findCounter(
    "org.apache.hadoop.mapred.Task$Counter", 8, "REDUCE_OUTPUT_RECORDS")
    .getCounter();
I'd like to know another way to get it, since findCounter(String group,
int id, String
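One alternative, shown only as a sketch (it assumes a Hadoop version whose Counters class has the findCounter(String group, String counter) overload, which avoids the numeric id):

    // assumes findCounter(String group, String counter) exists in your Hadoop version
    long rows = rJob.getCounters()
        .findCounter("org.apache.hadoop.mapred.Task$Counter", "REDUCE_OUTPUT_RECORDS")
        .getCounter();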
On Oct 10, 2008, at 12:51 AM, Songting Chen wrote:
Thanks Arun.
Still sort of confused.
Even though I don't need to rebuild the library, it's still not working
after LZO is installed.
Could you please elaborate? What is the error/exception?
Arun
My code:
writer =
hi,
I am trying to use contrib/index/hadoop-0.17.1-index.jar to build a Lucene index, but
I get a NoClassDefFoundError for Directory.
How can I solve this? Thanks.
Additionally, Hadoop is running in Pseudo-Distributed Mode, and I can run bin/hadoop jar
hadoop-*-examples.jar grep input output 'dfs[a-z.]+'
-
2008-10-10
chenlbspace
You must download the Lucene binary jar file and add it to the
classpath.
chenlbspace wrote:
hi,
I am trying to use contrib/index/hadoop-0.17.1-index.jar to build a Lucene index, but
I get a NoClassDefFoundError for Directory.
How can I solve this? Thanks.
Additionally, Hadoop is running in Pseudo-Distributed Mode, and I can run
Hey all,
Hadoop tries to parse file names with : in them as a relative URL:
[EMAIL PROTECTED] ~]$ hadoop fs -put /tmp/test /user/brian/StageOutTest-24328-
Fri-Oct-10-07:58:44-2008
put: Pathname /user/brian/StageOutTest-24328-Fri-Oct-10-07:58:44-2008
from
Hi,
We are using Hadoop 0.16, and on our heavy-IO job we are seeing a lot of these
exceptions.
More than 50% of the tasks are failing :(. There are two reasons in the
log:
a) Task task_200810092310_0003_m_20_0 failed to report status for 600 seconds. Killing! -
b)
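For (a), the 600 seconds is the task timeout. As a sketch (assuming the pre-0.21 property name mapred.task.timeout, whose value is in milliseconds, applies to 0.16), it can be raised in the job configuration, though the cleaner fix is usually to have long-running map/reduce code call reporter.progress() periodically:

    // assumption: mapred.task.timeout is the applicable property name; value in ms
    JobConf conf = new JobConf(MyJob.class);          // MyJob is a placeholder driver class
    conf.setLong("mapred.task.timeout", 1200 * 1000); // raise the no-report timeout to 20 minutes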
On 10/9/08 6:46 PM, Songting Chen [EMAIL PROTECTED] wrote:
Does that mean I have to rebuild the native library?
Also, the LZO installation puts liblzo2.a and liblzo2.la under /usr/local/lib.
There is no liblzo2.so there. Do I need to rename them to liblzo2.so somehow?
You need to
The jar file is at src/contrib/index/lib/lucene-core-2.3.1.jar; you
must add it to the classpath.
imcaptor wrote:
You must download the Lucene binary jar file and add it to the
classpath.
chenlbspace wrote:
hi,
I am trying to use contrib/index/hadoop-0.17.1-index.jar to build a Lucene
Try doing this.
In the Hadoop installation directory,
-bash-3.00$ pwd
/data/hadoop/hadoop-0.18.1
cp src/contrib/index/lib/lucene-core-2.3.1.jar lib
Then you can run the task.
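An alternative to copying the jar into lib/ (a sketch, assuming your conf/hadoop-env.sh honours the HADOOP_CLASSPATH variable, as stock releases of that era do) is to add it to the client classpath there; note that task JVMs on a real cluster may still need the jar shipped with the job, so the cp-into-lib approach above is the simpler fix:

    # in conf/hadoop-env.sh; path is the one used earlier in this thread
    export HADOOP_CLASSPATH=/data/hadoop/hadoop-0.18.1/src/contrib/index/lib/lucene-core-2.3.1.jar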
chenlbspace wrote:
hi,
I am trying to use contrib/index/hadoop-0.17.1-index.jar to build a Lucene index, but
I get a NoClassDefFoundError for Directory.
How
hey everyone
I am using hadoop-0.17.2.1. The slaves file in the conf directory contains:
rack02
rack03
rack07
rack05
rack12
rack14
But when I run start-all.sh, all the datanodes start. However, when I run
hadoop dfsadmin -report, only 2 datanodes are listed:
Datanodes available: 2
Name:
I was checking out this slide show.
http://www.slideshare.net/jhammerb/2008-ur-tech-talk-zshao-presentation/
in the diagram a Web-UI exists. This is the first I have heard of
it. Is this part of, or planned to be part of, contrib/hive? I think
a web interface for showing table schema and
The safest thing is to restrict your Hadoop file names to a
common-denominator set of characters that are well supported by Unix,
Windows, and URIs. Colon is a special character on both Windows and in
URIs. Quoting is in theory possible, but it's hard to get it right
everywhere in practice.
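If you need a timestamp in the name anyway, one workaround (a sketch, not from the thread; it just formats the time without colons before building the Path) looks like:

    // hypothetical snippet: produce a colon-free timestamp so Path/URI parsing stays happy
    java.text.SimpleDateFormat fmt = new java.text.SimpleDateFormat("EEE-MMM-dd-HH.mm.ss-yyyy");
    org.apache.hadoop.fs.Path dst = new org.apache.hadoop.fs.Path(
        "/user/brian/StageOutTest-24328-" + fmt.format(new java.util.Date()));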
Hey Edward,
The UI mentioned in those slides leverages many internal display
libraries from Facebook. If you wanted to make a UI that leverages the
metastore's thrift interface but only uses open source display
libraries, I think it would definitely be appreciated by Hive users.
Thanks,
Jeff
Thanks, Allen.
I checked the INSTALL doc in the lzo-2.03 package.
It says to use the './configure --enable-shared' command to build the shared library.
I followed that instruction and recompiled / reinstalled the package.
But after that, still only .a and .la files appear in /usr/local/lib.
-rw-r--r-- 1 root root
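For reference, the usual autotools sequence that should produce the shared object (a sketch, assuming the lzo-2.03 source tree and the default /usr/local prefix) is:

    # run inside the unpacked lzo-2.03 source directory
    ./configure --enable-shared
    make
    make install     # as root
    ldconfig         # refresh the runtime linker cache
    ls /usr/local/lib/liblzo2*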
That code is in; unfortunately it doesn't quite solve the problem;
you'd need to do some more work. You'd have to write subclasses that
spit out the statistics you want. Then set the appropriate options in
hadoop-site, so that those classes get loaded.
On Wed, Oct 8, 2008 at 12:30 PM, George
I switched to lzo-2.02 package. This time liblzo2.so was built.
Now everything worked.
Thanks,
-Songting
--- On Fri, 10/10/08, Songting Chen [EMAIL PROTECTED] wrote:
From: Songting Chen [EMAIL PROTECTED]
Subject: Re: How to make LZO work?
To: core-user@hadoop.apache.org
Date: Friday,
Hi,
Often I get entries like the following (see below)
on HMR jobs using streaming. The job does appear to complete successfully
without any failed/killed tasks.
All the streaming job's mappers are reading their portion of the lines from the
same