Hello
I can't find an email address like hdfs-u...@hadoop.apach.org.
Sorry, I don't understand what you mean.
On 12/28/2013 04:52 PM, r...@fwpsystems.com wrote:
Dear Sir or Madam,
Mr. Pappert is no longer with our company.
Your e-mail will not be forwarded.
Please send your e-mail to i...@luenebits.de
You can reach us as follows
OK, maybe you mean this address is deprecated, but I keep receiving
email from hdfs-user@hadoop.apache.org.
On 12/28/2013 04:54 PM, r...@fwpsystems.com wrote:
Dear Sir or Madam,
Mr. Pappert is no longer with our company.
Your e-mail will not be forwarded.
Please
Hello
Can anybody tell me a way to unsubscribe from this list?
I can't find a way to do it.
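For what it's worth, Apache mailing lists conventionally use a `<list>-unsubscribe` address; a small shell sketch deriving it (the list address is taken from the messages above; the naming convention itself is the assumption here):

```shell
# Derive the conventional Apache unsubscribe address from the list address.
LIST="hdfs-user@hadoop.apache.org"
UNSUB="${LIST%@*}-unsubscribe@${LIST#*@}"
echo "$UNSUB"   # hdfs-user-unsubscribe@hadoop.apache.org
```

Sending an empty message to that derived address is normally enough to start the unsubscribe handshake.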
I can't find it with Google; has this been renamed?
BUILD FAILED
.../branch-0.20-append/build.xml:927: The following error
occurred while executing this line:
../branch-0.20-append/build.xml:933: exec returned: 1
Total time: 1 minute 17 seconds
+ RESULT=1
+ '[' 1 '!=' 0 ']'
+ echo 'Build Failed: 64-bit build not run'
Build Failed: 64-bit
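The `+ RESULT=1` trace above is a standard exit-status check; a minimal reconstruction of that pattern (the failing ant invocation is replaced by `false`, since the real build script is not shown):

```shell
# 'false' stands in for the failing ant build step; '|| RESULT=$?'
# captures its exit status without aborting the script.
RESULT=0
false || RESULT=$?
if [ "$RESULT" != 0 ]; then
  echo "Build Failed: 64-bit build not run"
fi
```

This matches the trace: the build command exits non-zero, `RESULT` becomes 1, and the failure message is echoed.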
I got this error. My configuration is as follows: I checked out the source
from https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-append/
This is my build script:
#!/bin/bash
VERSION=0.20.0-append
I downloaded the Cloudera CDH3 beta (hadoop-0.20.2+228) and modified three
files: hdfs.xml, core-site.xml, and hadoop-env.sh. I did set
JAVA_HOME in hadoop-env.sh, then tried to run start-dfs.sh and got
this error. The strange thing is that the namenode is running; I can't
understand why. Any help is
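For reference, the JAVA_HOME setting in hadoop-env.sh is a plain shell export; a minimal sketch (the JDK path below is an illustrative assumption, substitute your own installation's path):

```shell
# hadoop-env.sh: JAVA_HOME must point at the JDK root directory,
# not at the java binary itself. The path below is an assumption.
export JAVA_HOME=/usr/lib/jvm/java-6-sun
echo "$JAVA_HOME"
```

If this line is commented out or wrong on any node, the daemons started by start-dfs.sh on that node will fail even though locally started daemons may run.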
Hello
I followed this guide:
http://mail-archives.apache.org/mod_mbox/hadoop-common-user/201006.mbox/AANLkTileo-q8useip8y3na9pdyhlyufippr0in0lk...@mail.gmail.com
then ran:
hadoop jar hadoop-examples-0.20.2+320.jar grep input output 'dfs[a-z.]+',
and got an error: Caused by:
Hello
I use Java. I think the problem is that I can't get the lzo library
loaded, and I can't get the example to run successfully, so it is not a
programming-language problem.
On Monday, August 16, 2010 03:43:12 pm rosefinny111 wrote:
Hi friend, which language did you write the program in?
Do you mean it is in Java?
Hi,
At the very beginning I ran hadoop jar hadoop-*-examples.jar grep input output
'dfs[a-z.]+' successfully, but when I ran
nutch crawl url -dir crawl -depth 3, I got these errors:
-
Does it (hadoop-lzo) only work with Hadoop 0.20, and not with 0.21 or 0.22?
On Friday, August 06, 2010 09:05:47 am Todd Lipcon wrote:
On Thu, Aug 5, 2010 at 4:52 PM, Bobby Dennett bdenn...@gmail.com wrote:
Hi Josh,
No real pain points... just trying to investigate/research the best
way to
Hello:
I have followed this link: http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ
to install the lzo compression library, copied
hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib and all files under
../lib/native/Linux-amd64-64 to $HADOOP_HOME/lib/native/Linux-amd64-64,
then ran the example, but got
Hello:
I got the source code from http://github.com/kevinweil/hadoop-lzo, compiled
it successfully, and then:
1. copied hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib on each master and
slave;
2. copied all files under ../Linux-amd64-64/lib to
$HADOOP_HOME/lib/native/Linux-amd64-64
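A runnable sketch of steps 1 and 2 above, using stand-in files and a scratch directory in place of a real $HADOOP_HOME (the native library name libgplcompression.so comes from the hadoop-lzo build; everything else here is a placeholder):

```shell
#!/bin/sh
set -e
# Scratch directory standing in for the real $HADOOP_HOME on each node.
HADOOP_HOME="$(mktemp -d)"
mkdir -p "$HADOOP_HOME/lib/native/Linux-amd64-64"
# Stand-ins for the artifacts produced by the hadoop-lzo build.
WORK="$(mktemp -d)"
touch "$WORK/hadoop-lzo-0.4.4.jar" "$WORK/libgplcompression.so"
# Step 1: the jar goes to $HADOOP_HOME/lib on every master and slave.
cp "$WORK/hadoop-lzo-0.4.4.jar" "$HADOOP_HOME/lib/"
# Step 2: the native libraries go under lib/native/Linux-amd64-64.
cp "$WORK/libgplcompression.so" "$HADOOP_HOME/lib/native/Linux-amd64-64/"
```

On a real cluster the same two copies are repeated on every node, and the daemons must be restarted afterwards (as the reply below also notes).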
picked up straight away and that you have
to restart job-trackers and task-trackers for them to be used in
map-reduce jobs.
Good luck!
Thanks,
Jamie
On 24 July 2010 08:40, Alex Luya alexander.l...@gmail.com wrote:
Hello:
I got source code from http://github.com/kevinweil/hadoop
Hello:
when I run hadoop dfs -put src des, I get an
error: java.io.IOException: File
/user/alex/hadoop-alex-namenode-AlexLuya.log could only be replicated to
0 nodes, instead of 1. I have
checked the logs of the namenode, datanode, and secondary namenode; only
this error is present in the namenode log:
Hello
Here is the output of hadoop fsck /:
Status: HEALTHY
Total size: 0 B
Total dirs: 2
Total files: 0 (Files currently being written: 1)
Total blocks (validated): 0
Minimally replicated blocks: 0
Hello:
I got this error when putting files into HDFS. It seems to be an old
issue, and I followed the solution in this link: