On Tue, Dec 1, 2009 at 7:17 PM, pavel kolodin
pavelkolodinhad...@gmail.com wrote:
Anyways, as per HADOOP-2659, dfsadmin is only meant for admins. Thanks!
Who is the admin for hadoop? (-; I have 2 users in the OS: hadoop + root, and
I am running all commands as hadoop (-;
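For what it's worth, in Hadoop the HDFS superuser is simply the account that started the namenode, so dfsadmin commands would be run as that user. A minimal sketch, assuming a 0.20-era install where the "hadoop" OS user started the cluster (the command is echoed rather than executed, since no running cluster is assumed here):

```shell
# Sketch: dfsadmin is restricted to the HDFS superuser -- the account
# that started the namenode (assumed here to be "hadoop").
CMD="hadoop dfsadmin -report"   # prints capacity and datanode status
echo "run as the hadoop user: $CMD"
```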
Can you tell us the ENTIRE
On Tue, Dec 1, 2009 at 7:34 PM, pavel kolodin
pavelkolodinhad...@gmail.com wrote:
had...@hadoopmaster ~ $ ls -l hadoop-0.20.1
total 4876
-rw-rw-r-- 1 hadoop hadoop 344093 Sep 1 20:44 CHANGES.txt
-rw-rw-r-- 1 hadoop hadoop 13366 Sep 1 20:44 LICENSE.txt
-rw-rw-r-- 1 hadoop hadoop
Hi all,
I am interested in exploring the test folder, which is present in
src/test/org/apache/hadoop/hdfs/*
Can someone please give me the command-line syntax to run those programs?
Also, building the code with the ant command doesn't build the test folder by
default. Is there any
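A rough sketch of the usual answer, with ant target names assumed from the 0.20-era build (not verified here): a plain `ant` does not compile src/test; the test targets do, and `-Dtestcase` selects a single test class. The commands are echoed rather than executed, since no source tree is assumed:

```shell
# Sketch (assumed 0.20-era ant targets): build the test folder, then
# run a single HDFS test case by class name.
BUILD="ant compile-core-test"                 # compile src/test classes
RUN="ant test-core -Dtestcase=TestDFSShell"   # run one HDFS test case
echo "$BUILD && $RUN"
```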
Cruz
On Sat, Nov 28, 2009 at 11:20 PM, Siddu siddu.s...@gmail.com wrote:
Hi all,
I am interested in seeing each and every call trace if I issue a command,
for example: $ bin/hadoop dfs -copyFromLocal /tmp/file.txt
/user/hadoop/file.txt
or while running an M/R job.
Is there any
Hi all,
I am interested in seeing each and every call trace if I issue a command,
for example: $ bin/hadoop dfs -copyFromLocal /tmp/file.txt /user/hadoop/file.txt
or while running an M/R job.
Is there any command to sprinkle logs at the beginning of each and every
function and build the source
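There is no single switch that traces every function call, but raising the framework's log level gets most of the way there for commands like -copyFromLocal and for M/R jobs. A config sketch, with property names assumed from the 0.20-era conf/log4j.properties:

```properties
# conf/log4j.properties -- raise the root logger from INFO to DEBUG so
# framework classes (DFSClient, FSNamesystem, JobClient, ...) log much
# more of what happens internally. Property names are assumptions from
# the 0.20-era default file.
hadoop.root.logger=DEBUG,console
log4j.rootLogger=${hadoop.root.logger}
```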
On Thu, Nov 26, 2009 at 5:32 PM, Cubic cubicdes...@gmail.com wrote:
Hi list.
I have small files containing data that has to be processed. A file
can be small, even down to 10MB (but it can also be 100-600MB large),
and contains at least 3 records to be processed.
Processing one record
On Fri, Nov 27, 2009 at 6:28 AM, Mark Kerzner markkerz...@gmail.com wrote:
Hi,
It is probably described somewhere in the manuals, but
1. Where are the log files, especially those that show my
System.out.println() and errors; and
Look at the logs directory ...
2. Do I need to log
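As a sketch of where that output lands in a 0.20-era install (paths are assumptions): daemon logs sit under $HADOOP_HOME/logs, while the System.out.println() output of each map/reduce task attempt is captured separately under logs/userlogs/<attempt-id>/stdout, with errors in the sibling stderr file:

```shell
# Sketch: locate a task attempt's captured stdout (paths and the
# attempt id below are hypothetical examples, not from a real cluster).
HADOOP_LOG_DIR=${HADOOP_LOG_DIR:-/opt/hadoop-0.20.1/logs}
ATTEMPT=attempt_200911271828_0001_m_000000_0   # hypothetical attempt id
echo "$HADOOP_LOG_DIR/userlogs/$ATTEMPT/stdout"
```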
On Tue, Nov 24, 2009 at 6:01 PM, Steve Loughran ste...@apache.org wrote:
Siddu wrote:
On Thu, Nov 12, 2009 at 11:50 PM, Stephen Watt sw...@us.ibm.com wrote:
Hi Sid
Check out the Building section in this link -
http://wiki.apache.org/hadoop/HowToRelease . It's pretty straightforward.
behavior is unchanged.
Switching from one parameter value to the other does not change the mode,
owner or group of files or directories.
</description>
</property>
On Fri, Nov 20, 2009 at 1:46 AM, Jeff Zhang zjf...@gmail.com wrote:
On Fri, Nov 20, 2009 at 4:39 PM, Siddu siddu.s
as such. It's just one machine I am experimenting
on (pseudo-distributed file system)
then you can run dfs shell commands on the client machine
Jeff Zhang
On Fri, Nov 20, 2009 at 3:38 PM, Siddu siddu.s...@gmail.com wrote:
Hello all,
I am not sure if the question is framed right!
Let's say
Hello all,
I am not sure if the question is framed right!
Let's say user1 launches an instance of hadoop on a *single node*, and hence
he has permission to create/delete files on HDFS or launch M/R jobs.
Now what should I do if user2 wants to use the same instance of hadoop which
is launched by
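A minimal sketch of one common setup for a second user, with fs shell usage assumed from the 0.20-era commands: user2 mainly needs an HDFS home directory owned by them, after which jobs can be submitted with a conf/ pointing at the same namenode/jobtracker. The commands are echoed, not executed, since no cluster is assumed:

```shell
# Sketch: run as the user that started the cluster (the HDFS superuser)
# to give user2 a home directory on HDFS.
MK="hadoop fs -mkdir /user/user2"              # create user2's HDFS home
CH="hadoop fs -chown user2:user2 /user/user2"  # hand ownership to user2
echo "$MK"; echo "$CH"
```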
On Fri, Nov 13, 2009 at 6:05 PM, radar.sxl radar@gmail.com wrote:
When I run a hadoop project with the Eclipse plugin or Cygwin, it's OK. But
when I run it on the Hadoop cluster, there's an error. Such a strange
problem, I do not know why!
OK:
09/11/13 15:03:59 INFO mapred.JobClient: Running job:
Hi all,
In
src/contrib/data_join/src/java/org/apache/hadoop/contrib/utils/join/DataJoinJob.java
I found a couple of println statements (shown below) which get executed
when it is submitted as a job.
I am not sure which stdout they are printing to.
I searched in logs/* but didn't find it.
2009/11/8 Gang Luo lgpub...@yahoo.com.cn:
Hi everyone,
To check whether my hadoop program goes as I expected, I added some println
statements to my program, but it seems they don't work. Can somebody give me
a suggestion on how to output something to stdout? Thanks.
look out for a folder called
in hadoop
7) Document classification
8) Document Ranking
In fact, all batch applications that can be parallelised are suitable for
hadoop.
G Sudha Sadasivam
--- On Wed, 10/14/09, Siddu siddu.s...@gmail.com wrote:
From: Siddu siddu.s...@gmail.com
Subject: Project ideas !
To: common-user