Hi all,
I am interested in exploring the test folder, which is present in
src/test/org/apache/hadoop/hdfs/*
Can someone please give me the command-line syntax to run those programs?
Also, building the code with the ant command doesn't build the test folder by
default. Is there any place(buil
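For the test question above, a sketch of the usual 0.20-era Ant invocations (target and property names as I recall them from the stock build.xml; the test class name below is just an example, substitute one from src/test/org/apache/hadoop/hdfs):

```shell
# From the top of the Hadoop source checkout.
# Build and run the whole test suite (slow):
ant test

# Run a single test class by name (illustrative name):
ant test -Dtestcase=TestDFSShell
```

Check your branch's build.xml for the exact target names, since they shifted between releases.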
On Tue, Dec 1, 2009 at 7:34 PM, pavel kolodin
wrote:
> had...@hadoopmaster ~ $ ls -l hadoop-0.20.1
> total 4876
> -rw-rw-r-- 1 hadoop hadoop 344093 Sep 1 20:44 CHANGES.txt
> -rw-rw-r-- 1 hadoop hadoop 13366 Sep 1 20:44 LICENSE.txt
> -rw-rw-r-- 1 hadoop hadoop 101 Sep 1 20:44 NOTICE.tx
On Tue, Dec 1, 2009 at 7:17 PM, pavel kolodin
wrote:
>
> Anyway, as per HADOOP-2659, dfsadmin is only meant for admins. Thanks!
>>
>
> Who is "admin" for hadoop? (-; I have 2 users in the OS: "hadoop" + "root", and
> I am running all commands as "hadoop" (-;
>
Can you tell us the "ENTIRE command" yo
On Tue, Dec 1, 2009 at 2:55 AM, pavel kolodin wrote:
>
> Namenode won't start with this messages:
>
> hadoop-0.20.1/logs/hadoop-hadoop-namenode-hadoop_master.log:
>
> http://pastebin.com/m359b9e24
>
Look carefully at the lines below; they could be of some help:
2009-11-30 16:57:36,102 ERROR
org.ap
y of California, Santa Cruz
>
>
> On Sat, Nov 28, 2009 at 11:20 PM, Siddu wrote:
>
> > Hi all,
> >
> > I am interested to see each and every call trace when I issue a command
> >
> > for ex : $bin/hadoop dfs -copyFromLocal /tmp/file.txt
> /user/hadoop
Hi all,
I am interested to see each and every call trace when I issue a command,
for ex : $bin/hadoop dfs -copyFromLocal /tmp/file.txt /user/hadoop/file.txt
or while running an M/R job.
Is there any command to sprinkle logs at the beginning of each and every
function and build the source
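Short of instrumenting every function by hand, the closest built-in approach is turning on DEBUG logging. A sketch of the relevant conf/log4j.properties lines (standard log4j category syntax; narrow the category, e.g. to org.apache.hadoop.hdfs, to cut the noise):

```properties
# conf/log4j.properties -- verbose logging for all Hadoop classes.
# This traces most internal calls around a dfs command or M/R job.
log4j.logger.org.apache.hadoop=DEBUG
```

The daemons pick this up on restart; the client-side dfs shell reads the same file from the conf directory on its classpath.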
On Sun, Nov 29, 2009 at 5:41 AM, dzisaacs wrote:
>
> why is this folder called "server"
>
The NameNode and DataNode components basically act as the server; hence
the name is my guess!
>
> what is this folder for?
> --
> View this message in context:
> http://old.nabble.com/hadoop-0.20.1%5C
On Tue, Nov 24, 2009 at 6:10 PM, Siddu wrote:
>
>
> On Tue, Nov 24, 2009 at 6:01 PM, Steve Loughran wrote:
>
>> Siddu wrote:
>>
>>> On Thu, Nov 12, 2009 at 11:50 PM, Stephen Watt wrote:
>>>
>>> Hi Sid
>>>>
>>>> Check o
On Fri, Nov 27, 2009 at 6:28 AM, Mark Kerzner wrote:
> Hi,
>
> it is probably described somewhere in the manuals, but
>
>
> 1. Where are the log files, especially those that show my
> System.out.println() and errors; and
>
Look at the logs directory ...
> 2. Do I need to log in to every ma
On Thu, Nov 26, 2009 at 5:32 PM, Cubic wrote:
> Hi list.
>
> I have small files containing data that has to be processed. A file
> can be small, even down to 10MB (but it can also be 100-600MB large)
> and contains at least 3 records to be processed.
> Processing one record can take 30 second
behavior is unchanged.
>Switching from one parameter value to the other does not change the
> mode,
>owner or group of files or directories.
>
>
>
> On Fri, Nov 20, 2009 at 1:46 AM, Jeff Zhang wrote:
>
> > On Fri, Nov 20, 2009 at 4:39 PM, Siddu wrote:
>
On Tue, Nov 24, 2009 at 6:01 PM, Steve Loughran wrote:
> Siddu wrote:
>
>> On Thu, Nov 12, 2009 at 11:50 PM, Stephen Watt wrote:
>>
>> Hi Sid
>>>
>>> Check out the "Building" section in this link -
>>> http://wiki.apache.org/hado
resolve] :: resolving dependencies ::
org.apache.hadoop#Hadoop;work...@sushanth-laptop
[ivy:resolve] confs: [common]
Please help me out!
> Kind regards
> Steve Watt
>
>
>
> From:
> Siddu
> To:
> common-user@hadoop.apache.org
> Date:
> 11/12/2009 12:14 PM
> Subject
It is just one machine I am experimenting
on (pseudo-distributed file system).
Then you can run dfs shell commands on the client machine.
>
>
> Jeff Zhang
>
>
>
> On Fri, Nov 20, 2009 at 3:38 PM, Siddu wrote:
>
> > Hello all,
> >
> > I am not sure if the qu
Hello all,
I am not sure if the question is framed right!
Let's say user1 launches an instance of hadoop on a *single node*, and hence
he has permission to create and delete files on hdfs or launch M/R jobs.
Now what should I do if user2 wants to use the same instance of hadoop which
is launched by
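Assuming the question is about letting a second OS user share one running HDFS instance, one common pattern is sketched below (user names and paths are examples, not anything prescribed by the source):

```shell
# Run as the user that started the daemons (e.g. "hadoop"):
# create a home directory for the second user and hand over ownership.
bin/hadoop dfs -mkdir /user/user2
bin/hadoop dfs -chown user2 /user/user2

# user2 can then use the dfs shell from any client configured
# with the same fs.default.name:
bin/hadoop dfs -ls /user/user2
```

On a single pseudo-distributed node, "client" can simply be a second shell session logged in as user2.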
On Fri, Nov 13, 2009 at 6:05 PM, radar.sxl wrote:
>
> When I run a hadoop project with the Eclipse plugin or Cygwin, it's OK. But
> when
> I run it in the Hadoop cluster, there's an error. Such a strange problem, I do
> not know why!
>
> OK:
> 09/11/13 15:03:59 INFO mapred.JobClient: Running job: job_200911131503_0
On Sun, Nov 15, 2009 at 1:03 AM, Mike Kendall wrote:
> for some reason I never tried lowering my number of map and reduce tasks
> until now. Looks like I need to reconfigure my cluster, since it runs fine
> with only 3 map tasks and 3 reduce tasks.
>
> :X
>
> On Sat, Nov 14, 2009 at 11:22 AM, Mik
Hi all,
I want to build hadoop from source rather than downloading the already
built tarball.
Can someone please give me the steps or a link to some pointers.
Thanks in advance
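A minimal sketch of one way to do it for the 0.20 line (the repository URL is from memory, so verify it against the Hadoop version-control page; target names are the usual Ant ones):

```shell
# Check out a release tag of the source:
svn checkout \
  http://svn.apache.org/repos/asf/hadoop/common/tags/release-0.20.1/ \
  hadoop-0.20.1
cd hadoop-0.20.1

# Compile the core with the default Ant target:
ant

# Or build a distributable tarball:
ant tar
```

You need a JDK and Apache Ant on the path; the first build downloads dependencies via Ivy, as the [ivy:resolve] output quoted elsewhere in this digest shows.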
--
Regards,
~Sid~
I have never met a man so ignorant that I couldn't learn something from him
Hi all,
In
src/contrib/data_join/src/java/org/apache/hadoop/contrib/utils/join/DataJoinJob.java
I found a couple of println statements (shown below) which get executed
when a job is submitted.
I am not sure which stdout they are printing to.
I searched in logs/* but didn't find it.
2009/11/8 Gang Luo :
> Hi everyone,
> To check whether my hadoop program goes as I expected, I add some "println"
> in my program. But it seems they don't work. Can somebody give me a
> suggestion on how to output something to stdout? Thanks.
>
Look out for a folder called
logs/userlogs//attempt_20
in hadoop.
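To spell that out a bit: stdout from code running inside a map or reduce task goes to per-attempt files on the node that ran the task, not to the client console. A sketch (the attempt id is illustrative, built from a job id like the one quoted earlier in this digest):

```shell
# On the tasktracker node, relative to the hadoop install directory
# (the attempt id below is a made-up example):
ls logs/userlogs/attempt_200911131503_0001_m_000000_0/
# typically contains: stdout  stderr  syslog

cat logs/userlogs/attempt_200911131503_0001_m_000000_0/stdout
```

println output from the job *client* (driver code run at submit time) does go to your terminal; only code executed inside tasks lands in these files.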
> 7) Document classification
> 8) Document Ranking
>
> In fact, all batch applications that can be parallelised are suitable for
hadoop.
> G Sudha Sadasivam
>
>
>
> --- On Wed, 10/14/09, Siddu wrote:
>
>
> From: Siddu
> Subject: Project ideas !
> T
Hello Hadoop Users,
A friend of mine and I are looking for project ideas based on hadoop,
as part of our curriculum.
Could you give us some pointers, please?
Thanks in advance!
Regards,
~Sid~