15:01:16 INFO mapred.JobClient: Map output records=3435
PASSED !!!
-bash-4.1$ bin/hadoop jar WordCount.jar
Thanks in advance for your help.
GYY
From: YIMEN YIMGA Gael ItecCsySat
Sent: Thursday 11 September 2014 12:13
To: 'user@hadoop.apache.org'
Subject: RE: Error when executing
mapred.JobClient: Virtual memory (bytes) snapshot=11883048960
// config.set("fs.default.name", "hdfs://latdevweb02:9000/");
// config.set("mapred.job.tracker", "latdevweb02:9001");
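If the WordCount driver goes through Hadoop's GenericOptionsParser (i.e. implements Tool), the same two settings can be passed on the command line instead of being hard-coded. A sketch, reusing the host and ports from the commented-out lines above; the class name and input/output paths are illustrative, not from the thread:

```shell
# Point the client at the remote NameNode (-fs) and JobTracker (-jt)
# via Hadoop 1.x generic options; requires the driver to implement Tool.
bin/hadoop jar WordCount.jar WordCount \
  -fs hdfs://latdevweb02:9000/ \
  -jt latdevweb02:9001 \
  input output   # illustrative HDFS paths
```

This only needs a running cluster at those addresses; nothing has to be recompiled to retarget the job.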
Please could you advise
Standing by...
GYY
From: YIMEN YIMGA Gael ItecCsySat
Sent: Wednesday 10 September 2014 15:10
To: user@hadoop.apache.org
Subject: Error when executing
...@gmail.commailto:skhurana...@gmail.com wrote:
Check the log file at ./hadoop/hadoop-datanode-latdevweb02.out (as per your
last screenshot). There can be various reasons for the datanode not starting;
the real issue will be logged in that file.
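As a sketch, two quick checks along those lines (the log path is the one quoted in the thread; on a stock install the file name follows the hadoop-&lt;user&gt;-datanode-&lt;host&gt; pattern):

```shell
# Is a DataNode JVM running on this box at all?
jps | grep -i datanode
# If not, the reason is usually near the end of the DataNode log
tail -n 50 ./hadoop/hadoop-datanode-latdevweb02.out
```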
On Tue, Sep 9, 2014 at 10:06 PM, YIMEN YIMGA Gael
Hello Hadoopers,
Here is the error I'm facing when running the WordCount example program I
wrote myself.
Kindly find attached my WordCount program file.
Below the error.
of the
program? For example, using the hadoop fs -ls command? Also, were this path
and the files in it created by a different user?
The exception seems to say that it does not exist, or that the running user
does not have permission to read it.
Regards,
Shahab
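A hypothetical way to run the checks suggested above from the shell; the path is illustrative, not from the thread:

```shell
# Does the input path exist in HDFS, and who owns the files in it?
bin/hadoop fs -ls /user/gael/input   # illustrative path
# Ownership and permissions of the parent directories
bin/hadoop fs -ls /user
```

The owner/group and permission columns in the listing show whether the running user can read the files.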
On Wed, Sep 10, 2014 at 9:09 AM, YIMEN YIMGA Gael
Hello Dear hadoopers,
I hope you are doing well.
I tried to run the WordCount.jar file to get experience running Hadoop jobs.
After launching the program as shown in the screenshot below, I get the
message in the screenshot.
The job tries to connect to the datanode, but failed after 10 attempts.
Did you set up passphrase-less ssh access to localhost by generating keys,
etc.?
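For reference, the usual passphrase-less ssh setup for a single-node install is a sketch like this (standard OpenSSH commands; note it overwrites ~/.ssh/id_rsa if one already exists):

```shell
# Generate an RSA key with an empty passphrase
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# Authorize that key for logins to this same machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
# Should now log in without prompting for a password
ssh localhost exit
```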
On Sep 9, 2014 7:18 PM, YIMEN YIMGA Gael
gael.yimen-yi...@sgcib.com wrote:
Sent: Tuesday 9 September 2014 17:27
To: user@hadoop.apache.org
Subject: Re: Error and problem when running a hadoop job
check whether datanode is started.
On Tue, Sep 9, 2014 at 7:26 PM, YIMEN YIMGA Gael
gael.yimen-yi...@sgcib.com wrote:
Yes, all about ssh access
From: YIMEN YIMGA Gael ItecCsySat
Sent: Friday 29 August 2014 12:12
To: 'hdfs-u...@hadoop.apache.org'
Subject: Problem in using Hadoop Eclipse Plugin
Hello Dears,
I did the same, but I'm facing the same error when launching the program
Here is the error
Hello dear hadoopers :),
I'm currently configuring Eclipse to work with Hadoop.
I set up a Single node cluster (all the hadoop services are functioning on the
node).
That server is different from the computer where Eclipse is installed.
When I tried to create a new project in Eclipse, the
Hello,
I can share a tip that I used to deal with this.
If you can calculate the number of nodes you'll need after a year, then you
should build the cluster with that number of nodes from the start. ☺
Warm regards
From: Devaraj K [mailto:deva...@apache.org]
Sent: Tuesday 22 July 2014 16:46
To:
https://www.mail-archive.com/search?l=user%40hadoop.apache.org&q=YIMEN+YIMGA+Gael
https://www.mail-archive.com/user%40hadoop.apache.org/msg15411.html
The Hadoop mailing list is not a hardware shop. The best way to know the
price is to ask vendors for quotes and/or check the prices on their websites.
Regards
Bertrand
Hello All,
I need your experience to evaluate the CPU requirements for a Hadoop
cluster.
How should I go about this properly?
Warm regards
GYY
for the data on the cluster.
Oner
On 9 Jul 2014 at 19:00, YIMEN YIMGA Gael
gael.yimen-yi...@sgcib.com wrote:
Hello Dear,
I made an estimate of the number of nodes for a cluster fed with 720GB of
data/day.
My estimate came to 367 datanodes
How did you calculate the number of 367 datanodes?
Cheers,
Mirko
formats?
Cheers,
Mirko
2014-07-10 10:43 GMT+02:00 YIMEN YIMGA Gael
gael.yimen-yi...@sgcib.com:
Hi,
What does « 1.3 for overhead » mean in this calculation ?
Regards
From: Mirko Kämpf [mailto:mirko.kae...@gmail.com]
Sent: Wednesday 9
In addition, when I apply the compression factor of 8, the daily feed comes
to 87GB/day.
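For what it's worth, the 367 figure can be reproduced with back-of-the-envelope arithmetic. A sketch, assuming the HDFS default replication factor of 3 and roughly 2.8TB of usable disk per datanode; the per-node capacity and replication factor are my assumptions, not stated in the thread:

```shell
DAILY_GB=720        # daily feed, from the thread
DAYS=365            # one year of growth
REPLICATION=3       # HDFS default replication factor (assumed)
OVERHEAD_X10=13     # the "1.3 for overhead" factor, times 10 for integer math
NODE_USABLE_GB=2800 # assumed usable capacity per datanode (hypothetical)

RAW_GB=$((DAILY_GB * DAYS))
TOTAL_GB=$((RAW_GB * REPLICATION * OVERHEAD_X10 / 10))
# Round up to whole nodes
NODES=$(( (TOTAL_GB + NODE_USABLE_GB - 1) / NODE_USABLE_GB ))
echo "$NODES nodes"   # 367 with these assumptions
```

With the ~87GB/day compressed feed from the message above in place of DAILY_GB, the same formula gives roughly 45 nodes under the same assumptions.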
From: YIMEN YIMGA Gael ItecCsySat
Sent: Thursday 10 July 2014 11:11
To: user@hadoop.apache.org
Subject: RE: Need to evaluate a cluster
Thanks for your reply, Mirko.
In my case, I can consider compression
disks, or you'll spend your money
on cooling / power consumption and potentially on building a new DC ;).
A typical server from a tier 1 vendor ( HP, Dell, IBM, Cisco ) should be around
5k euros ( fully loaded with HDD ).
Kind regards,
Olivier
On 10 July 2014 11:10, YIMEN YIMGA Gael
gael.yimen-yi
in the long run.
Have a look into the book:
http://www.amazon.de/Hadoop-Operations-Eric-Sammer/dp/1449327052
Cheers,
Mirko
Hello Dear,
I made an estimate of the number of nodes for a cluster fed with 720GB of
data/day.
My estimate came to 367 datanodes in a year. I'm a bit afraid of that number
of datanodes.
The assumptions I used are the following:
- Daily supply (feed) : 720GB
-
Hello Dear all,
I would like to evaluate the price of a Hadoop cluster using the below
characteristics for my Namenode and for my Datanode.
My cluster should have one Namenode and three Datanodes.
Could someone help me with the price of commodity hardware with these
characteristics, please?
From: Cristobal Giadach [mailto:cgiada...@gmail.com]
Sent: Thursday 3 July 2014 17:32
To: user@hadoop.apache.org
Subject: Re: Need to evaluate the price of a Hadoop cluster
Are you using Hadoop 2.x?
What about your secondary namenode?
On Jul 3, 2014 11:19 AM, YIMEN YIMGA Gael
gael.yimen-yi