currently
> being voted on in the community.
>
> You can read more here:
>
> http://www.cloudera.com/blog/2012/03/high-availability-for-the-hadoop-distributed-file-system-hdfs/
>
> -Todd
>
> On Mon, May 21, 2012 at 11:24 PM, Martinus Martinus
> wrote:
> > Hi,
> >
did you just notice the
> log on the DN after running a job?
>
> On Tue, Jan 31, 2012 at 11:24 AM, Martinus Martinus
> wrote:
> > Hi,
> >
> > I have my hadoop clusters running with 1 master and 6 slaves, and when I
> run
> > bin/hadoop jar hadoop-example
Hi,
I have my hadoop cluster running with 1 master and 6 slaves, and when I
run bin/hadoop jar hadoop-examples-1.0.0.jar wordcount input output, I get
the following error message in my slave datanode logs:
java.net.SocketTimeoutException: 63000 millis timeout while waiting for
channel to be r
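One partial mitigation for read timeouts like this (the root cause is usually network saturation or an overloaded DataNode) is to raise the DFS socket timeouts. A minimal sketch, assuming Hadoop 1.x property names (check hdfs-default.xml for your version); it writes a fresh hdfs-site.xml under a scratch conf dir, so merge the properties into your real file rather than overwriting it:

```shell
# Assumption: Hadoop 1.x property names (dfs.socket.timeout,
# dfs.datanode.socket.write.timeout); values are in milliseconds.
CONF_DIR=${HADOOP_CONF_DIR:-./conf}
mkdir -p "$CONF_DIR"

# Write the two timeout properties, raised from the 60000 ms default.
cat > "$CONF_DIR/hdfs-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <!-- client/datanode read timeout (default 60000 ms) -->
  <property>
    <name>dfs.socket.timeout</name>
    <value>180000</value>
  </property>
  <!-- datanode write timeout -->
  <property>
    <name>dfs.datanode.socket.write.timeout</name>
    <value>180000</value>
  </property>
</configuration>
EOF
echo "wrote $CONF_DIR/hdfs-site.xml"
```

Restart the daemons after changing the config; a growing timeout usually still deserves a look at network load rather than just a bigger value.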
, Martinus Martinus wrote:
> Hi Michael/Mohamed,
>
> Thanks for your explanation. How about if I got this on my master node
> jobtracker logs?
>
> http://pastie.org/3211911
>
> What does it mean? I am using hadoop to do map/reduce on a MongoDB database.
>
> Thanks.
>
data nodes. So I did a fresh install
> like above.
>
>
> Thanks.
>
> On Thu, Jan 19, 2012 at 10:06 AM, Martinus Martinus
> wrote:
> > Hi Mohamed,
> >
> > Thanks for your suggestion, but how do we move the ownership of our
> hadoop
> > installati
say...@bibalex.org> wrote:
> Run hadoop using the Martinus account, not the root account. It should work
> for you then. If anything goes wrong, don't hesitate to write it here.
> Thank you.
>
> Mohamed Elsayed
> Bibliotheca Alexandrina
>
>
> On 01/18/2012 01:16 PM, M
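The truncated question above asks how to hand ownership of the Hadoop install to the non-root account. A dry-run sketch of the usual procedure; the user name "martinus" and the install path are assumptions from this thread, not confirmed values:

```shell
# Assumptions: daemon user "martinus", install at /usr/local/hadoop -- adjust both.
HADOOP_USER=${HADOOP_USER:-martinus}
HADOOP_PREFIX=${HADOOP_PREFIX:-/usr/local/hadoop}

# Commands to run as root (printed here as a dry run rather than executed):
CHOWN_CMD="chown -R ${HADOOP_USER}:${HADOOP_USER} ${HADOOP_PREFIX}"
echo "$CHOWN_CMD"
echo "su - ${HADOOP_USER} -c '${HADOOP_PREFIX}/bin/start-all.sh'"
```

Remember that the directories behind dfs.name.dir, dfs.data.dir, and hadoop.tmp.dir need the same ownership change, or the daemons will die with permission errors.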
m/java-6-sun/bin, you will notice bin/java is a symbolic link to
> jre/bin/java. So both of them are the same. Don't worry about that.
>
> Mohamed Elsayed
> Bibliotheca Alexandrina
>
>
> On 01/18/2012 10:25 AM, Martinus Martinus wrote:
>
> Hi Mohamed,
>
> T
is not necessary to fill.
>
> If any error messages appear after executing a command, try to fix them or
> write them here. I will try to help if I can.
>
> Mohamed Elsayed
> Bibliotheca Alexandrina
>
>
> On 01/18/2012 04:28 AM, Martinus Martinus wrote:
>
> Hi
Hi,
I ran my hadoop job, and when I opened my datanode logs I found many of
these messages:
2012-01-17 23:59:32,756 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode:
java.net.SocketTimeoutException: Call to uvm12dk/172.16.4.147:9000 failed
on socket timeout exception: java.net.SocketTimeoutEx
you faced any hassles, don't
> hesitate to state them here.
>
> Mohamed Elsayed
> Bibliotheca Alexandrina
>
>
> On 01/16/2012 04:52 AM, Martinus Martinus wrote:
>
> Hi Harsh,
>
> I just reinstalled my master node again and it works now. But
> wh
n 13, 2012 at 8:22 PM, Harsh J wrote:
> What does the NameNode log in $HADOOP_HOME/logs/hadoop-*namenode*.log
> carry?
>
> On 13-Jan-2012, at 4:05 PM, Martinus Martinus wrote:
>
> > Hi,
> >
> > I start-all.sh my hadoop master node, but I can't find any namenode
Hi,
I ran start-all.sh on my hadoop master node, but I can't find any namenode
process on it. Would anyone be so kind as to tell me how to fix this problem?
Thanks.
hough it is still being worked upon presently.
>>
>> For now, we recommend using multiple ${dfs.name.dir} directories
>> (across mounts), preferably one of them being a reliable-enough NFS
>> point.
>>
>> On Wed, Jan 4, 2012 at 2:26 PM, Martinus Martinus
>>
arath Mundlapudi wrote:
> You might want to check the datanode logs. Go to the 3 remaining nodes
> which didn't start and restart the datanode.
>
> -Bharath
>
>
> On Sun, Jan 1, 2012 at 7:23 PM, Martinus Martinus
> wrote:
>
>> Hi,
>>
>> I have setup
Hi Prashant,
Thanks also for your advice. I have it working now: I deleted the data
folder inside hadoop.tmp.dir and ran it again, and it now reports a total of
4 nodes.
Thanks and Happy New Year 2012.
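The fix described above, wiping the data folder under hadoop.tmp.dir and restarting, can be sketched as follows. The paths are assumptions (check hadoop.tmp.dir in your core-site.xml), and note this destroys all HDFS data, so it is only sane on a test cluster:

```shell
# DESTRUCTIVE: removes all HDFS block data. Paths are assumptions.
HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}
HADOOP_TMP=${HADOOP_TMP:-/tmp/hadoop-$(id -un)}

echo "would remove: $HADOOP_TMP/dfs/data (on every datanode)"
if [ -x "$HADOOP_HOME/bin/stop-all.sh" ]; then
  "$HADOOP_HOME"/bin/stop-all.sh
  rm -rf "$HADOOP_TMP"/dfs/data                # repeat on each datanode
  "$HADOOP_HOME"/bin/hadoop namenode -format   # master only
  "$HADOOP_HOME"/bin/start-all.sh
else
  echo "hadoop not found at $HADOOP_HOME; nothing removed"
fi
```

The wipe works because a datanode whose storage namespace ID no longer matches the (re)formatted namenode refuses to join; clearing the data directory lets it register fresh.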
On Mon, Jan 2, 2012 at 2:15 PM, Martinus Martinus wrote:
> Hi Harsh J,
>
&
Could be that you may not have propagated
> configurations properly, or could be that you have a firewall you need to
> turn off/configure, to let the DataNodes communicate with the NameNode.
>
> On 02-Jan-2012, at 8:53 AM, Martinus Martinus wrote:
>
> Hi,
>
> I have setup
Hi,
I have set up a hadoop cluster with 4 nodes, ran start-all.sh, and checked
every node; a tasktracker and datanode are running on each. But when I run
hadoop dfsadmin -report it says:
Configured Capacity: 30352158720 (28.27 GB)
Present Capacity: 3756392448 (3.5 GB)
DFS Remaining:
:46 WARN mapred.JobClient: Error reading task
outputConnection timed out
11/12/26 12:12:07 WARN mapred.JobClient: Error reading task
outputConnection timed out
Would you be so kind as to tell me how to fix this?
Thanks.
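"Error reading task output ... Connection timed out" comes from the JobClient failing to fetch task logs over a TaskTracker's HTTP port (50060 by default), commonly a firewall or hostname-resolution issue on the slaves. A probe sketch over the slaves file; the port and conf layout are assumptions for a default 1.x install:

```shell
# Assumptions: default conf layout, TaskTracker HTTP on port 50060.
HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}
TT_PORT=50060
CHECKED=0

for SLAVE in $(cat "$HADOOP_HOME"/conf/slaves 2>/dev/null); do
  CHECKED=$((CHECKED + 1))
  getent hosts "$SLAVE" >/dev/null || echo "$SLAVE: hostname does not resolve"
  nc -z -w 5 "$SLAVE" "$TT_PORT" 2>/dev/null || echo "$SLAVE: port $TT_PORT unreachable"
done
echo "probed $CHECKED slave(s)"
```

A slave hostname that resolves to 127.0.0.1 in /etc/hosts produces exactly this symptom, since the client then tries to fetch the logs from itself.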
On Mon, Dec 26, 2011 at 10:31 AM, Martinus Martinus
wrote:
> Hi Joey,
>
or the HDFS services and mapred for the
> MapReduce ones.
>
> -Joey
>
> On Fri, Dec 23, 2011 at 4:04 AM, Martinus Martinus
> wrote:
> > Hi Ayon,
> >
> > I tried to setup the hadoop-cluster using hadoop-0.20.2 and it seem's to
> be
> > ok, but when I tried
to the LD_LIBRARY_PATH
> export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server
>
>
> 2011/12/23 Martinus Martinus
>
>> Hi,
>>
>> I tried to setup the hadoop-cluster using hadoop-0.20.2 and it seem's to
>> be ok, but when I tried to used another version of hadoop,
Hi,
I tried to set up the hadoop cluster using hadoop-0.20.2 and it seems to be
OK, but when I tried to use another version of hadoop, such as
hadoop-0.20.3, start-all.sh gave me an error like this:
uvm12dk: Unrecognized option: -jvm
uvm12dk: Could not create the Java virtual machin
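A well-known cause of "Unrecognized option: -jvm" in the 0.20.20x/1.0 line is starting the DataNode as root: the launch script then adds a "-jvm server" flag intended for the secure (jsvc) startup path, and a plain JVM rejects it. (There is also no Apache hadoop-0.20.3 release; 0.20.203 or 0.20.205 is probably what is meant.) A quick check sketch:

```shell
# If this prints the root warning, start the daemons as a regular user
# (or set up the secure jsvc-based startup) instead.
RUN_UID=$(id -u)
if [ "$RUN_UID" -eq 0 ]; then
  echo "running as root: the datanode script will pass -jvm and fail"
else
  echo "running as uid $RUN_UID: plain startup, no -jvm flag"
fi
```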
s on Flickr <http://www.flickr.com/photos/ayonsinha/>
> Also check out my Blog for answers to commonly asked
> questions.<http://dailyadvisor.blogspot.com>
>
> --
> *From:* Martinus Martinus
> *To:* hdfs-user@hadoop.apache.org
> *Sent:
Hi,
I have a hadoop cluster running and my data is inside a mongodb database. I
have already written Java code to query data on mongodb using the
mongodb-java driver. Now I want to use the hadoop cluster to run my Java
code to get and put data from and to the mongo database. Has anyone done this
b