Hi Martinus,

I am not sure what causes this exception. Do you think you could attach the namenode's and datanodes' log files? They might lead us to the cause of the exception. I can't say there is a connectivity problem between the namenode and the datanodes, because you said the file exists on the datanodes!
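For reference, the logs usually live under $HADOOP_HOME/logs on each machine. A quick way to pull the relevant entries, assuming a default 0.20-style layout (the HADOOP_HOME path below is only an example, adjust it for your install):

```shell
# Assumed install location; change this for your cluster.
HADOOP_HOME=/usr/local/hadoop

# Namenode log on the master:
tail -n 100 $HADOOP_HOME/logs/hadoop-*-namenode-*.log

# Datanode log on the node that was excluded (172.16.2.130 in your output):
tail -n 100 $HADOOP_HOME/logs/hadoop-*-datanode-*.log

# The abandoned block id from the client output is worth grepping for
# on the datanode side:
grep blk_5493791950927576696 $HADOOP_HOME/logs/hadoop-*-datanode-*.log
```

The lines around that block id in the datanode log are usually the most informative part.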

Mohamed Elsayed
Bibliotheca Alexandrina

On 01/30/2012 11:55 AM, Martinus Martinus wrote:
Hi Mohamed,

I got this when I tried bin/hadoop dfs -put single.txt input/ :

12/01/30 17:52:40 INFO hdfs.DFSClient: Exception in createBlockOutputStream 172.16.4.85:50010 java.io.IOException: Bad connect ack with firstBadLink as 172.16.2.130:50010
12/01/30 17:52:40 INFO hdfs.DFSClient: Abandoning block blk_5493791950927576696_1004
12/01/30 17:52:40 INFO hdfs.DFSClient: Excluding datanode 172.16.2.130:50010

but when I checked on each node using bin/hadoop dfs -lsr, single.txt is there inside the input/ folder. How can I get rid of those exceptions?

Thanks.

On Thu, Jan 19, 2012 at 2:55 PM, Martinus Martinus <martinus...@gmail.com> wrote:

    Hi Michael/Mohamed,

    Thanks for your explanation. What about this in my master node's
    jobtracker logs?

    http://pastie.org/3211911

    What does it mean? I used Hadoop to run map/reduce on a MongoDB
    database.

    Thanks.


    On Thu, Jan 19, 2012 at 10:17 AM, Michael Lok <fula...@gmail.com> wrote:

        Hi Martinus,

        While you can run two instances of Hadoop on the same machine,
        you'll need to reconfigure the listening ports for MR/HDFS and
        the webapps so they run on different ports.
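        As a sketch, the second instance would get its own conf directory
        with its own ports; these are the 0.20-era properties involved,
        and the port numbers here are arbitrary examples:

        ```xml
        <!-- core-site.xml: HDFS RPC port for the second instance -->
        <property>
          <name>fs.default.name</name>
          <value>hdfs://localhost:9001</value>
        </property>

        <!-- hdfs-site.xml: namenode web UI and datanode data port -->
        <property>
          <name>dfs.http.address</name>
          <value>0.0.0.0:50071</value>
        </property>
        <property>
          <name>dfs.datanode.address</name>
          <value>0.0.0.0:50011</value>
        </property>

        <!-- mapred-site.xml: jobtracker RPC and web UI ports -->
        <property>
          <name>mapred.job.tracker</name>
          <value>localhost:9002</value>
        </property>
        <property>
          <name>mapred.job.tracker.http.address</name>
          <value>0.0.0.0:50031</value>
        </property>
        ```

        The datanode's http/ipc ports (dfs.datanode.http.address,
        dfs.datanode.ipc.address) and the tasktracker web UI
        (mapred.task.tracker.http.address) need the same treatment.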

        The easier way to run Hadoop as a different user is:

        - stop the current Hadoop instance
        - su as the intended user
        - re-install Hadoop and reformat the namenode as that user.
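        The steps above might look roughly like this (the user name and
        tarball version below are just examples):

        ```shell
        # Stop the instance currently running as root.
        bin/stop-all.sh

        # Switch to the intended user.
        su - hadoopuser

        # Unpack a fresh copy owned by that user, then reformat HDFS
        # so the metadata is also owned by that user.
        tar xzf hadoop-0.20.203.0.tar.gz
        cd hadoop-0.20.203.0
        bin/hadoop namenode -format
        bin/start-all.sh
        ```

        Reformatting wipes HDFS, so this only makes sense when, as
        below, there is no data worth keeping on the datanodes.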

        I tried running chown on my existing Hadoop installation. It had
        some permission problems, but I didn't really bother to
        troubleshoot since I didn't have any data in the datanodes. So I
        did a fresh install as above.


        Thanks.

        On Thu, Jan 19, 2012 at 10:06 AM, Martinus Martinus
        <martinus...@gmail.com> wrote:
        > Hi Mohamed,
        >
        > Thanks for your suggestion, but how do we move the ownership
        > of our hadoop installation from root to another user account?
        > And can we install two different versions of hadoop on the
        > same machine?
        >
        > Thanks.
        >
        >
        > On Wed, Jan 18, 2012 at 8:30 PM, Mohamed Elsayed
        > <mohammed.elsay...@bibalex.org> wrote:
        >>
        >> Run Hadoop using the Martinus account, not the root account.
        >> It should work for you. If anything goes wrong, don't
        >> hesitate to write it here. Thank you.
        >>
        >> Mohamed Elsayed
        >> Bibliotheca Alexandrina
        >>
        >>
        >> On 01/18/2012 01:16 PM, Martinus Martinus wrote:
        >>
        >> Hi Mohamed,
        >>
        >> Please find attached the namenode log.
        >>
        >> Thanks.
        >>
        >> On Wed, Jan 18, 2012 at 7:02 PM, Mohamed Elsayed
        >> <mohammed.elsay...@bibalex.org> wrote:
        >>>
        >>> Hi Martinus,
        >>>
        >>> Do you think you could attach the namenode's log file so we
        >>> can find out where the problem is? Thank you.
        >>>
        >>> Mohamed Elsayed
        >>> Bibliotheca Alexandrina
        >>>
        >>>
        >>> On 01/18/2012 11:14 AM, Martinus Martinus wrote:
        >>>
        >>> Hi Mohamed,
        >>>
        >>> Thanks for your explanation. I still have the same problem
        >>> as before. Do you have any other suggestions?
        >>>
        >>> Thanks.
        >>>
        >>> On Wed, Jan 18, 2012 at 4:52 PM, Mohamed Elsayed
        >>> <mohammed.elsay...@bibalex.org> wrote:
        >>>>
        >>>> The core issue is locating bin/java. If you execute ls -l
        >>>> /usr/lib/jvm/java-6-sun/bin, you will notice that bin/java
        >>>> is a symbolic link to jre/bin/java, so both of them are the
        >>>> same. Don't worry about that.
        >>>>
        >>>> Mohamed Elsayed
        >>>> Bibliotheca Alexandrina
        >>>>
        >>>>
        >>>> On 01/18/2012 10:25 AM, Martinus Martinus wrote:
        >>>>
        >>>> Hi Mohamed,
        >>>>
        >>>> Thanks for your answer. I had set it up before as something
        >>>> like this:
        >>>>
        >>>> JAVA_HOME=/usr/lib/jvm/java-6-sun
        >>>>
        >>>> I don't know if that should be the same or will give a
        >>>> different result. I'll try it first.
        >>>>
        >>>> Thanks.
        >>>>
        >>>> On Wed, Jan 18, 2012 at 3:53 PM, Mohamed Elsayed
        >>>> <mohammed.elsay...@bibalex.org> wrote:
        >>>>>
        >>>>> Hi Martinus,
        >>>>>
        >>>>> As I remember, I started again from scratch and erased
        >>>>> everything related to hadoop on my machine. In my first
        >>>>> run, an error message (related to Java) was displayed after
        >>>>> executing bin/hadoop namenode -format. This message didn't
        >>>>> appear in my second run, so I believe your problem is at
        >>>>> this point. Keep your eyes on what happens after executing
        >>>>> this command. FYI, you must set JAVA_HOME in
        >>>>> conf/hadoop-env.sh by doing the following:
        >>>>>
        >>>>> file `which java`
        >>>>> file /etc/alternatives/java
        >>>>> JAVA_HOME=/usr/lib/jvm/java-6-openjdk/jre
        >>>>>
        >>>>> If you still can't run the example, try setting
        >>>>> HADOOP_CLASSPATH in conf/hadoop-env.sh to the example's
        >>>>> path, e.g.
        >>>>> HADOOP_CLASSPATH=../hadoop-0.20.203.0/hadoop-examples-0.20.203.0.jar.
        >>>>> But this variable is not required.
        >>>>>
        >>>>> If any error messages appear after executing any command,
        >>>>> try to fix them or write them here. I will try to help if
        >>>>> I can.
        >>>>>
        >>>>> Mohamed Elsayed
        >>>>> Bibliotheca Alexandrina
        >>>>>
        >>>>>
        >>>>> On 01/18/2012 04:28 AM, Martinus Martinus wrote:
        >>>>>
        >>>>> Hi Mohamed,
        >>>>>
        >>>>> Did you find out how to handle this? When I run
        >>>>> hadoop-0.20.2-examples.jar wordcount input output, it just
        >>>>> runs, but it doesn't give any log messages and it won't
        >>>>> stop running.
        >>>>>
        >>>>> Thanks.
        >>>>>
        >>>>> On Tue, Jan 17, 2012 at 7:29 PM, Mohamed Elsayed
        >>>>> <mohammed.elsay...@bibalex.org> wrote:
        >>>>>>
        >>>>>> Your case happened to me in the past. First, try to start
        >>>>>> again from scratch and be sure that everything is going
        >>>>>> well (e.g. the namenode is running and there is no Java
        >>>>>> error), specifically after executing bin/hadoop namenode
        >>>>>> -format. If you face any hassles, don't hesitate to state
        >>>>>> them here.
        >>>>>>
        >>>>>> Mohamed Elsayed
        >>>>>> Bibliotheca Alexandrina
        >>>>>>
        >>>>>>
        >>>>>> On 01/16/2012 04:52 AM, Martinus Martinus wrote:
        >>>>>>
        >>>>>> Hi Harsh,
        >>>>>>
        >>>>>> I just reinstalled my master node and it's working now.
        >>>>>> But when I tried to run the example using the command
        >>>>>> bin/hadoop jar hadoop-0.20.2-examples.jar wordcount input
        >>>>>> output, it just waited, didn't give me anything, and
        >>>>>> didn't stop either.
        >>>>>>
        >>>>>> Thanks.
        >>>>>>
        >>>>>> On Fri, Jan 13, 2012 at 8:22 PM, Harsh J
        >>>>>> <ha...@cloudera.com> wrote:
        >>>>>>>
        >>>>>>> What does the NameNode log in
        >>>>>>> $HADOOP_HOME/logs/hadoop-*namenode*.log carry?
        >>>>>>>
        >>>>>>> On 13-Jan-2012, at 4:05 PM, Martinus Martinus wrote:
        >>>>>>>
        >>>>>>> > Hi,
        >>>>>>> >
        >>>>>>> > I ran start-all.sh on my hadoop master node, but I
        >>>>>>> > can't find any namenode on it. Would anyone be so kind
        >>>>>>> > as to tell me how to fix this problem?
        >>>>>>> >
        >>>>>>> > Thanks.
        >>>>>>>
        >>>>>>
        >>>>>
        >>>>
        >>>
        >>
        >


