Is the error thrown as described below caught by the job client and translated into a user-friendly message?

On Dec 8, 2006, at 4:22 AM, Sanjay Dahiya (JIRA) wrote:

     [ http://issues.apache.org/jira/browse/HADOOP-476?page=all ]

Sanjay Dahiya updated HADOOP-476:
---------------------------------

    Attachment: Hadoop-476.patch

Updated patch: it no longer checks for -mapper/-reducer/-combiner on the local disk of the submit node. It runs chmod 0777 on the -files arguments, and if a file is still missing or not readable/writable after that, it throws an error.
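A minimal sketch of the validation the patch describes, in plain Java: loosen the permissions on each -files entry, then fail fast if the file is still missing or not readable/writable. (The helper name `validateFile` is hypothetical, and the actual patch shells out to chmod; the `java.io.File` permission setters stand in for it here.)

```java
import java.io.File;
import java.io.IOException;

public class StreamingFileCheck {

    // Hypothetical helper: emulate `chmod 0777 <path>`, then verify the
    // file exists and is readable/writable, throwing otherwise.
    static void validateFile(String path) throws IOException {
        File f = new File(path);
        // Best-effort equivalent of chmod 0777 (second arg false = all users)
        f.setReadable(true, false);
        f.setWritable(true, false);
        f.setExecutable(true, false);
        if (!f.exists() || !f.canRead() || !f.canWrite()) {
            throw new IOException("File " + path
                + " does not exist or lacks read/write permissions");
        }
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("streaming", ".sh");
        tmp.deleteOnExit();
        validateFile(tmp.getPath());  // passes for a fresh temp file
        System.out.println("ok");
    }
}
```

Checking permissions on the submit node only catches local misconfiguration, which is presumably why the earlier local-disk check for -mapper/-reducer/-combiner was dropped: those may resolve only on the task nodes.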


Streaming should check for correctness of the task
--------------------------------------------------

                Key: HADOOP-476
                URL: http://issues.apache.org/jira/browse/HADOOP-476
            Project: Hadoop
         Issue Type: Bug
         Components: contrib/streaming
           Reporter: arkady borkovsky
        Assigned To: Sanjay Dahiya
        Attachments: Hadoop-476.patch, Hadoop-476.patch


Currently, if anything is wrong with a streaming job, it dies without any explanation. Before creating and running the actual MapReduce job, Streaming should check that:
-- the executables (or scripts) for -mapper and -reducer are available and have the right permissions
-- the input fragments exist

--
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira
