[
https://issues.apache.org/jira/browse/HADOOP-3650?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Ramya R resolved HADOOP-3650.
-----------------------------
Resolution: Cannot Reproduce
This issue cannot be reproduced; hence closing it.
> Unit test TestMiniMRDFSSort.testMapReduceSort fails on windows with a timeout
> -----------------------------------------------------------------------------
>
> Key: HADOOP-3650
> URL: https://issues.apache.org/jira/browse/HADOOP-3650
> Project: Hadoop Core
> Issue Type: Bug
> Components: mapred
> Affects Versions: 0.19.0
> Environment: windows
> Reporter: Mukund Madhugiri
> Priority: Critical
>
> Unit test TestMiniMRDFSSort.testMapReduceSort fails on windows with a timeout
> I see this on the console:
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:29:19,435 WARN
> mapred.ReduceTask (ReduceTask.java:run(928)) -
> attempt_200806252017_0003_r_000000_0 copy failed:
> attempt_200806252017_0003_m_000002_0 from localhost
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:29:19,435
> WARN mapred.ReduceTask (ReduceTask.java:run(930)) -
> java.net.SocketTimeoutException: Read timed out
> [junit] attempt_200806252017_0003_r_000000_0: at
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> [junit] attempt_200806252017_0003_r_000000_0: at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> [junit] attempt_200806252017_0003_r_000000_0: at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> [junit] attempt_200806252017_0003_r_000000_0: at
> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> [junit] attempt_200806252017_0003_r_000000_0: at
> sun.net.www.protocol.http.HttpURLConnection$6.run(HttpURLConnection.java:1225)
> [junit] attempt_200806252017_0003_r_000000_0: at
> java.security.AccessController.doPrivileged(Native Method)
> [junit] attempt_200806252017_0003_r_000000_0: at
> sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1219)
> [junit] attempt_200806252017_0003_r_000000_0: at
> sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:906)
> [junit] attempt_200806252017_0003_r_000000_0: at
> org.apache.hadoop.mapred.ReduceTask$ReduceCopier$MapOutputCopier.getInputStream(ReduceTask.java:1217)
> [junit] attempt_200806252017_0003_r_000000_0: at
> org.apache.hadoop.mapred.ReduceTask$ReduceCopier$MapOutputCopier.getMapOutput(ReduceTask.java:1067)
> [junit] attempt_200806252017_0003_r_000000_0: at
> org.apache.hadoop.mapred.ReduceTask$ReduceCopier$MapOutputCopier.copyOutput(ReduceTask.java:976)
> [junit] attempt_200806252017_0003_r_000000_0: at
> org.apache.hadoop.mapred.ReduceTask$ReduceCopier$MapOutputCopier.run(ReduceTask.java:925)
> [junit] attempt_200806252017_0003_r_000000_0: Caused by:
> java.net.SocketTimeoutException: Read timed out
> [junit] attempt_200806252017_0003_r_000000_0: at
> java.net.SocketInputStream.socketRead0(Native Method)
> [junit] attempt_200806252017_0003_r_000000_0: at
> java.net.SocketInputStream.read(SocketInputStream.java:129)
> [junit] attempt_200806252017_0003_r_000000_0: at
> java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
> [junit] attempt_200806252017_0003_r_000000_0: at
> java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
> [junit] attempt_200806252017_0003_r_000000_0: at
> java.io.BufferedInputStream.read(BufferedInputStream.java:313)
> [junit] attempt_200806252017_0003_r_000000_0: at
> sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:681)
> [junit] attempt_200806252017_0003_r_000000_0: at
> sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:626)
> [junit] attempt_200806252017_0003_r_000000_0: at
> sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:957)
> [junit] attempt_200806252017_0003_r_000000_0: ... 4 more
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:29:20,154
> INFO mapred.ReduceTask (ReduceTask.java:fetchOutputs(1564)) - Task
> attempt_200806252017_0003_r_000000_0: Failed fetch #2 from
> attempt_200806252017_0003_m_000002_0
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:29:20,154
> INFO mapred.ReduceTask (ReduceTask.java:fetchOutputs(1575)) - Failed to
> fetch map-output from attempt_200806252017_0003_m_000002_0 even after
> MAX_FETCH_RETRIES_PER_MAP retries... reporting to the JobTracker
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:29:20,154
> WARN mapred.ReduceTask (ReduceTask.java:fetchOutputs(1636)) -
> attempt_200806252017_0003_r_000000_0 adding host localhost to penalty box,
> next contact in 8 seconds
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:29:21,201
> INFO mapred.ReduceTask (ReduceTask.java:fetchOutputs(1427)) -
> attempt_200806252017_0003_r_000000_0: Got 5 map-outputs from previous failures
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:29:31,216
> INFO mapred.ReduceTask (ReduceTask.java:fetchOutputs(1486)) -
> attempt_200806252017_0003_r_000000_0 Scheduled 1 of 5 known outputs (0 slow
> hosts and 4 dup hosts)
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:30:08,482
> INFO mapred.ReduceTask (ReduceTask.java:fetchOutputs(1390)) -
> attempt_200806252017_0003_r_000000_0 Need another 20 map output(s) where 1 is
> already in progress
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:30:09,451
> INFO mapred.ReduceTask (ReduceTask.java:fetchOutputs(1413)) -
> attempt_200806252017_0003_r_000000_0: Got 0 new map-outputs & number of known
> map outputs is 4
> [junit] attempt_200806252017_0003_r_000000_0: 2008-06-25 20:30:09,451
> INFO mapred.ReduceTask (ReduceTask.java:fetchOutputs(1486)) -
> attempt_200806252017_0003_r_000000_0 Scheduled 0 of 4 known outputs (0 slow
> hosts and 4 dup hosts)
> [junit] 2008-06-25 20:31:00,779 INFO mapred.JobClient
> (JobClient.java:runJob(1037)) - Task Id :
> attempt_200806252017_0003_m_000002_0, Status : KILLED
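> The log above traces a specific shuffle-phase pattern: each failed copy of a map output is retried, the source host is placed in a "penalty box" with a back-off delay ("next contact in 8 seconds"), and once MAX_FETCH_RETRIES_PER_MAP is exhausted the reducer reports the map task to the JobTracker. As a rough illustration of that control flow (a hypothetical sketch, not Hadoop's actual ReduceCopier code; the retry limit, base delay, and helper names below are assumptions):

```python
# Hypothetical sketch of the shuffle-fetch retry pattern seen in the log:
# a fetch that times out is retried, the host's penalty delay grows, and
# after MAX_FETCH_RETRIES_PER_MAP failures the reducer gives up on that
# map output and reports it. Constants are illustrative, not Hadoop's.

MAX_FETCH_RETRIES_PER_MAP = 3   # assumed retry limit for illustration
INITIAL_PENALTY_SECONDS = 4     # assumed base delay; the log shows 8s after doubling

def fetch_map_output(host, failures):
    """Simulate one HTTP fetch of a map output; times out on early tries."""
    if failures[host] < 2:            # pretend the first two copies time out,
        raise TimeoutError("Read timed out")  # like the SocketTimeoutException above
    return b"map output data"

def copy_with_retries(host):
    """Retry the fetch with a doubling penalty-box delay per failure."""
    failures = {host: 0}
    penalty = INITIAL_PENALTY_SECONDS
    for _attempt in range(MAX_FETCH_RETRIES_PER_MAP):
        try:
            return fetch_map_output(host, failures)
        except TimeoutError:
            failures[host] += 1
            penalty *= 2              # host sits in the penalty box longer each time
    # Exhausted retries: in Hadoop this is where the reducer would report
    # the failing map attempt to the JobTracker so the map can be re-run.
    return None
```

In the real test failure the fetch never succeeds within the junit timeout, which is why the reducer keeps rescheduling the same known output and the map attempt is eventually killed and re-run.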
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.