What is your input data like?
On Apr 2, 2014 10:16 AM, "ei09072" <ei09...@fe.up.pt> wrote:

> After installing Hadoop 2.3.0 on windows 8, I tried to run the wordcount
> example given. However I get the following error:
>
>
> c:\hadoop>bin\yarn jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.3.0.jar wordcount /input output
> 14/03/26 14:20:48 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
> 14/03/26 14:20:50 INFO input.FileInputFormat: Total input paths to process : 1
> 14/03/26 14:20:51 INFO mapreduce.JobSubmitter: number of splits:1
> 14/03/26 14:20:51 INFO Configuration.deprecation: user.name is deprecated. Instead, use mapreduce.job.user.name
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapreduce.combine.class is deprecated. Instead, use mapreduce.job.combine.class
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapreduce.reduce.class is deprecated. Instead, use mapreduce.job.reduce.class
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
> 14/03/26 14:20:51 INFO Configuration.deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
> 14/03/26 14:20:51 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1395833928952_0004
> 14/03/26 14:20:52 INFO impl.YarnClientImpl: Submitted application application_1395833928952_0004 to ResourceManager at /0.0.0.0:8032
> 14/03/26 14:20:52 INFO mapreduce.Job: The url to track the job: http://teste:8088/proxy/application_1395833928952_0004/
> 14/03/26 14:20:52 INFO mapreduce.Job: Running job: job_1395833928952_0004
> 14/03/26 14:21:08 INFO mapreduce.Job: Job job_1395833928952_0004 running in uber mode : false
> 14/03/26 14:21:08 INFO mapreduce.Job:  map 0% reduce 0%
> 14/03/26 14:21:20 INFO mapreduce.Job: Task Id : attempt_1395833928952_0004_m_000000_0, Status : FAILED
> Error: java.lang.ClassCastException: org.apache.hadoop.mapreduce.lib.input.FileSplit cannot be cast to org.apache.hadoop.mapred.InputSplit
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:402)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> 14/03/26 14:21:33 INFO mapreduce.Job: Task Id : attempt_1395833928952_0004_m_000000_1, Status : FAILED
> Error: java.lang.ClassCastException: org.apache.hadoop.mapreduce.lib.input.FileSplit cannot be cast to org.apache.hadoop.mapred.InputSplit
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:402)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> 14/03/26 14:21:48 INFO mapreduce.Job: Task Id : attempt_1395833928952_0004_m_000000_2, Status : FAILED
> Error: java.lang.ClassCastException: org.apache.hadoop.mapreduce.lib.input.FileSplit cannot be cast to org.apache.hadoop.mapred.InputSplit
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:402)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> 14/03/26 14:22:04 INFO mapreduce.Job:  map 100% reduce 100%
> 14/03/26 14:22:10 INFO mapreduce.Job: Job job_1395833928952_0004 failed with state FAILED due to: Task failed task_1395833928952_0004_m_000000
> Job failed as tasks failed. failedMaps:1 failedReduces:0
> 14/03/26 14:22:10 INFO mapreduce.Job: Counters: 6
>         Job Counters
>                 Failed map tasks=4
>                 Launched map tasks=4
>                 Other local map tasks=3
>                 Data-local map tasks=1
>                 Total time spent by all maps in occupied slots (ms)=48786
>                 Total time spent by all reduces in occupied slots (ms)=0
>
> I then tried to run my own wordcount written against the current API, and the same error occurred. I also tried installing Hadoop 2.2.0, but the error persists, so I have no idea why this API error occurs. I have seen it mentioned in several places, but never with a solution, so I decided to ask here since I am out of ideas on how to solve it.
>
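For context on the exception itself: the trace goes through MapTask.runOldMapper, meaning the framework is running the task with the old org.apache.hadoop.mapred API, while the input split it receives is a new-API org.apache.hadoop.mapreduce.lib.input.FileSplit; the two split types are unrelated classes, so the cast fails. This mismatch typically appears when a job (or a setting picked up from mapred-site.xml) mixes classes from the two packages. A minimal driver sketch that stays entirely within the new API, in the style of the stock Hadoop 2.x WordCount example, looks like this:

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Every Hadoop type here comes from org.apache.hadoop.mapreduce.*;
// nothing is imported from the old org.apache.hadoop.mapred package,
// so the framework runs the job with the new-API code path.
public class WordCount {

  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      // Emit (token, 1) for every whitespace-separated token in the line.
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      // Sum the counts for each word.
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Since the bundled examples jar already uses the new API, a likely suspect is stale configuration rather than the example code itself; it may be worth checking mapred-site.xml and core-site.xml for leftover old-style properties (the deprecation warnings in the log show several old mapred.* keys being picked up) and confirming which API selection flags the submitted job config ends up with.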
