Hi Kiran,

Does hdfs://localhost:9000/user/hduser/pig/planet.osm exist on your HDFS?
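You can check that from the grunt shell itself, or with the Hadoop command
line. A quick sketch, assuming the NameNode at localhost:9000 from your
error message is the right one:

    grunt> fs -ls /user/hduser/pig/

or, outside Pig, with the Hadoop 1.x CLI:

    $ hadoop fs -ls hdfs://localhost:9000/user/hduser/pig/

If planet.osm does not show up in that listing, the LOAD has nothing to
read and ERROR 2118 is exactly what you will see.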
Tariq, Mohammad
about.me/mti <http://about.me/mti>

On Wed, Jun 22, 2016 at 1:08 PM, Kirandeep Kaur <kiranmatharo...@gmail.com>
wrote:

> Dear Sir
>
> I am working on Pig with Hadoop as a researcher. I have a problem: the
> following code is not able to read the input file from HDFS.
>
> grunt> x_nodes = LOAD 'hdfs://localhost:50075/user/hduser/pig/planet.osm'
>        USING org.apache.pig.piggybank.storage.XMLLoader('node')
>        AS (node:chararray);
> grunt> p_nodes = FOREACH x_nodes GENERATE OSMNode(node) as node;
> grunt> p_nodes = FILTER p_nodes BY
>        ST_Contains(ST_MakeBox(75.48,30.61,76.30,31.14),
>                    ST_MakePoint(node.lon, node.lat));
> grunt> STORE p_nodes INTO 'p_nodes';
>
> HadoopVersion  PigVersion  UserId  StartedAt            FinishedAt           Features
> 1.2.1          0.15.0      hduser  2016-06-22 12:38:13  2016-06-22 12:38:15  UNKNOWN
>
> Failed!
>
> Failed Jobs:
> JobId:   N/A
> Alias:   macro_LoadOSMNodes_osm_nodes_3,macro_LoadOSMNodes_xml_nodes_3,p_nodes
> Feature: MAP_ONLY
> Message: org.apache.pig.backend.executionengine.ExecException: ERROR 2118:
>          Input path does not exist: hdfs://localhost:9000/user/hduser/pig/planet.osm
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:279)
>     at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1054)
>     at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1071)
>     at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>     at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
>     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:910)
>     at org.apache.hadoop.mapred.jobcontrol.Job.submit(Job.java:378)
>     at org.apache.hadoop.mapred.jobcontrol.JobControl.startReadyJobs(JobControl.java:247)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.pig.backend.hadoop20.PigJobControl.mainLoopAction(PigJobControl.java:157)
>     at org.apache.pig.backend.hadoop20.PigJobControl.run(PigJobControl.java:134)
>     at java.lang.Thread.run(Thread.java:745)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
> Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException:
>          Input path does not exist: hdfs://localhost:9000/user/hduser/pig/planet.osm
>     at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:235)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigTextInputFormat.listStatus(PigTextInputFormat.java:36)
>     at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:252)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:265)
>     ... 20 more
> Outputs: hdfs://localhost:9000/user/hduser/p_nodes-out,
>
> Input(s):
> Failed to read data from "hdfs://localhost:9000/user/hduser/pig/planet.osm"
>
> Output(s):
> Failed to produce result in "hdfs://localhost:9000/user/hduser/p_nodes-out"
>
> Counters:
> Total records written : 0
> Total bytes written : 0
> Spillable Memory Manager spill count : 0
> Total bags proactively spilled: 0
> Total records proactively spilled: 0
>
> Job DAG: null
>
> 2016-06-22 12:38:15,105 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
>
> Please suggest where the wrong declaration is used, as soon as possible.
>
> Thanks
>
> Kiran
>
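If the file isn't there yet, copying it into HDFS before running the script
should clear ERROR 2118. A minimal sketch, assuming the OSM dump sits at
/home/hduser/planet.osm on your local disk (a hypothetical path, adjust it
to wherever your file actually is):

    $ hadoop fs -mkdir /user/hduser/pig
    $ hadoop fs -copyFromLocal /home/hduser/planet.osm /user/hduser/pig/planet.osm
    $ hadoop fs -ls /user/hduser/pig/planet.osm

Once the listing shows the file, the same LOAD ... USING XMLLoader('node')
statement should be able to find its input.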