My Eclipse and Pig are on the same Linux machine.
This is my Pig configuration in Eclipse:
 props.setProperty("fs.defaultFS", "hdfs://10.210.90.101:8020");
     props.setProperty("hadoop.job.user", "hadoop");
     props.setProperty("mapreduce.framework.name", "yarn");
     props.setProperty("yarn.resourcemanager.hostname", "10.210.90.101");
     props.setProperty("yarn.resourcemanager.admin.address", 
"10.210.90.101:8141");
        props.setProperty("yarn.resourcemanager.address", "10.210.90.101:8050");
     props.setProperty("yarn.resourcemanager.resource-tracker.address", 
"10.210.90.101:8025");
     props.setProperty("yarn.resourcemanager.scheduler.address", 
"10.210.90.101:8030");
I have added core-site.xml, yarn-site.xml, etc. to the Eclipse project.
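Since those XML files are already on the project classpath, the same Properties could presumably also be built from them instead of being hard-coded. A rough sketch (untested; it assumes the standard Hadoop resource names and Pig's ConfigurationUtil helper):

import java.util.Properties;

import org.apache.hadoop.conf.Configuration;
import org.apache.pig.backend.hadoop.datastorage.ConfigurationUtil;

// Sketch: read the cluster settings from the *-site.xml files on the
// classpath instead of hard-coding every address. Resource names are the
// standard Hadoop ones and may need adjusting for a particular project.
Configuration conf = new Configuration();   // loads core-default.xml / core-site.xml
conf.addResource("hdfs-site.xml");
conf.addResource("yarn-site.xml");
conf.addResource("mapred-site.xml");
Properties clusterProps = ConfigurationUtil.toProperties(conf);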
I can run the Pig script from the grunt> shell, but when I run the following:
PigServer pigServer = new PigServer(ExecType.MAPREDUCE, props);
// Load the input file, sort it by the first field, and store the result back to HDFS
pigServer.registerQuery("tmp = LOAD '/user/hadoop/aa.txt';");
pigServer.registerQuery("tmp_table_limit = ORDER tmp BY $0;");
pigServer.store("tmp_table_limit", "/user/hadoop/shi.txt");
I always get this error:
14/12/30 17:28:33 WARN hadoop20.PigJobControl: falling back to default JobControl (not using hadoop 0.20 ?)
java.lang.NoSuchFieldException: runnerState
 at java.lang.Class.getDeclaredField(Class.java:1948)
 at org.apache.pig.backend.hadoop20.PigJobControl.<clinit>(PigJobControl.java:51)
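For reference, the complete program boils down to roughly the following. This is a minimal self-contained sketch combining the two snippets above; the wrapper class name and main() method are only placeholders added for completeness:

import java.util.Properties;

import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

// Minimal sketch of the failing run; the class/main() wrapper is a placeholder.
public class PigServerTest {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("fs.defaultFS", "hdfs://10.210.90.101:8020");
        props.setProperty("mapreduce.framework.name", "yarn");
        props.setProperty("yarn.resourcemanager.hostname", "10.210.90.101");
        props.setProperty("yarn.resourcemanager.address", "10.210.90.101:8050");
        props.setProperty("yarn.resourcemanager.scheduler.address", "10.210.90.101:8030");
        props.setProperty("yarn.resourcemanager.resource-tracker.address", "10.210.90.101:8025");

        PigServer pigServer = new PigServer(ExecType.MAPREDUCE, props);
        try {
            // Load the input file, sort it by the first field, and store the result
            pigServer.registerQuery("tmp = LOAD '/user/hadoop/aa.txt';");
            pigServer.registerQuery("tmp_table_limit = ORDER tmp BY $0;");
            pigServer.store("tmp_table_limit", "/user/hadoop/shi.txt");
        } finally {
            pigServer.shutdown();
        }
    }
}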
Please help!
