I have done it. Thank you very much, Daniel.
pigServer.registerJar("/root/workspace/hive-hcatalog-core-0.12.0-cdh5.0.2.jar");
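For reference, the same fix generalizes to the full jar list from Daniel's earlier reply. Below is a sketch of the register statements collected in one Pig script; the /usr/lib/hive/lib prefix is a hypothetical example of where a distribution keeps the Hive jars, so substitute the paths from your own installation:

```
register /root/workspace/hive-hcatalog-core-0.12.0-cdh5.0.2.jar;
register /usr/lib/hive/lib/hive-metastore.jar;   -- hypothetical path
register /usr/lib/hive/lib/libthrift.jar;        -- hypothetical path
register /usr/lib/hive/lib/hive-core.jar;        -- hypothetical path
register /usr/lib/hive/lib/libfb303.jar;         -- hypothetical path
register /usr/lib/hive/lib/slf4j-api.jar;        -- hypothetical path
-- also register the jdo*-api.jar found in the same directory,
-- using the jar's actual file name
```

Equivalently, from Java each of these paths can be passed to pigServer.registerJar(...) in a loop.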

At 2015-03-05 01:12:07, "Daniel Dai" <da...@hortonworks.com> wrote:
>The easiest way is to use a “register” statement in Pig:
>
>register /localpathto/hcatalog-core.jar
>……
>
>Daniel
>
>On 3/4/15, 1:18 AM, "李运田" <cumt...@163.com> wrote:
>
>>https://issues.apache.org/jira/browse/PIG-2532
>>This error is the same as mine, but I don't know how to add the needed
>>jars to Eclipse in a script.
>>can you help me ?
>>
>> 
>>At 2015-03-04 02:34:05, "Daniel Dai" <da...@hortonworks.com> wrote:
>>>HCatSchema is in hcatalog-core.jar and it is missing in the backend. You
>>>will need to register several jars in pig script:
>>>
>>>hcatalog-core.jar
>>>
>>>hive-metastore.jar
>>>libthrift.jar
>>>hive-core.jar
>>>libfb303.jar
>>>jdo*-api.jar
>>>slf4j-api.jar
>>>
>>>When you run from the command line, the bin/pig script does that for
>>>you. But in Eclipse, you will need to do it yourself.
>>>
>>>Daniel
>>>
>>>On 3/3/15, 4:40 AM, "李运田" <cumt...@163.com> wrote:
>>>
>>>>2015-03-03 19:54:43,072 INFO [main]
>>>>org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for
>>>>application appattempt_1421806758143_0252_000001
>>>>2015-03-03 19:54:43,623 WARN [main]
>>>>org.apache.hadoop.conf.Configuration:
>>>>job.xml:an attempt to override final parameter:
>>>>mapreduce.job.end-notification.max.retry.interval;  Ignoring.
>>>>2015-03-03 19:54:43,650 WARN [main]
>>>>org.apache.hadoop.conf.Configuration:
>>>>job.xml:an attempt to override final parameter:
>>>>mapreduce.job.end-notification.max.attempts;  Ignoring.
>>>>2015-03-03 19:54:43,778 WARN [main]
>>>>org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop
>>>>library for your platform... using builtin-java classes where applicable
>>>>2015-03-03 19:54:43,798 INFO [main]
>>>>org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with tokens:
>>>>2015-03-03 19:54:43,798 INFO [main]
>>>>org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: YARN_AM_RM_TOKEN,
>>>>Service: , Ident:
>>>>(org.apache.hadoop.yarn.security.AMRMTokenIdentifier@307c9a57)
>>>>2015-03-03 19:54:43,833 INFO [main]
>>>>org.apache.hadoop.mapreduce.v2.app.MRAppMaster: The specific max
>>>>attempts: 2 for application: 252. Attempt num: 1 is last retry: false
>>>>2015-03-03 19:54:43,841 INFO [main]
>>>>org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred
>>>>newApiCommitter.
>>>>2015-03-03 19:54:43,985 WARN [main]
>>>>org.apache.hadoop.conf.Configuration:
>>>>job.xml:an attempt to override final parameter:
>>>>mapreduce.job.end-notification.max.retry.interval;  Ignoring.
>>>>2015-03-03 19:54:43,999 WARN [main]
>>>>org.apache.hadoop.conf.Configuration:
>>>>job.xml:an attempt to override final parameter:
>>>>mapreduce.job.end-notification.max.attempts;  Ignoring.
>>>>2015-03-03 19:54:44,535 WARN [main]
>>>>org.apache.hadoop.hdfs.BlockReaderLocal: The short-circuit local reads
>>>>feature cannot be used because libhadoop cannot be loaded.
>>>>2015-03-03 19:54:44,719 INFO [main]
>>>>org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in
>>>>config null
>>>>2015-03-03 19:54:44,844 INFO [main]
>>>>org.apache.hadoop.service.AbstractService: Service
>>>>org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED;
>>>>cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException:
>>>>java.io.IOException: Deserialization error:
>>>>org.apache.hcatalog.data.schema.HCatSchema
>>>>org.apache.hadoop.yarn.exceptions.YarnRuntimeException:
>>>>java.io.IOException: Deserialization error:
>>>>org.apache.hcatalog.data.schema.HCatSchema
>>>>    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:473)
>>>>    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:374)
>>>>    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>>>>    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.run(MRAppMaster.java:1456)
>>>>    at java.security.AccessController.doPrivileged(Native Method)
>>>>    at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>>>>    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1453)
>>>>    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1386)
>>>>Caused by: java.io.IOException: Deserialization error:
>>>>org.apache.hcatalog.data.schema.HCatSchema
>>>>    at org.apache.pig.impl.util.ObjectSerializer.deserialize(ObjectSerializer.java:59)
>>>>    at org.apache.pig.impl.util.UDFContext.deserialize(UDFContext.java:192)
>>>>    at org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil.setupUDFContext(MapRedUtil.java:173)
>>>>    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:229)
>>>>    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getOutputCommitter(PigOutputFormat.java:275)
>>>>    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:471)
>>>>    ... 8 more
>>>>Caused by: java.lang.ClassNotFoundException:
>>>>org.apache.hcatalog.data.schema.HCatSchema
>>>>    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>>>    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>>    at java.security.AccessController.doPrivileged(Native Method)
>>>>    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>>    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>>    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>>>    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>>    at java.lang.Class.forName0(Native Method)
>>>>
>>>>On 2015-02-28 09:37:29, "李运田" <cumt...@163.com> wrote:
>>>>
>>>>pigServer.registerQuery("tmp = load 'pig' using
>>>>org.apache.hcatalog.pig.HCatLoader();");
>>>>pigServer.registerQuery("tmp = foreach tmp generate id;");
>>>>When I execute this, I don't get any error.
>>>>But when I execute
>>>>pigServer.registerQuery("store tmp into 'hive' using
>>>>org.apache.hcatalog.pig.HCatStorer();");
>>>>I get an error like this:
>>>>org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable
>>>>to open iterator for alias tmp
>>>> at org.apache.pig.PigServer.openIterator(PigServer.java:880)
>>>> at org.gradle.PigHiveHCat.run(PigHiveHCat.java:68)
>>>> at org.gradle.PigHiveHCat.main(PigHiveHCat.java:28)
>>>>Caused by: java.io.IOException: Job terminated with anomalous status
>>>>FAILED
>>>> at org.apache.pig.PigServer.openIterator(PigServer.java:872)
>>>> ... 2 more
>>>>........................................................................
>>>>..
>>>>.................................
>>>>pigServer.registerQuery("a = LOAD '/user/hadoop/pig.txt' ;");
>>>>pigServer.store("a", "/user/hadoop/pig1.txt");
>>>>pigServer.registerQuery("store a into '/user/hadoop/pig2.txt';");
>>>>This is OK, so I think something about HCatalog is wrong.
>>>>
>>>>
>>>>At 2015-02-28 01:21:12, "Alan Gates" <alanfga...@gmail.com> wrote:
>>>>What error message are you getting?
>>>>
>>>>Alan.
>>>>
>>>>
>>>>李运田
>>>>February 26, 2015 at 18:58
>>>>I want to use HCatalog in Eclipse to work with tables in Hive,
>>>>but I can't store a table into Hive:
>>>>pigServer.registerQuery("tmp = load 'pig' using
>>>>org.apache.hcatalog.pig.HCatLoader();");
>>>>pigServer.registerQuery("tmp = foreach tmp generate id;");
>>>>pigServer.registerQuery("store tmp into 'hive' using
>>>>org.apache.hcatalog.pig.HCatStorer();");
>>>>I can store into a file:
>>>>pigServer.registerQuery("a = LOAD '/user/hadoop/pig.txt' ;");
>>>>pigServer.store("a", "/user/hadoop/pig1.txt");
>>>>pigServer.registerQuery("store a into '/user/hadoop/pig2.txt';");
>>>>Perhaps the HCatalog jars are wrong?
>>>>
>>>>
>>>
>
