Hi Durga,

Can you share the errors you get when you give the LOAD the way I specified
above? Also, can you confirm that the Phoenix table is ndm_17.table1 and not
NDM_17.TABLE1?
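
For reference, Phoenix upper-cases unquoted identifiers at CREATE time, so if
the table was created without quoted names it will be stored as NDM_17.TABLE1,
and referencing it that way in the LOAD may be worth a try. Just a sketch, with
hypothetical ZooKeeper hosts in place of 'regionserver':

s = load 'hbase://table/NDM_17.TABLE1/' using
org.apache.phoenix.pig.PhoenixHBaseLoader('zk-host1,zk-host2,zk-host3:2181');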

Regards
Ravi

On Thu, Jul 9, 2015 at 10:09 AM, Ns G <nsgns...@gmail.com> wrote:

> Hi Ravi Kiran,
>
> Thanks for your response. Yes, I have already tried the way you suggested
> and still got a failure message.
>
> We have 3 ZooKeeper nodes, and I provided one of them in place of
> 'regionserver' in the above example; the null value is what I got.
>
> Is there any other way I can test?
>
> PS: I am able to load data using the same jar and quorum.
>
> Thanks,
>  On 09-Jul-2015 10:27 pm, "Ravi Kiran" <maghamraviki...@gmail.com> wrote:
>
>> Hi Durga,
>>    Please refer to the table name as below in the LOAD:
>>
>> s = load 'hbase://table/ndm_17.table1/' using
>> org.apache.phoenix.pig.PhoenixHBaseLoader('regionserver');
>>
>> I am also hoping the 'regionserver' mentioned above is the ZooKeeper
>> quorum.
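>>
>> For example (just a sketch with hypothetical hostnames), the full quorum can
>> be passed as a comma-separated list, optionally with the client port:
>>
>> s = load 'hbase://table/ndm_17.table1/' using
>> org.apache.phoenix.pig.PhoenixHBaseLoader('zk-host1,zk-host2,zk-host3:2181');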
>>
>> Regards
>> Ravi
>>
>> On Thu, Jul 9, 2015 at 6:35 AM, Ns G <nsgns...@gmail.com> wrote:
>>
>>> Hi there,
>>>
>>> I am trying to read a Phoenix table through Pig, but I am facing an error
>>> and am unable to read it. We are using the Cloudera version of Phoenix (4.3.1).
>>>
>>> I have given a command like:
>>> register /home/satya/phoenix-4.3.1-client.jar;
>>>
>>> and then the Pig command:
>>>
>>> s = load 'hbase://ndm_17.table1/' using
>>> org.apache.phoenix.pig.PhoenixHBaseLoader('regionserver');
>>>
>>> Below is the error I am getting:
>>>
>>>  ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1200: null
>>>
>>> ERROR 1200: null
>>>
>>> Failed to parse: null
>>>     at
>>> org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:198)
>>>     at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1660)
>>>     at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1633)
>>>     at org.apache.pig.PigServer.registerQuery(PigServer.java:587)
>>>     at
>>> org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1093)
>>>     at
>>> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:501)
>>>     at
>>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
>>>     at
>>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
>>>     at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
>>>     at org.apache.pig.Main.run(Main.java:541)
>>>     at org.apache.pig.Main.main(Main.java:156)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>>> Caused by: java.lang.NullPointerException
>>>     at
>>> org.apache.phoenix.pig.PhoenixHBaseLoader.initializePhoenixPigConfiguration(PhoenixHBaseLoader.java:146)
>>>     at
>>> org.apache.phoenix.pig.PhoenixHBaseLoader.getSchema(PhoenixHBaseLoader.java:229)
>>>     at
>>> org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:175)
>>>     at
>>> org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:89)
>>>     at
>>> org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:853)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3568)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1625)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1102)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:560)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
>>>     at
>>> org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:188)
>>>     ... 16 more
>>>
>>>
>>>
>>>
>>> Sometimes I get this error (not sure when):
>>>
>>> Pig Stack Trace
>>> ---------------
>>> ERROR 2245: Cannot get schema from loadFunc
>>> org.apache.phoenix.pig.PhoenixHBaseLoader
>>>
>>> Failed to parse: Can not retrieve schema from loader
>>> org.apache.phoenix.pig.PhoenixHBaseLoader@69dfe150
>>>     at
>>> org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:198)
>>>     at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1660)
>>>     at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1633)
>>>     at org.apache.pig.PigServer.registerQuery(PigServer.java:587)
>>>     at
>>> org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1093)
>>>     at
>>> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:501)
>>>     at
>>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
>>>     at
>>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
>>>     at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
>>>     at org.apache.pig.Main.run(Main.java:541)
>>>     at org.apache.pig.Main.main(Main.java:156)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>>> Caused by: java.lang.RuntimeException: Can not retrieve schema from
>>> loader org.apache.phoenix.pig.PhoenixHBaseLoader@69dfe150
>>>     at
>>> org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:91)
>>>     at
>>> org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:853)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3568)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1625)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1102)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:560)
>>>     at
>>> org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
>>>     at
>>> org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:188)
>>>     ... 16 more
>>> Caused by: org.apache.pig.impl.logicalLayer.FrontendException: ERROR
>>> 2245: Cannot get schema from loadFunc
>>> org.apache.phoenix.pig.PhoenixHBaseLoader
>>>     at
>>> org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:179)
>>>     at
>>> org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:89)
>>>     ... 23 more
>>> Caused by: java.io.IOException: java.sql.SQLException: ERROR 103
>>> (08004): Unable to establish connection.
>>>     at
>>> org.apache.phoenix.pig.util.PhoenixPigSchemaUtil.getResourceSchema(PhoenixPigSchemaUtil.java:77)
>>>     at
>>> org.apache.phoenix.pig.PhoenixHBaseLoader.getSchema(PhoenixHBaseLoader.java:230)
>>>     at
>>> org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:175)
>>>     ... 24 more
>>> Caused by: java.sql.SQLException: ERROR 103 (08004): Unable to establish
>>> connection.
>>>     at
>>> org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:362)
>>>     at
>>> org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
>>>     at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:282)
>>>     at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:166)
>>>     at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$11.call(ConnectionQueryServicesImpl.java:1831)
>>>     at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$11.call(ConnectionQueryServicesImpl.java:1810)
>>>     at
>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
>>>     at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1810)
>>>     at
>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:162)
>>>     at
>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:126)
>>>     at
>>> org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:133)
>>>     at java.sql.DriverManager.getConnection(DriverManager.java:571)
>>>     at java.sql.DriverManager.getConnection(DriverManager.java:187)
>>>     at
>>> org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:54)
>>>     at
>>> org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:227)
>>>     at
>>> org.apache.phoenix.pig.util.PhoenixPigSchemaUtil.getResourceSchema(PhoenixPigSchemaUtil.java:62)
>>>     ... 26 more
>>> Caused by: java.io.IOException:
>>> java.lang.reflect.InvocationTargetException
>>>     at
>>> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
>>>     at
>>> org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:414)
>>>     at
>>> org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:323)
>>>     at
>>> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
>>>     at
>>> org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
>>>     at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:280)
>>>     ... 39 more
>>> Caused by: java.lang.reflect.InvocationTargetException
>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>> Method)
>>>     at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>     at
>>> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
>>>     ... 44 more
>>> Caused by: java.lang.UnsupportedOperationException: Unable to find
>>> org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
>>>     at
>>> org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
>>>     at
>>> org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58)
>>>     at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2317)
>>>     at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:688)
>>>     at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
>>>     ... 49 more
>>> Caused by: java.lang.ClassNotFoundException:
>>> org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>     at java.lang.Class.forName0(Native Method)
>>>     at java.lang.Class.forName(Class.java:190)
>>>     at
>>> org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
>>>     ... 53 more
>>>
>>>
>>>
>>> Thanks a lot in advance,
>>> Durga Prasad
>>>
>>
>>
