It should work. It seems the metastore client is still trying to connect to a
metastore server, but it shouldn't. The logic in HiveMetaStoreClient is roughly:

if (conf.getVar("hive.metastore.uris") == null) {
  // embedded (local) metastore, no Thrift connection is made
} else {
  // remote metastore
  ...
  open();   // connect to the Thrift metastore server
}

It must be picking up a "hive.metastore.uris" value from some other config file.
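
One quick way to confirm that is to print which hive-site.xml the client loads
and the effective value it resolves for hive.metastore.uris. A minimal sketch,
assuming HiveConf (Hive 1.x/2.x) is on the same classpath the Pig/HCatalog
launcher uses; the class name is only for illustration:

import org.apache.hadoop.hive.conf.HiveConf;

public class CheckMetastoreUris {
  public static void main(String[] args) {
    HiveConf conf = new HiveConf();
    // Which hive-site.xml (if any) was picked up from the classpath
    System.out.println("hive-site.xml: " + HiveConf.getHiveSiteLocation());
    // An empty value means the embedded metastore branch should be taken;
    // a thrift:// URI means open() will be called against that server.
    System.out.println("hive.metastore.uris = "
        + conf.getVar(HiveConf.ConfVars.METASTOREURIS));
  }
}

If that prints a thrift:// URI even though your hive-site.xml does not set it,
some other hive-site.xml on Pig's classpath is supplying it.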

Thanks,
Daniel

On 7/12/16, 3:52 AM, "Oleksiy S" <osayankin.superu...@gmail.com> wrote:

>Hi all!
>
>Is it possible to use Pig and Hive with an embedded metastore (no
>hive.metastore.uris in hive-site.xml)? These are my steps:
>
>1. Configure Hive
>
><configuration>
>
><!-- MYSQL -->
>
><property>
>  <name>javax.jdo.option.ConnectionURL</name>
>
><value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
>  <description>the URL of the MySQL database</description>
></property>
>
><property>
>  <name>javax.jdo.option.ConnectionDriverName</name>
>  <value>com.mysql.jdbc.Driver</value>
></property>
>
><property>
>  <name>javax.jdo.option.ConnectionUserName</name>
>  <value>hive</value>
></property>
>
><property>
>  <name>javax.jdo.option.ConnectionPassword</name>
>  <value>secret</value>
></property>
>
><property>
>  <name>datanucleus.autoCreateSchema</name>
>  <value>false</value>
></property>
>
><property>
>  <name>datanucleus.fixedDatastore</name>
>  <value>true</value>
></property>
>
></configuration>
>
>2. Enter the Hive shell and create a new table:
>
>hive> CREATE TABLE bug (id INT);
>
>3. Start Pig
>
>pig -useHCatalog
>
>
>4. Load data from Hive
>
>grunt> A = load 'bug' USING org.apache.hive.hcatalog.pig.HCatLoader();
>
>
>exception:
>
>ERROR 2245: Cannot get schema from loadFunc
>org.apache.hive.hcatalog.pig.HCatLoader
>
>Failed to parse: Can not retrieve schema from loader
>org.apache.hive.hcatalog.pig.HCatLoader@27755284
>    at
>org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:201)
>    at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1707)
>    at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1680)
>    at org.apache.pig.PigServer.registerQuery(PigServer.java:623)
>    at
>org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1082)
>    at
>org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:505)
>    at
>org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
>    at
>org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
>    at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
>    at org.apache.pig.Main.run(Main.java:566)
>    at org.apache.pig.Main.main(Main.java:177)
>    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    at
>sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>    at
>sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    at java.lang.reflect.Method.invoke(Method.java:606)
>    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>Caused by: java.lang.RuntimeException: Can not retrieve schema from loader
>org.apache.hive.hcatalog.pig.HCatLoader@27755284
>    at
>org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:91)
>    at
>org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:901)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3568)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1625)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1102)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:560)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
>    at
>org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:191)
>    ... 16 more
>Caused by: org.apache.pig.impl.logicalLayer.FrontendException: ERROR 2245:
>Cannot get schema from loadFunc org.apache.hive.hcatalog.pig.HCatLoader
>    at
>org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:179)
>    at
>org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:89)
>    ... 23 more
>Caused by: java.io.IOException: java.lang.Exception: Could not instantiate
>a HiveMetaStoreClient connecting to server uri:[null]
>    at
>org.apache.hive.hcatalog.pig.PigHCatUtil.getTable(PigHCatUtil.java:211)
>    at
>org.apache.hive.hcatalog.pig.HCatLoader.getSchema(HCatLoader.java:217)
>    at
>org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:175)
>    ... 24 more
>Caused by: java.lang.Exception: Could not instantiate a HiveMetaStoreClient
>connecting to server uri:[null]
>    at
>org.apache.hive.hcatalog.pig.PigHCatUtil.getHiveMetaClient(PigHCatUtil.java:166)
>    at
>org.apache.hive.hcatalog.pig.PigHCatUtil.getTable(PigHCatUtil.java:206)
>    ... 26 more
>Caused by: com.google.common.util.concurrent.UncheckedExecutionException:
>java.lang.RuntimeException: Unable to instantiate
>org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2258)
>    at com.google.common.cache.LocalCache.get(LocalCache.java:3990)
>    at
>com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4793)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:202)
>    at
>org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
>    at
>org.apache.hive.hcatalog.pig.PigHCatUtil.getHiveMetaClient(PigHCatUtil.java:164)
>    ... 27 more
>Caused by: java.lang.RuntimeException: Unable to instantiate
>org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>    at
>org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
>    at
>org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
>    at
>org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
>    at
>org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:230)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227)
>    at
>com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4796)
>    at
>com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3589)
>    at
>com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2374)
>    at
>com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2337)
>    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2252)
>    ... 33 more
>Caused by: java.lang.reflect.InvocationTargetException
>    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>    at
>sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>    at
>sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>    at
>org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
>    ... 43 more
>Caused by: MetaException(message:Could not connect to meta store using any
>of the URIs provided. Most recent failure:
>org.apache.thrift.transport.TTransportException: java.net.ConnectException:
>Connection refused
>    at org.apache.thrift.transport.TSocket.open(TSocket.java:187)
>    at
>org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
>    at
>org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
>    at
>org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:181)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:330)
>    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>    at
>sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>    at
>sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>    at
>org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
>    at
>org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
>    at
>org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
>    at
>org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:230)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227)
>    at
>com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4796)
>    at
>com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3589)
>    at
>com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2374)
>    at
>com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2337)
>    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2252)
>    at com.google.common.cache.LocalCache.get(LocalCache.java:3990)
>    at
>com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4793)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:202)
>    at
>org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
>    at
>org.apache.hive.hcatalog.pig.PigHCatUtil.getHiveMetaClient(PigHCatUtil.java:164)
>    at
>org.apache.hive.hcatalog.pig.PigHCatUtil.getTable(PigHCatUtil.java:206)
>    at
>org.apache.hive.hcatalog.pig.HCatLoader.getSchema(HCatLoader.java:217)
>    at
>org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:175)
>    at
>org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:89)
>    at
>org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:901)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3568)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1625)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1102)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:560)
>    at
>org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
>    at
>org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:191)
>    at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1707)
>    at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1680)
>    at org.apache.pig.PigServer.registerQuery(PigServer.java:623)
>    at
>org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1082)
>    at
>org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:505)
>    at
>org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
>    at
>org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
>    at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
>    at org.apache.pig.Main.run(Main.java:566)
>    at org.apache.pig.Main.main(Main.java:177)
>    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    at
>sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>    at
>sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    at java.lang.reflect.Method.invoke(Method.java:606)
>    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>Caused by: java.net.ConnectException: Connection refused
>    at java.net.PlainSocketImpl.socketConnect(Native Method)
>    at
>java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
>    at
>java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
>    at
>java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
>    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
>    at java.net.Socket.connect(Socket.java:579)
>    at org.apache.thrift.transport.TSocket.open(TSocket.java:182)
>    ... 52 more
>)
>    at
>org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:466)
>    at
>org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
>    at
>org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:181)
>    at
>org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:330)
>    ... 48 more
>================================================================================
>
>PS: when I set
>
><property>
>  <name>hive.metastore.uris</name>
>  <value>thrift://<host>:9083</value>
>  <description>IP address (or fully-qualified domain name) and port of the
>metastore host</description>
></property>
>
>it works fine, but the customer does not want that.
>
>
>
>-- 
>Oleksiy
