Re: query uses WITH blocks and throws exception if run as Oozie hive action (hive-0.13.1)

2015-05-17 Thread Harsh J
Your question should be directed to the u...@hive.apache.org lists.

On Sat, May 16, 2015 at 4:51 AM, Alexander Pivovarov
apivova...@gmail.com wrote:
 Looks like I found it
 https://issues.apache.org/jira/browse/HIVE-9409

 public class UDTFOperator
 ...

 -  protected final Log LOG = LogFactory.getLog(this.getClass().getName());
 +  protected static final Log LOG = LogFactory.getLog(UDTFOperator.class.getName());



 On Fri, May 15, 2015 at 4:17 PM, Alexander Pivovarov apivova...@gmail.com
 wrote:

 I also noticed another error message in the logs:

 10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
 Status: Failed
 10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
 Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
 diagnostics=[Vertex Input: dual initializer failed.,
 org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
 class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
 Serialization trace:
 LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
 childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
 childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
 aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]

 One of the WITH blocks used the explode() UDTF.
 I replaced it with select ... union all select ... union all select ...
 and the query works fine now.
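
 For illustration only (the real query isn't shown), a WITH block whose only job is
 to explode a small literal array can be rewritten without the UDTF. The sketch below
 uses hypothetical names (dates_cte, dt) and the one-row helper table dual that the
 Tez diagnostics mention; on 0.13 the UNION ALL also needs to sit inside a derived
 table:

   -- before (sketch): explode() in the CTE pulls UDTFOperator into the serialized plan
   -- with dates_cte as (
   --   select explode(array('2015-01-01', '2015-02-01', '2015-03-01')) as dt from dual
   -- )

   -- after (sketch): the same rows via union all, so no UDTF operator is planned
   with dates_cte as (
     select t.dt from (
       select '2015-01-01' as dt from dual
       union all
       select '2015-02-01' as dt from dual
       union all
       select '2015-03-01' as dt from dual
     ) t
   )
   select dt from dates_cte;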

 Do you know anything about UDTF and Kryo issues fixed after 0.13.1?


 On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov
 apivova...@gmail.com wrote:

 Looks like it was fixed in hive-0.14
 https://issues.apache.org/jira/browse/HIVE-7079

 On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov
 apivova...@gmail.com wrote:

 Hi Everyone

 I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace
 if I run my query (which has a WITH block) via Oozie. (BTW, the query works
 fine in the CLI.)

 I can't post the exact query, but the structure is similar to:

 create table my_consumer
 as
 with sacusaloan as (select distinct e,f,g from E)

 select A.a, A.b, A.c,
   if(sacusaloan.id is null, 0, 1) as sacusaloan_status
 from (select a,b,c from A) A
 left join sacusaloan on (...)
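
 If upgrading to a build with HIVE-7079 isn't an option, one possible workaround is
 to drop the WITH clause and inline it as a derived table, so sacusaloan is never
 looked up in the metastore. A sketch only, reusing the placeholder tables A and E
 above and a hypothetical join key (the real ON condition is elided):

   create table my_consumer as
   select A.a, A.b, A.c,
     -- placeholder null-check column; the original used sacusaloan.id
     if(sacusaloan.e is null, 0, 1) as sacusaloan_status
   from (select a, b, c from A) A
   left join (select distinct e, f, g from E) sacusaloan
     on (A.a = sacusaloan.e)  -- hypothetical join key
   ;

 Materializing the WITH block into a staging table first and joining to that works
 the same way.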

 8799 [main] INFO  hive.ql.parse.ParseDriver  - Parse Completed
 8799 [main] INFO  org.apache.hadoop.hive.ql.log.PerfLogger  - /PERFLOG
 method=parse start=1431723485500 end=1431723485602 duration=102
 from=org.apache.hadoop.hive.ql.Driver
 8799 [main] INFO  org.apache.hadoop.hive.ql.log.PerfLogger  - PERFLOG
 method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver
 8834 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  -
 Starting Semantic Analysis
 8837 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  -
 Creating table wk_qualified_outsource_loan_consumer position=13
 8861 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  -
 Completed phase 1 of Semantic Analysis
 8861 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  -
 Get metadata for source tables
 8865 [main] ERROR hive.ql.metadata.Hive  -
 NoSuchObjectException(message:default.sacusaloan table not found)
at
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
at
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
at
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
at
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
at
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
at
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at
 org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
at com.sun.proxy.$Proxy18.getTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
at
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
at
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
at
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
at
 org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)

Re: query uses WITH blocks and throws exception if run as Oozie hive action (hive-0.13.1)

2015-05-15 Thread Alexander Pivovarov
I also noticed another error message in the logs:

10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
Status: Failed
10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
diagnostics=[Vertex Input: dual initializer failed.,
org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
Serialization trace:
LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]

One of the WITH blocks used the explode() UDTF.
I replaced it with select ... union all select ... union all select ...
and the query works fine now.

Do you know anything about UDTF and Kryo issues fixed after 0.13.1?


On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov apivova...@gmail.com
wrote:

 Looks like it was fixed in hive-0.14
 https://issues.apache.org/jira/browse/HIVE-7079

 On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov apivova...@gmail.com
  wrote:

 Hi Everyone

 I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace
 if I run my query (which has a WITH block) via Oozie. (BTW, the query works
 fine in the CLI.)

 I can't post the exact query, but the structure is similar to:

 create table my_consumer
 as
 with sacusaloan as (select distinct e,f,g from E)

 select A.a, A.b, A.c,
   if(sacusaloan.id is null, 0, 1) as sacusaloan_status
 from (select a,b,c from A) A
 left join sacusaloan on (...)

 8799 [main] INFO  hive.ql.parse.ParseDriver  - Parse Completed
 8799 [main] INFO  org.apache.hadoop.hive.ql.log.PerfLogger  - /PERFLOG 
 method=parse start=1431723485500 end=1431723485602 duration=102 
 from=org.apache.hadoop.hive.ql.Driver
 8799 [main] INFO  org.apache.hadoop.hive.ql.log.PerfLogger  - PERFLOG 
 method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver
 8834 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - 
 Starting Semantic Analysis
 8837 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - 
 Creating table wk_qualified_outsource_loan_consumer position=13
 8861 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - 
 Completed phase 1 of Semantic Analysis
 8861 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - Get 
 metadata for source tables
 8865 [main] ERROR hive.ql.metadata.Hive  - 
 NoSuchObjectException(message:default.sacusaloan table not found)
  at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
  at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
  at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
  at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
  at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
  at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
  at 
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at 
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at 
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at 
 org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
  at com.sun.proxy.$Proxy18.getTable(Unknown Source)
  at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
  at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
  at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
  at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
  at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
  at 
 org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
  at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
  at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
  at 
 org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
  at 

Re: query uses WITH blocks and throws exception if run as Oozie hive action (hive-0.13.1)

2015-05-15 Thread Alexander Pivovarov
Looks like I found it
https://issues.apache.org/jira/browse/HIVE-9409

public class UDTFOperator
...

-  protected final Log LOG = LogFactory.getLog(this.getClass().getName());
+  protected static final Log LOG = LogFactory.getLog(UDTFOperator.class.getName());



On Fri, May 15, 2015 at 4:17 PM, Alexander Pivovarov apivova...@gmail.com
wrote:

 I also noticed another error message in the logs:

 10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
 Status: Failed
 10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
 Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
 diagnostics=[Vertex Input: dual initializer failed.,
 org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
 class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
 Serialization trace:
 LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
 childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
 childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
 aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]

 One of the WITH blocks used the explode() UDTF.
 I replaced it with select ... union all select ... union all select ...
 and the query works fine now.

 Do you know anything about UDTF and Kryo issues fixed after 0.13.1?


 On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov apivova...@gmail.com
  wrote:

 Looks like it was fixed in hive-0.14
 https://issues.apache.org/jira/browse/HIVE-7079

 On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov 
 apivova...@gmail.com wrote:

 Hi Everyone

 I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace
 if I run my query (which has a WITH block) via Oozie. (BTW, the query works
 fine in the CLI.)

 I can't post the exact query, but the structure is similar to:

 create table my_consumer
 as
 with sacusaloan as (select distinct e,f,g from E)

 select A.a, A.b, A.c,
   if(sacusaloan.id is null, 0, 1) as sacusaloan_status
 from (select a,b,c from A) A
 left join sacusaloan on (...)

 8799 [main] INFO  hive.ql.parse.ParseDriver  - Parse Completed
 8799 [main] INFO  org.apache.hadoop.hive.ql.log.PerfLogger  - /PERFLOG 
 method=parse start=1431723485500 end=1431723485602 duration=102 
 from=org.apache.hadoop.hive.ql.Driver
 8799 [main] INFO  org.apache.hadoop.hive.ql.log.PerfLogger  - PERFLOG 
 method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver
 8834 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - 
 Starting Semantic Analysis
 8837 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - 
 Creating table wk_qualified_outsource_loan_consumer position=13
 8861 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - 
 Completed phase 1 of Semantic Analysis
 8861 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - Get 
 metadata for source tables
 8865 [main] ERROR hive.ql.metadata.Hive  - 
 NoSuchObjectException(message:default.sacusaloan table not found)
 at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
 at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
 at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
 at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
 at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
 at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
 at 
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at 
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at 
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at 
 org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
 at com.sun.proxy.$Proxy18.getTable(Unknown Source)
 at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
 at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
 at 
 org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
 at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
 at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
 at 

Re: query uses WITH blocks and throws exception if run as Oozie hive action (hive-0.13.1)

2015-05-15 Thread Alexander Pivovarov
Looks like it was fixed in hive-0.14
https://issues.apache.org/jira/browse/HIVE-7079

On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov apivova...@gmail.com
wrote:

 Hi Everyone

 I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace
 if I run my query (which has a WITH block) via Oozie. (BTW, the query works
 fine in the CLI.)

 I can't post the exact query, but the structure is similar to:

 create table my_consumer
 as
 with sacusaloan as (select distinct e,f,g from E)

 select A.a, A.b, A.c,
   if(sacusaloan.id is null, 0, 1) as sacusaloan_status
 from (select a,b,c from A) A
 left join sacusaloan on (...)

 8799 [main] INFO  hive.ql.parse.ParseDriver  - Parse Completed
 8799 [main] INFO  org.apache.hadoop.hive.ql.log.PerfLogger  - /PERFLOG 
 method=parse start=1431723485500 end=1431723485602 duration=102 
 from=org.apache.hadoop.hive.ql.Driver
 8799 [main] INFO  org.apache.hadoop.hive.ql.log.PerfLogger  - PERFLOG 
 method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver
 8834 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - 
 Starting Semantic Analysis
 8837 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - 
 Creating table wk_qualified_outsource_loan_consumer position=13
 8861 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - 
 Completed phase 1 of Semantic Analysis
 8861 [main] INFO  org.apache.hadoop.hive.ql.parse.SemanticAnalyzer  - Get 
 metadata for source tables
 8865 [main] ERROR hive.ql.metadata.Hive  - 
 NoSuchObjectException(message:default.sacusaloan table not found)
   at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
   at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
   at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
   at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
   at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
   at 
 org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
   at 
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at 
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
   at 
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   at java.lang.reflect.Method.invoke(Method.java:606)
   at 
 org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
   at com.sun.proxy.$Proxy18.getTable(Unknown Source)
   at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
   at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
   at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
   at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
   at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
   at 
 org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
   at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
   at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
   at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
   at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
   at 
 org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
   at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
   at 
 org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
   at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
   at 
 org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
   at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
   at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
   at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
   at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
   at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
   at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at