[jira] [Commented] (HIVE-17002) decimal (binary) is not working when creating external table for hbase

2017-12-12 Thread Artur Tamazian (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-17002?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16288046#comment-16288046 ]

Artur Tamazian commented on HIVE-17002:
---

I don't have time right now, but I looked at your patch and it's almost exactly 
the same as what I ended up doing for our installation.
The only difference is in LazyDioHiveDecimal::init, where I initialized the data 
field like this:

{code}
data = new HiveDecimalWritable(
    HiveDecimal.create(Bytes.toBigDecimal(bytes.getData(), start, length)));
{code}
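For reference, the byte layout that Bytes.toBigDecimal works against can be sketched without the HBase dependency. The sketch below assumes (based on my reading of HBase's Bytes utility, so treat the wire format as an assumption) a 4-byte big-endian scale followed by the unscaled value's two's-complement bytes; DecimalBytes, toBytes, and toBigDecimal here are stand-in names for illustration, not HBase's actual classes.

{code}
import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.ByteBuffer;

public class DecimalBytes {
    // Assumed encoding of a BigDecimal (mirroring what HBase's
    // Bytes.toBytes(BigDecimal) appears to do): a 4-byte big-endian
    // scale, then the unscaled value's two's-complement bytes.
    static byte[] toBytes(BigDecimal d) {
        byte[] unscaled = d.unscaledValue().toByteArray();
        return ByteBuffer.allocate(4 + unscaled.length)
                .putInt(d.scale())
                .put(unscaled)
                .array();
    }

    // Decode a region of a byte array back into a BigDecimal,
    // mirroring the Bytes.toBigDecimal(bytes, offset, length) shape
    // used in the init fix above.
    static BigDecimal toBigDecimal(byte[] bytes, int offset, int length) {
        ByteBuffer buf = ByteBuffer.wrap(bytes, offset, length);
        int scale = buf.getInt();
        byte[] unscaled = new byte[length - 4];
        buf.get(unscaled);
        return new BigDecimal(new BigInteger(unscaled), scale);
    }

    public static void main(String[] args) {
        BigDecimal original = new BigDecimal("12345.67");
        byte[] encoded = toBytes(original);
        BigDecimal decoded = toBigDecimal(encoded, 0, encoded.length);
        System.out.println(original.equals(decoded)); // round-trip succeeds
    }
}
{code}

The round-trip illustrates why wrapping the decoded BigDecimal in HiveDecimal.create / HiveDecimalWritable is enough: the scale and unscaled value survive the binary representation intact.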

> decimal (binary) is not working when creating external table for hbase
> --
>
> Key: HIVE-17002
> URL: https://issues.apache.org/jira/browse/HIVE-17002
> Project: Hive
>  Issue Type: Bug
>Affects Versions: 2.1.1
> Environment: HBase 1.2.0, Hive 2.1.1
>Reporter: Artur Tamazian
>Assignee: Naveen Gangam
>
> I have a table in HBase which has a column stored using 
> Bytes.toBytes((BigDecimal) value). The HBase version is 1.2.0.
> I'm creating an external table in Hive to access it like this:
> {noformat}
> create external table `Users`(key int, ..., `example_column` decimal) 
> stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' 
> with serdeproperties ("hbase.columns.mapping" = ":key, 
> db:example_column") 
> tblproperties("hbase.table.name" = 
> "Users","hbase.table.default.storage.type" = "binary");
> {noformat}
> Table is created without errors. After that I try running "select * from 
> users;" and see this error:
> {noformat}
> org.apache.hive.service.cli.HiveSQLException:java.io.IOException: 
> java.lang.RuntimeException: java.lang.RuntimeException: Hive Internal Error: 
> no LazyObject for 
> org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyHiveDecimalObjectInspector@1f18cebb:25:24
>   
>
> org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:484
>   
>
> org.apache.hive.service.cli.operation.OperationManager:getOperationNextRowSet:OperationManager.java:308
>   
>
> org.apache.hive.service.cli.session.HiveSessionImpl:fetchResults:HiveSessionImpl.java:847
>   
>sun.reflect.GeneratedMethodAccessor11:invoke::-1  
>
> sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43
>   
>java.lang.reflect.Method:invoke:Method.java:498  
>
> org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:78
>   
>
> org.apache.hive.service.cli.session.HiveSessionProxy:access$000:HiveSessionProxy.java:36
>   
>
> org.apache.hive.service.cli.session.HiveSessionProxy$1:run:HiveSessionProxy.java:63
>   
>java.security.AccessController:doPrivileged:AccessController.java:-2  
>javax.security.auth.Subject:doAs:Subject.java:422  
>
> org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1698
>   
>
> org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:59
>   
>com.sun.proxy.$Proxy33:fetchResults::-1  
>org.apache.hive.service.cli.CLIService:fetchResults:CLIService.java:504  
>
> org.apache.hive.service.cli.thrift.ThriftCLIService:FetchResults:ThriftCLIService.java:698
>   
>
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1717
>   
>
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1702
>   
>org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39  
>org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39  
>
> org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:56
>   
>
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286
>   
>
> java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1142
>   
>
> java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:617
>   
>java.lang.Thread:run:Thread.java:748  
>*java.io.IOException:java.lang.RuntimeException: 
> java.lang.RuntimeException: Hive Internal Error: no LazyObject for 
> org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyHiveDecimalObjectInspector@1f18cebb:27:2
>   
>org.apache.hadoop.hive.ql.exec.FetchTask:fetch:FetchTask.java:164  
>org.apache.hadoop.hive.ql.Driver:getResults:Driver.java:2098  
>
> org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:479
>   
>*java.lang.RuntimeException:java.lang.RuntimeException: Hive Internal 
> Error: no LazyObject for 
> org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyHiveDecimalObjectInspector@1f18cebb:43:16
>   
>
> org.apache.hadoop.hive.serde2.lazy.LazyStruct:initLazyFields:LazyStruct.java:172
>   
>

[jira] [Commented] (HIVE-15883) HBase mapped table in Hive insert fail for decimal

2017-07-03 Thread Artur Tamazian (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-15883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16072145#comment-16072145 ]

Artur Tamazian commented on HIVE-15883:
---

Possibly related issue: https://issues.apache.org/jira/browse/HIVE-17002

> HBase mapped table in Hive insert fail for decimal
> --
>
> Key: HIVE-15883
> URL: https://issues.apache.org/jira/browse/HIVE-15883
> Project: Hive
>  Issue Type: Bug
>  Components: Hive
>Affects Versions: 1.1.0
>Reporter: Naveen Gangam
>Assignee: Naveen Gangam
> Attachments: HIVE-15883.patch
>
>
> CREATE TABLE hbase_table (
> id int,
> balance decimal(15,2))
> ROW FORMAT DELIMITED
> COLLECTION ITEMS TERMINATED BY '~'
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES (
> "hbase.columns.mapping"=":key,cf:balance#b");
> insert into hbase_table values (1,1);
> 
> Diagnostic Messages for this Task:
> Error: java.lang.RuntimeException: 
> org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while 
> processing row {"tmp_values_col1":"1","tmp_values_col2":"1"}
> at 
> org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:179)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1783)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime 
> Error while processing row {"tmp_values_col1":"1","tmp_values_col2":"1"}
> at 
> org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:507)
> at 
> org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:170)
> ... 8 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: 
> org.apache.hadoop.hive.serde2.SerDeException: java.lang.RuntimeException: 
> Hive internal error.
> at 
> org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:733)
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
> at 
> org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
> at 
> org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:97)
> at 
> org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
> at 
> org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:497)
> ... 9 more
> Caused by: org.apache.hadoop.hive.serde2.SerDeException: 
> java.lang.RuntimeException: Hive internal error.
> at 
> org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:286)
> at 
> org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:668)
> ... 15 more
> Caused by: java.lang.RuntimeException: Hive internal error.
> at 
> org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitive(LazyUtils.java:328)
> at 
> org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:220)
> at 
> org.apache.hadoop.hive.hbase.HBaseRowSerializer.serializeField(HBaseRowSerializer.java:194)
> at 
> org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:118)
> at 
> org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:282)
> ... 16 more 



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (HIVE-17002) decimal (binary) is not working when creating external table for hbase

2017-06-30 Thread Artur Tamazian (JIRA)

 [ https://issues.apache.org/jira/browse/HIVE-17002?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Artur Tamazian updated HIVE-17002:
--
Description: 
I have a table in HBase which has a column stored using 
Bytes.toBytes((BigDecimal) value). The HBase version is 1.2.0.

I'm creating an external table in Hive to access it like this:

{noformat}
create external table `Users`(key int, ..., `example_column` decimal) 
stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' 
with serdeproperties ("hbase.columns.mapping" = ":key, db:example_column") 
tblproperties("hbase.table.name" = 
"Users","hbase.table.default.storage.type" = "binary");
{noformat}

Table is created without errors. After that I try running "select * from 
users;" and see this error:

{noformat}
org.apache.hive.service.cli.HiveSQLException:java.io.IOException: 
java.lang.RuntimeException: java.lang.RuntimeException: Hive Internal Error: no 
LazyObject for 
org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyHiveDecimalObjectInspector@1f18cebb:25:24
  
   
org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:484
  
   
org.apache.hive.service.cli.operation.OperationManager:getOperationNextRowSet:OperationManager.java:308
  
   
org.apache.hive.service.cli.session.HiveSessionImpl:fetchResults:HiveSessionImpl.java:847
  
   sun.reflect.GeneratedMethodAccessor11:invoke::-1  
   
sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43
  
   java.lang.reflect.Method:invoke:Method.java:498  
   
org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:78
  
   
org.apache.hive.service.cli.session.HiveSessionProxy:access$000:HiveSessionProxy.java:36
  
   
org.apache.hive.service.cli.session.HiveSessionProxy$1:run:HiveSessionProxy.java:63
  
   java.security.AccessController:doPrivileged:AccessController.java:-2  
   javax.security.auth.Subject:doAs:Subject.java:422  
   
org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1698
  
   
org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:59
  
   com.sun.proxy.$Proxy33:fetchResults::-1  
   org.apache.hive.service.cli.CLIService:fetchResults:CLIService.java:504  
   
org.apache.hive.service.cli.thrift.ThriftCLIService:FetchResults:ThriftCLIService.java:698
  
   
org.apache.hive.service.rpc.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1717
  
   
org.apache.hive.service.rpc.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1702
  
   org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39  
   org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39  
   
org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:56
  
   
org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286
  
   
java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1142  
   
java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:617  
   java.lang.Thread:run:Thread.java:748  
   *java.io.IOException:java.lang.RuntimeException: java.lang.RuntimeException: 
Hive Internal Error: no LazyObject for 
org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyHiveDecimalObjectInspector@1f18cebb:27:2
  
   org.apache.hadoop.hive.ql.exec.FetchTask:fetch:FetchTask.java:164  
   org.apache.hadoop.hive.ql.Driver:getResults:Driver.java:2098  
   
org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:479
  
   *java.lang.RuntimeException:java.lang.RuntimeException: Hive Internal Error: 
no LazyObject for 
org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyHiveDecimalObjectInspector@1f18cebb:43:16
  
   
org.apache.hadoop.hive.serde2.lazy.LazyStruct:initLazyFields:LazyStruct.java:172
  
   org.apache.hadoop.hive.hbase.LazyHBaseRow:initFields:LazyHBaseRow.java:122  
   org.apache.hadoop.hive.hbase.LazyHBaseRow:getField:LazyHBaseRow.java:116  
   
org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector:getStructFieldData:LazySimpleStructObjectInspector.java:128
  
   
org.apache.hadoop.hive.ql.exec.ExprNodeColumnEvaluator:_evaluate:ExprNodeColumnEvaluator.java:94
  
   
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator:evaluate:ExprNodeEvaluator.java:77
  
   
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject:get:ExprNodeGenericFuncEvaluator.java:87
  
   
org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPEqual:evaluate:GenericUDFOPEqual.java:103
  
   
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator:_evaluate:ExprNodeGenericFuncEvaluator.java:186
  
   
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator:evaluate:ExprNodeEvaluator.java:77
  
   
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator:evaluate:ExprNodeEvaluator.java:65