[jira] [Updated] (HIVE-7473) Null values in DECIMAL columns cause serialization issues with HCatalog

2014-07-27 Thread Navis (JIRA)

 [ https://issues.apache.org/jira/browse/HIVE-7473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Navis updated HIVE-7473:
------------------------

       Resolution: Fixed
    Fix Version/s: 0.14.0
            Status: Resolved  (was: Patch Available)

Committed to trunk. Thanks, Craig, for the contribution.

 Null values in DECIMAL columns cause serialization issues with HCatalog
 ------------------------------------------------------------------------
                 Key: HIVE-7473
                 URL: https://issues.apache.org/jira/browse/HIVE-7473
             Project: Hive
          Issue Type: Bug
          Components: Serializers/Deserializers
    Affects Versions: 0.13.1
            Reporter: Craig Condit
            Assignee: Craig Condit
             Fix For: 0.14.0

         Attachments: HIVE-7473.patch


 WritableHiveDecimalObjectInspector appears to be missing null checks in 
 getPrimitiveWritableObject(Object) and getPrimitiveJavaObject(Object). The 
 same checks do exist in WritableHiveVarcharObjectInspector.
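 A minimal sketch of the kind of null guard being described, modeled on the
 varchar inspector (hypothetical, for illustration only; the authoritative
 change is the attached HIVE-7473.patch, and the real accessors may also
 enforce precision/scale):
 {noformat}
 // Hypothetical, standalone illustration of the null pass-through; not the
 // literal patch. Assumes hive-common/hive-serde (0.13-era) on the classpath.
 import org.apache.hadoop.hive.common.type.HiveDecimal;
 import org.apache.hadoop.hive.serde2.io.HiveDecimalWritable;

 public class NullSafeDecimalAccess {

   // Mirrors getPrimitiveJavaObject(Object): a null Writable must yield a
   // null HiveDecimal instead of a NullPointerException.
   static HiveDecimal toJava(Object o) {
     return o == null ? null : ((HiveDecimalWritable) o).getHiveDecimal();
   }

   // Mirrors getPrimitiveWritableObject(Object): same null pass-through.
   static HiveDecimalWritable toWritable(Object o) {
     return o == null ? null : (HiveDecimalWritable) o;
   }

   public static void main(String[] args) {
     System.out.println(toJava(null));      // null, no NPE
     System.out.println(toWritable(null));  // null, no NPE
     System.out.println(toJava(new HiveDecimalWritable(HiveDecimal.create("1.5")))); // 1.5
   }
 }
 {noformat}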
 Attempting to read from a table in HCatalog containing null values for 
 decimal columns results in the following exception (Pig used here):
 {noformat}
 Error: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
   at org.apache.hive.hcatalog.pig.HCatBaseLoader.getNext(HCatBaseLoader.java:76)
   at org.apache.hive.hcatalog.pig.HCatLoader.getNext(HCatLoader.java:58)
   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.nextKeyValue(PigRecordReader.java:211)
   at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:532)
   at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
   at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
   at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
   at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
   at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
   at java.security.AccessController.doPrivileged(Native Method)
   at javax.security.auth.Subject.doAs(Subject.java:415)
   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
   at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
 Caused by: java.lang.NullPointerException
   at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector.getPrimitiveJavaObject(WritableHiveDecimalObjectInspector.java:43)
   at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector.getPrimitiveJavaObject(WritableHiveDecimalObjectInspector.java:26)
   at org.apache.hive.hcatalog.data.HCatRecordSerDe.serializePrimitiveField(HCatRecordSerDe.java:269)
   at org.apache.hive.hcatalog.data.HCatRecordSerDe.serializeField(HCatRecordSerDe.java:192)
   at org.apache.hive.hcatalog.data.LazyHCatRecord.get(LazyHCatRecord.java:53)
   at org.apache.hive.hcatalog.data.LazyHCatRecord.get(LazyHCatRecord.java:97)
   at org.apache.hive.hcatalog.mapreduce.HCatRecordReader.nextKeyValue(HCatRecordReader.java:204)
   at org.apache.hive.hcatalog.pig.HCatBaseLoader.getNext(HCatBaseLoader.java:63)
   ... 13 more
 {noformat}





[jira] [Updated] (HIVE-7473) Null values in DECIMAL columns cause serialization issues with HCatalog

2014-07-23 Thread Navis (JIRA)


Navis updated HIVE-7473:
------------------------

    Status: Patch Available  (was: Open)



[jira] [Updated] (HIVE-7473) Null values in DECIMAL columns cause serialization issues with HCatalog

2014-07-23 Thread Navis (JIRA)


Navis updated HIVE-7473:
------------------------

    Assignee: Craig Condit



[jira] [Updated] (HIVE-7473) Null values in DECIMAL columns cause serialization issues with HCatalog

2014-07-22 Thread Craig Condit (JIRA)


Craig Condit updated HIVE-7473:
-------------------------------

    Description: 
WritableHiveDecimalObjectInspector appears to be missing null checks in 
getPrimitiveWritableObject(Object) and getPrimitiveJavaObject(Object). The same 
checks do exist in WritableHiveVarcharObjectInspector.

Attempting to read from a table in HCatalog containing null values for decimal 
columns results in the following exception (Pig used here):

{noformat}
Error: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
  at org.apache.hive.hcatalog.pig.HCatBaseLoader.getNext(HCatBaseLoader.java:76)
  at org.apache.hive.hcatalog.pig.HCatLoader.getNext(HCatLoader.java:58)
  at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.nextKeyValue(PigRecordReader.java:211)
  at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:532)
  at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
  at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.lang.NullPointerException
  at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector.getPrimitiveJavaObject(WritableHiveDecimalObjectInspector.java:43)
  at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector.getPrimitiveJavaObject(WritableHiveDecimalObjectInspector.java:26)
  at org.apache.hive.hcatalog.data.HCatRecordSerDe.serializePrimitiveField(HCatRecordSerDe.java:269)
  at org.apache.hive.hcatalog.data.HCatRecordSerDe.serializeField(HCatRecordSerDe.java:192)
  at org.apache.hive.hcatalog.data.LazyHCatRecord.get(LazyHCatRecord.java:53)
  at org.apache.hive.hcatalog.data.LazyHCatRecord.get(LazyHCatRecord.java:97)
  at org.apache.hive.hcatalog.mapreduce.HCatRecordReader.nextKeyValue(HCatRecordReader.java:204)
  at org.apache.hive.hcatalog.pig.HCatBaseLoader.getNext(HCatBaseLoader.java:63)
  ... 13 more
{noformat}


[jira] [Updated] (HIVE-7473) Null values in DECIMAL columns cause serialization issues with HCatalog

2014-07-22 Thread Craig Condit (JIRA)


Craig Condit updated HIVE-7473:
-------------------------------

    Attachment: HIVE-7473.patch

Patch which fixes the issue.
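
A quick way to exercise the fix (a hypothetical smoke test, assuming the stock
writable-decimal inspector exposed by PrimitiveObjectInspectorFactory;
unpatched 0.13.1 throws NullPointerException on both calls):
{noformat}
// Hypothetical smoke test; assumes hive-serde on the classpath.
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector;

public class Hive7473SmokeTest {
  public static void main(String[] args) {
    WritableHiveDecimalObjectInspector oi =
        PrimitiveObjectInspectorFactory.writableHiveDecimalObjectInspector;
    // With the patch applied, both accessors should pass the null through.
    System.out.println(oi.getPrimitiveJavaObject(null));     // expect: null
    System.out.println(oi.getPrimitiveWritableObject(null)); // expect: null
  }
}
{noformat}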
