Hi Mike,

  The code throws an IllegalDataException only when the column value is < 0.
  I tried a quick unit test and it works fine. Can you please confirm that
the data is valid, and could you also share the schema of the table?
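
  In case it is useful, this is roughly the kind of check I ran (a minimal
sketch, with a made-up class name, assuming the PDataType enum API on the
4.0 branch; the exact overloads may differ in your version):

    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.phoenix.schema.IllegalDataException;
    import org.apache.phoenix.schema.PDataType;

    public class UnsignedFloatCheck {
        public static void main(String[] args) {
            // A non-negative value round-trips through UNSIGNED_FLOAT fine.
            byte[] ok = PDataType.UNSIGNED_FLOAT.toBytes(1.5f);
            System.out.println(PDataType.UNSIGNED_FLOAT.toObject(ok)); // 1.5

            // UNSIGNED_FLOAT is decoded from the raw HBase float bytes, so
            // stored bytes that decode to a negative value are rejected.
            byte[] negativeBytes = Bytes.toBytes(-1.5f);
            try {
                PDataType.UNSIGNED_FLOAT.toObject(negativeBytes);
            } catch (IllegalDataException e) {
                System.out.println("rejected: " + e);
            }
        }
    }

  The second decode throws the same IllegalDataException as in your stack
trace, which is why I suspect some rows in that column hold negative values.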

Regards
Ravi

On Mon, Nov 17, 2014 at 10:26 PM, Ravi Kiran <maghamraviki...@gmail.com>
wrote:

> Hi Mike,
>
>     UNSIGNED_FLOAT is mapped to the Float datatype of Pig:
> https://github.com/apache/phoenix/blob/4.0/phoenix-pig/src/main/java/org/apache/phoenix/pig/util/TypeUtil.java#L77
>     There seems to be an issue, so I have raised a ticket to track it:
> https://issues.apache.org/jira/browse/PHOENIX-1464
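>
>     For what it is worth, here is a rough sketch of what that mapping
> amounts to for the float types (paraphrased, not the exact code at that
> link, and the class/method names are just for the sketch):
>
>         import org.apache.pig.data.DataType;
>         import org.apache.phoenix.schema.PDataType;
>
>         // Sketch only: where the Phoenix float family lands on the Pig side.
>         public class FloatMappingSketch {
>             static byte toPigType(PDataType phoenixType) {
>                 switch (phoenixType) {
>                     case FLOAT:
>                     case UNSIGNED_FLOAT:
>                         return DataType.FLOAT;   // Pig "float"
>                     case DOUBLE:
>                     case UNSIGNED_DOUBLE:
>                         return DataType.DOUBLE;  // Pig "double"
>                     default:
>                         throw new IllegalArgumentException("not a float type: " + phoenixType);
>                 }
>             }
>         }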
>
> Regards
> Ravi
>
>
>
>
> On Mon, Nov 17, 2014 at 3:25 PM, Mike Friedman <mike.fried...@ds-iq.com>
> wrote:
>
>> Hi,
>>
>> I am wondering what Pig data type I should use with a Phoenix
>> UNSIGNED_FLOAT column. Using "float" in the load statement results in the
>> following error:
>>
>> 2014-11-17 14:30:55,098 FATAL [IPC Server handler 11 on 32806] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1415884152130_0045_m_000000_0 - exited : org.apache.phoenix.schema.IllegalDataException
>>         at org.apache.phoenix.schema.PDataType$UnsignedFloatCodec.decodeFloat(PDataType.java:6430)
>>         at org.apache.phoenix.schema.PDataType$7.toObject(PDataType.java:1051)
>>         at org.apache.phoenix.schema.PDataType$7.toObject(PDataType.java:924)
>>         at org.apache.phoenix.schema.PDataType.toObject(PDataType.java:6914)
>>         at org.apache.phoenix.schema.PDataType$20.toObject(PDataType.java:2791)
>>         at org.apache.phoenix.schema.PDataType.toObject(PDataType.java:6930)
>>         at org.apache.phoenix.compile.ExpressionProjector.getValue(ExpressionProjector.java:75)
>>         at org.apache.phoenix.jdbc.PhoenixResultSet.getObject(PhoenixResultSet.java:482)
>>         at org.apache.phoenix.pig.hadoop.PhoenixRecord.read(PhoenixRecord.java:94)
>>         at org.apache.phoenix.pig.hadoop.PhoenixRecordReader.nextKeyValue(PhoenixRecordReader.java:130)
>>         at org.apache.phoenix.pig.PhoenixHBaseLoader.getNext(PhoenixHBaseLoader.java:190)
>>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.nextKeyValue(PigRecordReader.java:211)
>>         at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:553)
>>         at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
>>         at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
>>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>
>> Thanks.
>>
>>
>> Mike
>>
>
>
