I am not familiar with JsonSerde, but your data might contain doubles that you 
didn't know about. LOAD does not check whether the data's actual types match 
the types specified in the schema.
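
For example, a JSON parser will typically return a bare (unquoted) number as a java.lang.Double no matter what type the Hive column is declared as. The two records below are made up just to illustrate the difference:

    {"xfrom": "alice@example.com", "xto": "12345"}   <-- "12345" is a JSON string, fine for a STRING column
    {"xfrom": "alice@example.com", "xto": 12345}     <-- bare number, comes back as a Double

Since LOAD only copies the data into the table's directory without parsing it, a record like the second one is not rejected at load time; the cast fails later, in the map task that deserializes xto. It might be worth checking the raw JSON for unquoted values in that field.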


From: Hock Zoltán [mailto:hock.zol...@gmail.com]
Sent: Thursday, September 30, 2010 8:46 AM
To: hive-user@hadoop.apache.org
Subject: java.lang.RuntimeException: java.lang.Double cannot be cast to 
java.lang.String

Hello,

I got the following exception when I ran a simple query on my table.
I created the table to store email header data with the following command:

CREATE TABLE mytable (
    xfrom STRING, xto STRING, subject STRING, cc STRING,
    xgeoip STRING, receivedspf STRING, xoriginatingip STRING,
    messageid STRING, replyto STRING, xenvelopefrom STRING,
    contenttype STRING, xoriginatingip2 STRING, version STRING,
    fromaddr STRING, recaddr STRING)
PARTITIONED BY (year INT, month INT, day INT, dat INT)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.JsonSerde';

Then I insert some data using LOAD DATA LOCAL INPATH "/mypipe" INTO TABLE 
mytable PARTITION(....); from my Java app.
The pipe contains JSON data, of course.
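
For example, for the partition I query below, the full statement looks like this (the dat value here is just an example of what my app fills in):

    LOAD DATA LOCAL INPATH '/mypipe' INTO TABLE mytable
    PARTITION (year=2010, month=9, day=30, dat=30);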

Everything works fine, but when I want to get some data:
hive> add jar hive-json-serde-0.1.jar;
hive> from mytable select xfrom where year=2010 and month=9 and day=30;
or
hive> from mytable select subject where year=2010 and month=9 and day=30;
these work fine.
BUT
hive> from mytable select xto where year=2010 and month=9 and day=30;
fails with the following error:
FAILED: Execution Error, return code 2 from 
org.apache.hadoop.hive.ql.exec.ExecDriver
in the log:

java.lang.RuntimeException: java.lang.Double cannot be cast to java.lang.String
        at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:188)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.lang.ClassCastException: java.lang.Double cannot be cast to java.lang.String
        at org.apache.hadoop.hive.serde2.objectinspector.primitive.JavaStringObjectInspector.getPrimitiveWritableObject(JavaStringObjectInspector.java:35)
        at org.apache.hadoop.hive.serde2.binarysortable.BinarySortableSerDe.serialize(BinarySortableSerDe.java:506)
        at org.apache.hadoop.hive.serde2.binarysortable.BinarySortableSerDe.serialize(BinarySortableSerDe.java:399)
        at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.processOp(ReduceSinkOperator.java:162)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:386)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:598)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:81)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:386)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:598)
        at org.apache.hadoop.hive.ql.exec.FilterOperator.processOp(FilterOperator.java:73)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:386)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:598)
        at org.apache.hadoop.hive.ql.exec.FilterOperator.processOp(FilterOperator.java:73)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:386)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:598)
        at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:43)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:386)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:598)
        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:350)
        at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:171)
        ... 4 more

I don't know where this Double value comes from; everything should be stored as a STRING.
The xto field contains the To field of the email header.


Thanks
 Zoltan
