Sorry, I missed that second part. I ran it:
$ hadoop dfs -cat /data/sample/*
1227422134|2|1|paid:44519,tax:2120,value:42399
On Nov 25, 2008, at 7:09 PM, Joydeep Sen Sarma wrote:
Can you please send the output of 'describe extended
activity_test'. This will help us understand what's happening
Table(tableName:activity_test, dbName:default, owner:josh,
createTime:1227667482, lastAccessTime:0, retention:0,
sd:StorageDescriptor(cols:[FieldSchema(name:occurred_at, type:int,
comment:null), FieldSchema(name:actor_id, type:int, comment:null),
FieldSchema(name:actee_id, type:int, comment:null), Fiel
Can you please send the output of 'describe extended activity_test'. This will
help us understand what's happening with all the create table parameters.
Also - as a sanity check - can you please check hadoop dfs -cat /data/sample/*
(to make sure data got loaded/moved into that dir)
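(A further check along the same lines: 'hadoop dfs -ls /data/sample' lists
the files in that directory and their sizes, which tells you whether
anything actually landed there even if -cat prints nothing.)

$ hadoop dfs -ls /data/sample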
-----Original Message-----
hive> CREATE EXTERNAL TABLE activity_test
    > (occurred_at INT, actor_id INT, actee_id INT,
    >  properties MAP<STRING, INT>)
    > ROW FORMAT DELIMITED
    > FIELDS TERMINATED BY '124'
    > COLLECTION ITEMS TERMINATED BY '44'
    > MAP KEYS TERMINATED BY '58'
    > LINES TERMINATED BY '10'
    > STORED AS TEXTFILE;
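If that DDL goes through, a quick way to confirm the delimiters line up
with the data is to select an individual map key. A minimal sketch,
assuming the table was created as above and the data sits in /data/sample
(the 'paid' key is just taken from the sample row):

hive> SELECT occurred_at, actor_id, actee_id, properties['paid']
    > FROM activity_test;

For the sample row this should print 1227422134, 2, 1, and 44519.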
Can you try putting the ASCII value within quotes, so for example FIELDS
TERMINATED BY '124', etc.?
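For reference, those decimal ASCII codes correspond to the delimiters in
the sample row like so:

  124 = '|'   (fields)
   44 = ','   (collection items)
   58 = ':'   (map keys)
   10 = '\n'  (lines)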
You can also look at the following file in the source to see an example of
how this is done:
ql/src/test/queries/clientpositive/input_dynamicserde.q
Ashish
-----Original Message-----
From: Josh
OK, so I'm trying to create an external table, load a delimited file into
it, and then do a basic select out of it. Here is a description of my
scenario, along with the steps I took and the results. Hopefully someone
can help me figure out what I'm doing wrong.
# Sample.tab
1227422134|2|1|paid:4