I am trying to create the table and then INSERT OVERWRITE into it, so the data is 
supposed to be generated by the query itself rather than loaded from existing files on HDFS.
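
Roughly like this (just a sketch of what I mean; the source table and the ds value 
below are placeholders, not my real query):

insert overwrite table ${HIVETBL_my_table} partition (ds = '20130109')
select nid, userid, spv, sipv, pay, spay, ipv, sellerid, cate
from some_source_table   -- placeholder source table, the real query selects from my own tables
where ds = '20130109';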






At 2013-01-09 17:17:06,"Anurag Tangri" <tangri.anu...@gmail.com> wrote:

Hi Richard,
You should set the format in the CREATE EXTERNAL TABLE command based on the format 
of your data on HDFS.
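
For example (just a sketch, with made-up table, column and path names), for plain 
text data on HDFS:

create external table if not exists my_table (col1 string, col2 bigint)
row format delimited fields terminated by '\001'
stored as textfile
location '/path/to/data/on/hdfs';

Use "stored as sequencefile" instead if the files on HDFS are sequence files.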


Is your data a text file or a sequence file on HDFS?


Thanks,
Anurag Tangri

Sent from my iPhone

On Jan 9, 2013, at 12:49 AM, Richard  <codemon...@163.com> wrote:


More information:


If I set the format to textfile, there is no leading tab.
If I set the format to sequencefile and view the content via hadoop fs -text, I 
see a tab at the beginning of each line.
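
This is what I ran to look at the file (the partition directory and file name here 
are just examples, not the exact ones):

hadoop fs -text ${HADOOP_PATH_4_MY_HIVE}/${HIVETBL_my_table}/ds=20130109/000000_0 | head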


At 2013-01-09 15:44:00,Richard <codemon...@163.com> wrote:

hi there


I have a problem creating a Hive table.
No matter what field delimiter I use, I always get a tab at the beginning of each 
line (a line is a record).
Something like this:
\t f1 \001 f2 \001 f3 ...
where f1, f2, f3 denote the field values and \001 is the field separator.


Here is the statement I used:
create external table if not exists ${HIVETBL_my_table}
(
  nid string,
  userid string,
  spv bigint,
  sipv bigint,
  pay bigint,
  spay bigint,
  ipv bigint,
  sellerid string,
  cate string
)
partitioned by (ds string)
row format delimited fields terminated by '\001' lines terminated by '\n'
stored as sequencefile
location '${HADOOP_PATH_4_MY_HIVE}/${HIVETBL_my_table}';


Thanks for the help.


Richard




