If the output file is not too big, then the ^A characters can be replaced
afterwards with a simple command like-

$ tr "\001" "," < src_file > out_file
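For example, on a small sample (the \001 escape is the same ^A byte Hive emits; the file names are just placeholders):

```shell
# Build a tiny ^A-delimited sample, the way Hive writes text output by default
printf 'a\001b\001c\n1\0012\0013\n' > src_file
# Replace every ^A (octal \001) with a comma
tr '\001' ',' < src_file > out_file
cat out_file
# a,b,c
# 1,2,3
```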

Thanks,
Vinod

On Tue, Aug 7, 2012 at 10:27 AM, zuohua zhang <zuo...@gmail.com> wrote:

> Thanks so much, that did work! I have 200+ columns, so it is quite
> an ugly thing. No shortcut?
>
>
> On Mon, Aug 6, 2012 at 9:50 PM, Vinod Singh <vi...@vinodsingh.com> wrote:
>
>> Change the query to something like-
>>
>> INSERT OVERWRITE DIRECTORY '/outputtable.txt'
>> select concat(col1, ',', col2, ',', col3)  from myoutputtable;
>>
>> That way the columns will be separated by commas.
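As a possible shortcut for 200+ columns, the concat(...) expression can be generated from the table's column list instead of typed by hand. This is only a sketch: it assumes the `hive` CLI is available, the table is named myoutputtable, and that taking the first whitespace-separated field of each `DESCRIBE` output line yields the column names (the parsing may need adjusting for your Hive version):

```shell
# Hypothetical helper: list the column names (one per line) via DESCRIBE,
# then splice in the ', ',', ' separators to build the concat() argument list.
cols=$(hive -e "DESCRIBE myoutputtable" | awk '{print $1}')
expr=$(echo "$cols" | paste -sd, - | sed "s/,/, ',', /g")
echo "select concat($expr) from myoutputtable;"
```

The printed statement can then be pasted into the INSERT OVERWRITE DIRECTORY query.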
>>
>> Thanks,
>> Vinod
>>
>>
>> On Tue, Aug 7, 2012 at 10:16 AM, zuohua zhang <zuo...@gmail.com> wrote:
>>
>>> I used the following, so why doesn't it help?
>>>
>>> ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
>>>
>>> On Mon, Aug 6, 2012 at 9:43 PM, Vinod Singh <vi...@vinodsingh.com> wrote:
>>>
>>>> Columns of a Hive table are separated by the ^A (Ctrl-A, \001) character
>>>> by default, and INSERT OVERWRITE DIRECTORY writes its output with that
>>>> default delimiter regardless of the source table's ROW FORMAT. Instead of
>>>> doing a "SELECT * ", you may like to use the concat function to get a
>>>> separator of your choice.
>>>>
>>>> Thanks,
>>>> Vinod
>>>>
>>>>
>>>> On Tue, Aug 7, 2012 at 9:39 AM, zuohua zhang <zuo...@gmail.com> wrote:
>>>>
>>>>> I have used the following to output a hive table to a file:
>>>>> DROP TABLE IF EXISTS myoutputtable;
>>>>> CREATE TABLE myoutputtable
>>>>> ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
>>>>> STORED AS TEXTFILE
>>>>> AS
>>>>> select
>>>>> *
>>>>> from originaltable;
>>>>> INSERT OVERWRITE DIRECTORY '/outputtable.txt'
>>>>> select * from myoutputtable;
>>>>>
>>>>> then i used
>>>>> hadoop dfs -getmerge /outputtable.txt /mnt/
>>>>>
>>>>> but the /mnt/outputtable.txt file shows strange characters ^A in the
>>>>> file. What did I do wrong?
>>>>>
>>>>
>>>>
>>>
>>
>