Do you have permission to write to that path? And make sure you are looking
at the local filesystem, as Stephen has specified.
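Something like the following can confirm both at once (a rough sketch; the path is the one from your stderr output, so adjust it for your box):

```shell
# Directory reported by Hive in the thread below; substitute your own path.
DIR=/usr/home/hadoop/da1

ls -ld "$DIR"        # does the directory exist, and who owns it?
ls -l "$DIR"         # Hive names its output files 000000_0, 000001_0, ...
# Quick write-permission check:
touch "$DIR/.write_test" && rm "$DIR/.write_test" && echo "path is writable"
```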

Warm Regards,
Tariq
cloudfront.blogspot.com


On Tue, Jul 2, 2013 at 5:25 AM, Stephen Sprague <sprag...@gmail.com> wrote:

> you gotta admit that's kinda funny.  Your stderr output shows not just
> once but three times where it put the output, and in fact how many rows it
> put there.  And to top it off it reported 'SUCCESS'.
>
> but you're saying there's nothing there?
>
> now, call me crazy, but i would tend to believe hive over you - but that's
> just me. :)
>
> are you looking at the local filesystem on the same box you ran hive on?
>
>
> On Mon, Jul 1, 2013 at 4:01 PM, Raj Hadoop <hadoop...@yahoo.com> wrote:
>
>> Hi,
>>
>> My requirement is to load data from a (one-column) Hive view to a CSV
>> file. After loading, I don't see any file generated.
>>
>> I used the following commands to load data to a file from a view v_june1
>>
>> hive> set hive.io.output.fileformat=CSVTextFile;
>> hive> insert overwrite local directory '/usr/home/hadoop/da1/' select *
>> from v_june1_pgnum;
>>
>>
>> The console output is as follows:
>>
>>
>> MapReduce Total cumulative CPU time: 4 minutes 15 seconds 590 msec
>> Ended Job = job_201306141336_0113
>> Copying data to local directory /usr/home/hadoop/da1
>> Copying data to local directory /usr/home/hadoop/da1
>> 3281 Rows loaded to /usr/home/hadoop/da1
>> MapReduce Jobs Launched:
>> Job 0: Map: 21  Reduce: 6   Cumulative CPU: 255.59 sec   HDFS Read:
>> 5373722496 HDFS Write: 389069 SUCCESS
>> Total MapReduce CPU Time Spent: 4 minutes 15 seconds 590 msec
>> OK
>> Time taken: 148.764 seconds
>>
>>
>>
>> My question: I do not see any files created under /usr/home/hadoop/da1.
>> Where are the files created?
>>
>> Thanks,
>> Raj
>>
>>
>>
>