[ https://issues.apache.org/jira/browse/SQOOP-443?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13268925#comment-13268925 ]
Hudson commented on SQOOP-443:
------------------------------
Integrated in Sqoop-ant-jdk-1.6 #111 (See
[https://builds.apache.org/job/Sqoop-ant-jdk-1.6/111/])
SQOOP-443. Calling sqoop with hive import is not working multiple times due
to kept output directory
(Jarek Jarcec Cecho via Kathleen Ting) (Revision 1334328)
Result = SUCCESS
kathleen :
Files :
* /sqoop/trunk/src/java/org/apache/sqoop/hive/HiveImport.java
> Calling sqoop with hive import is not working multiple times due to kept
> output directory
> ------------------------------------------------------------------------------------------
>
> Key: SQOOP-443
> URL: https://issues.apache.org/jira/browse/SQOOP-443
> Project: Sqoop
> Issue Type: Improvement
> Affects Versions: 1.4.0-incubating, 1.4.1-incubating
> Reporter: Jarek Jarcec Cecho
> Assignee: Jarek Jarcec Cecho
> Priority: Minor
> Fix For: 1.4.2-incubating
>
> Attachments: SQOOP-443.patch, SQOOP-443.patch
>
>
> Hive does not always remove the input directory when executing the "LOAD
> DATA" command. This input directory is actually Sqoop's export directory.
> Because the directory is kept, running the same Sqoop command a second time
> fails with the exception
> "org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory
> $table already exists".
> This can be worked around by removing the directory manually, but that puts
> an unnecessary burden on users. It also complicates running saved jobs,
> since an additional script execution is needed.
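The committed change in HiveImport.java is not reproduced in this message. As a rough, hypothetical illustration of the pattern it implements (removing a leftover export directory before the next run), the sketch below uses plain java.io rather than Hadoop's FileSystem API, purely to stay self-contained; the class and method names are invented for this example.

```java
import java.io.File;
import java.nio.file.Files;

// Hypothetical sketch: before re-running an import, remove the leftover
// export directory so the next MapReduce job does not fail with
// FileAlreadyExistsException. Sqoop itself operates on HDFS paths via
// Hadoop's FileSystem API; plain java.io keeps this sketch runnable anywhere.
public class CleanupSketch {

    // Recursively delete a directory tree, as the post-import cleanup must do.
    static void deleteRecursively(File dir) {
        File[] children = dir.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        dir.delete();
    }

    public static void main(String[] args) throws Exception {
        // Simulate a leftover export directory from a previous run.
        File exportDir = Files.createTempDirectory("table_export").toFile();
        new File(exportDir, "part-m-00000").createNewFile();

        if (exportDir.exists()) {
            deleteRecursively(exportDir); // cleanup before the next run
        }
        System.out.println(exportDir.exists() ? "still present" : "removed");
        // prints "removed"
    }
}
```

With the directory gone, a repeated invocation of the same import command no longer trips over a pre-existing output path.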
--
This message is automatically generated by JIRA.