[
https://issues.apache.org/jira/browse/SQOOP-1393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14133142#comment-14133142
]
Qian Xu commented on SQOOP-1393:
--------------------------------
[~richard_zhou] I've checked your patch; a warning is expected to be shown if
{{HIVE_HOME}} or {{HCAT_HOME}} is not set.
{code}
if (options.doHiveImport()
    && (options.getFileLayout() == SqoopOptions.FileLayout.ParquetFile)) {
+   String hiveHome = options.getHiveHome();
+   if (null != hiveHome) {
+     File hiveHomeFile = new File(hiveHome);
+     File hiveLibFile = new File(hiveHomeFile, "lib");
+     if (hiveLibFile.exists()) {
+       addDirToCache(hiveLibFile, fs, localUrls);
+     }
+   } else {
+     LOG.warn("HIVE_HOME is unset. Cannot add hive libs as dependencies.");
+   }
+ }
{code}
Please help ensure that the environment variable check covers this case. I'd
suggest showing an error message (instead of a warning) when it is not possible
to add the hive-related libs to the jar build path, since that will cause the
import to fail.
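To illustrate the fail-fast behavior suggested above, here is a minimal
standalone sketch. The helper class and method names ({{HiveLibCheck}},
{{resolveHiveLibDir}}) are hypothetical and are not part of Sqoop's code; the
point is only that a missing {{HIVE_HOME}} or lib directory raises an error
rather than logging a warning and letting the import fail later.

{code}
import java.io.File;

public class HiveLibCheck {

  /**
   * Hypothetical helper: resolve HIVE_HOME/lib, throwing instead of
   * warning when the hive libs cannot be added as dependencies.
   */
  static File resolveHiveLibDir(String hiveHome) {
    if (hiveHome == null) {
      // Fail fast: without hive libs the Parquet import cannot succeed.
      throw new IllegalStateException(
          "HIVE_HOME is unset. Cannot add hive libs as dependencies.");
    }
    File hiveLibFile = new File(hiveHome, "lib");
    if (!hiveLibFile.exists()) {
      throw new IllegalStateException(
          "No lib directory found under HIVE_HOME: " + hiveLibFile);
    }
    return hiveLibFile;
  }
}
{code}

With this shape, the caller (e.g. the job setup path) surfaces the problem
immediately instead of continuing to a doomed import.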
> Import data from database to Hive as Parquet files
> --------------------------------------------------
>
> Key: SQOOP-1393
> URL: https://issues.apache.org/jira/browse/SQOOP-1393
> Project: Sqoop
> Issue Type: Sub-task
> Components: tools
> Reporter: Qian Xu
> Assignee: Richard
> Fix For: 1.4.6
>
> Attachments: patch.diff, patch_v2.diff, patch_v3.diff
>
>
> Importing data to Hive as Parquet files can be separated into two steps:
> 1. Import an individual table from an RDBMS to HDFS as a set of Parquet files.
> 2. Import the data into Hive by generating and executing a CREATE TABLE
> statement that defines the data's layout in Hive as a Parquet-format table.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)