The command below works fine:

sqoop import --connect jdbc:postgresql://localhost:7432/test_db \
  --driver org.postgresql.Driver --username pgadmin --password pgadmin@1234 \
  --table user1 \
  --fields-terminated-by '\001' \
  --lines-terminated-by '\012' \
  --hive-import \
  --hive-database test_db \
  --hive-table user1 \
  --hive-partition-key year \
  --hive-partition-value '2016' \
  --verbose \
  --delete-target-dir

This is the log while importing the data:
16/08/16 13:18:08 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE
IF NOT EXISTS `test_db`.`user1` ( `id` INT, `name` STRING, `birth_date`
STRING) COMMENT 'Imported by sqoop on 2016/08/16 13:18:08' PARTITIONED BY
(year STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES
TERMINATED BY '\012' STORED AS TEXTFILE
16/08/16 13:18:08 DEBUG hive.TableDefWriter: Load statement: LOAD DATA
INPATH 'hdfs://xxxxx:8020/user/xxx/user1' INTO TABLE `test_db`.`user1`
PARTITION (year='2016')
16/08/16 13:18:08 INFO hive.HiveImport: Loading uploaded data into Hive

The created table looks like this:
hive> describe user1;
OK
id                      int
name                    string
birth_date              string
year                    string

# Partition Information
# col_name              data_type               comment

year                    string
Time taken: 0.103 seconds, Fetched: 9 row(s)
hive> select * from user1;
OK
1       mahebub   2016-08-14 14:03:44.33542       2016
2       sayed   2016-08-14 14:04:04.248561      2016
Time taken: 0.132 seconds, Fetched: 2 row(s)
hive>
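
For completeness, the new partition can also be checked directly in the same
session (standard HiveQL; it should list the single partition year=2016):

hive> show partitions user1;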

On Tue, Aug 16, 2016 at 4:12 PM, Boglarka Egyed <[email protected]> wrote:

> Hi Mahebub,
>
> Based on the exception log, the problem could also be caused by the Hive
> metastore not being set up properly. Is it configured in hive-site.xml?
> Also, is hive-site.xml on the classpath?
>
> Documentation for Hive metastore configuration:
> https://cwiki.apache.org/confluence/display/Hive/AdminManual+MetastoreAdmin
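>
> A minimal hive-site.xml sketch for pointing clients at a remote metastore
> (the host and port below are hypothetical; adjust them to your cluster):
>
>   <property>
>     <!-- Thrift URI of the remote Hive metastore service -->
>     <name>hive.metastore.uris</name>
>     <value>thrift://localhost:9083</value>
>   </property>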
>
> On Tue, Aug 16, 2016 at 2:36 PM, Markus Kemper <[email protected]>
> wrote:
>
>> Hello Mahebub,
>>
>> The exception you are observing might suggest one of the following:
>>
>>    - The table does not exist in the database
>>    - The table name is case sensitive and is somehow not being found
>>    - The database user does not have permission to see the table
>>
>> What is the output from the following commands:
>>
>> sqoop list-tables --connect jdbc:postgresql://localhost:7432/test_db \
>>   --driver org.postgresql.Driver --username pgadmin --password pgadmin@1234
>>
>> sqoop eval --connect jdbc:postgresql://localhost:7432/test_db \
>>   --driver org.postgresql.Driver --username pgadmin --password pgadmin@1234 \
>>   --query "select count(*) from user1"
>>
>>
>> Markus Kemper
>> Customer Operations Engineer
>> www.cloudera.com
>>
>>
>> On Tue, Aug 16, 2016 at 7:07 AM, Mahebub Sayyed <[email protected]>
>> wrote:
>>
>>> Hello Boglarka and Markus,
>>>
>>> Thanks for the reply.
>>> This is my sqoop command:
>>>
>>> sqoop import --connect jdbc:postgresql://localhost:7432/test_db \
>>>   --driver org.postgresql.Driver --username pgadmin --password pgadmin@1234 \
>>>   --table user1 \
>>>   --fields-terminated-by '\001' \
>>>   --lines-terminated-by '\012' \
>>>   --hcatalog-database test \
>>>   --hcatalog-table user1 \
>>>   --hcatalog-partition-keys year,month,day \
>>>   --hcatalog-partition-values '2016,08,15' \
>>>   --verbose
>>>
>>> But I am getting this error:
>>>
>>> ERROR tool.ImportTool: Encountered IOException running import job: 
>>> java.io.IOException: NoSuchObjectException(message:test.user1 table not 
>>> found)
>>>         at 
>>> org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
>>>         at 
>>> org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
>>>         at 
>>> org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:343)
>>>         at 
>>> org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:783)
>>>         at 
>>> org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
>>>         at 
>>> org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
>>>         at 
>>> org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
>>>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
>>>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
>>>         at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
>>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
>>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
>>>         at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
>>> Caused by: NoSuchObjectException(message:test.user1 table not found)
>>>         at 
>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:34980)
>>>         at 
>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:34948)
>>>         at 
>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:34879)
>>>         at 
>>> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>>         at 
>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1214)
>>>         at 
>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1200)
>>>         at 
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1201)
>>>         at 
>>> org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
>>>         at 
>>> org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
>>>         at 
>>> org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
>>>         at 
>>> org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
>>>         ... 14 more
>>>
>>> On Mon, Aug 15, 2016 at 4:35 PM, Markus Kemper <[email protected]>
>>> wrote:
>>>
>>>> Hello Mahebub,
>>>>
>>>> Bogi is correct, and a great answer btw.
>>>>
>>>> To the best of my knowledge, with Sqoop and Hive partitioning the
>>>> following rules apply:
>>>> 1. With static partitions you can use either the --hive-import or the
>>>> --hcatalog options
>>>> 2. With dynamic partitions you can only use the --hcatalog options
>>>>
>>>> Example (static):
>>>> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD
>>>> --table t1 --columns c1,c2 --where "p1 = 1" --num-mappers 1 --hive-import
>>>> --hive-database default --hive-table t1_partition --hive-partition-key
>>>> <col> --hive-partition-value <value>
>>>>
>>>> Example (dynamic):
>>>> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD
>>>> --table t1 --hcatalog-database default --hcatalog-table t1_partitioned
>>>> --num-mappers 1 --verbose --where "c1 > 1"
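>>>>
>>>> Note that for the --hcatalog path the target table must already exist
>>>> in the metastore (or be created with --create-hcatalog-table). A
>>>> minimal sketch of such a table partitioned by year/month/day (the
>>>> column names here are hypothetical):
>>>>
>>>> CREATE TABLE default.t1_partitioned (
>>>>   c1 INT,
>>>>   c2 STRING)
>>>> PARTITIONED BY (year STRING, month STRING, day STRING)
>>>> STORED AS TEXTFILE;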
>>>>
>>>> Markus Kemper
>>>> Customer Operations Engineer
>>>> www.cloudera.com
>>>>
>>>>
>>>> On Mon, Aug 15, 2016 at 9:19 AM, Boglarka Egyed <[email protected]>
>>>> wrote:
>>>>
>>>>> Hi Mahebub,
>>>>>
>>>>> Unfortunately, using --hive-partition-key and --hive-partition-value
>>>>> restricts each Sqoop statement to importing into a single Hive
>>>>> partition. There is currently no support for Hive auto-partitioning.
>>>>> Instead, if a data set is to be imported into multiple partitions of a
>>>>> table, a separate Sqoop statement is needed for each partition, as
>>>>> sketched below.
>>>>>
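>>>>> A sketch of that pattern (one invocation per partition value; the
>>>>> connection placeholders are hypothetical):
>>>>>
>>>>> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD \
>>>>>   --table user1 --hive-import --hive-database test_db --hive-table user1 \
>>>>>   --hive-partition-key year --hive-partition-value '2015'
>>>>> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD \
>>>>>   --table user1 --hive-import --hive-database test_db --hive-table user1 \
>>>>>   --hive-partition-key year --hive-partition-value '2016'
>>>>>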
>>>>> However, using --hcatalog-partition-keys and --hcatalog-partition-values
>>>>> you can specify multiple static partition key/value pairs. Please find
>>>>> the details in the User Guide:
>>>>> https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_sqoop_hcatalog_integration
>>>>>
>>>>> Best Regards,
>>>>> Bogi
>>>>>
>>>>> On Mon, Aug 15, 2016 at 9:51 AM, Mahebub Sayyed <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> I need to create/import a Hive table with three partitions
>>>>>> (year/month/day) using Sqoop. I have checked --hive-partition-key
>>>>>> and --hive-partition-value in Sqoop. Using these parameters I have
>>>>>> created the partition year like this: --hive-partition-key year
>>>>>> --hive-partition-value '2016'. My question is how to pass multiple
>>>>>> values for the partition key and partition value to create partitions
>>>>>> like year/month/day.
>>>>>>
>>>>>> --
>>>>>> Regards,
>>>>>> Mahebub Sayyed
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> Regards,
>>> Mahebub Sayyed
>>>
>>
>>
>


-- 
Regards,
Mahebub Sayyed
