Hello Bejoy,

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 
--password SQOOP1 --table Dummy --hive-table dummyhive --create-hive-table 
--hive-import --hive-home HADOOP/hive --verbose

Still the same: no table has been created. I am not able to see the dummyhive 
table in Hive using the command
SHOW TABLES;

although the dummyhive table's data directory was created in HDFS at:
/user/hive/warehouse/dummyhive
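
Could this be a metastore issue? I am not sure which metastore is configured 
here, but if it is the default embedded Derby one, Hive creates a metastore_db 
directory under whatever working directory it was started from, so a CLI 
started from a different directory would not see the table. For example, I can 
look for stray metastore directories and re-run the query non-interactively:

find / -name metastore_db -type d 2>/dev/null
hive -e 'SHOW TABLES;'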


Please suggest
Yogesh Kumar

________________________________
From: Bejoy Ks [bejoy...@yahoo.com]
Sent: Thursday, July 05, 2012 5:29 PM
To: user@hive.apache.org
Subject: Re: Hive uploading

Hi Yogesh

Please try out this command

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 
--password SQOOP1 --table Dummy --hive-table dummyhive --create-hive-table 
--hive-import --hive-home HADOOP/hive --verbose


Regards
Bejoy KS

________________________________
From: "yogesh.kuma...@wipro.com" <yogesh.kuma...@wipro.com>
To: user@hive.apache.org; bejoy...@yahoo.com
Sent: Thursday, July 5, 2012 5:03 PM
Subject: RE: Hive uploading

Hi Bejoy

I have confirmed the Hive installation; it is the same for both.
I ran the command echo $HIVE_HOME in both the Sqoop terminal and the Hive 
terminal; both return the same path:
HADOOP/hive
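
Would checking the metastore connection settings help? Assuming a 
hive-site.xml is present under $HIVE_HOME/conf (with a default setup it may 
not be), something like:

grep -A 1 'javax.jdo.option.ConnectionURL' $HIVE_HOME/conf/hive-site.xml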

I am new to Hive and Sqoop; would you please give an example using the 
--verbose option with this command?


sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 
--password SQOOP1 --table Dummy --hive-table dummyhive --create-hive-table 
--hive-import --hive-home HADOOP/hive



Please help


________________________________
From: Bejoy Ks [bejoy...@yahoo.com]
Sent: Thursday, July 05, 2012 3:14 PM
To: user@hive.apache.org
Subject: Re: Hive uploading

Hi Yogesh

No issues seen at first look. Can you run the Sqoop import with the --verbose 
option and post the console dump?

Do you have multiple Hive installations? If so, please verify that you are 
using the same Hive both for the Sqoop import and for verifying the data with 
the Hive CLI (the Hive installation at HADOOP/hive).
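
For example, running the following in both terminals should point at the same 
binary and the same home:

which hive
echo $HIVE_HOME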

Regards
Bejoy KS

________________________________
From: "yogesh.kuma...@wipro.com" <yogesh.kuma...@wipro.com>
To: user@hive.apache.org
Sent: Thursday, July 5, 2012 2:58 PM
Subject: Hive uploading

Hi

I have created a table in MySQL named Dummy; it has 2 columns and 1 row of 
data.

I want to upload that table into Hive using the Sqoop tool. I used this 
command:


sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 
--password SQOOP1 --table Dummy --hive-table dummyhive --create-hive-table 
--hive-import --hive-home HADOOP/hive
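
As an aside, the console output below warns that passing --password on the 
command line is insecure; the equivalent form that prompts for the password 
instead would be:

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 -P 
--table Dummy --hive-table dummyhive --create-hive-table --hive-import 
--hive-home HADOOP/hive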


The table has been successfully uploaded into HDFS under /user/hive/warehouse, 
but when I run this command in Hive:

SHOW TABLES;

I don't find the dummyhive table in it.
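
The data files themselves are visible in HDFS. For example (assuming Sqoop's 
usual part-file naming, e.g. part-m-00000):

hadoop fs -ls /user/hive/warehouse/dummyhive
hadoop fs -cat /user/hive/warehouse/dummyhive/part-m-00000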

Please suggest and help.


Details of the command and output

mediaadmins-iMac-2:hive mediaadmin$ sqoop import --connect 
jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table 
Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home 
HADOOP/hive
12/07/05 11:09:15 WARN tool.BaseSqoopTool: Setting your password on the 
command-line is insecure. Consider using -P instead.
12/07/05 11:09:15 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for 
output. You can override
12/07/05 11:09:15 INFO tool.BaseSqoopTool: delimiters with 
--fields-terminated-by, etc.
12/07/05 11:09:15 INFO manager.MySQLManager: Preparing to use a MySQL streaming 
resultset.
12/07/05 11:09:15 INFO tool.CodeGenTool: Beginning code generation
12/07/05 11:09:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* 
FROM `Dummy` AS t LIMIT 1
12/07/05 11:09:16 INFO orm.CompilationManager: HADOOP_HOME is 
/HADOOP/hadoop-0.20.2/bin/..
12/07/05 11:09:16 INFO orm.CompilationManager: Found hadoop core jar at: 
/HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/382d1c58323cea76efd197632bebbfcd/Dummy.java 
uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 11:09:17 INFO orm.CompilationManager: Writing jar file: 
/tmp/sqoop-mediaadmin/compile/382d1c58323cea76efd197632bebbfcd/Dummy.jar
12/07/05 11:09:17 WARN manager.MySQLManager: It looks like you are importing 
from mysql.
12/07/05 11:09:17 WARN manager.MySQLManager: This transfer can be faster! Use 
the --direct
12/07/05 11:09:17 WARN manager.MySQLManager: option to exercise a 
MySQL-specific fast path.
12/07/05 11:09:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to 
convertToNull (mysql)
12/07/05 11:09:17 INFO mapreduce.ImportJobBase: Beginning import of Dummy
12/07/05 11:09:18 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT 
MIN(`Sno`), MAX(`Sno`) FROM `Dummy`
12/07/05 11:09:18 INFO mapred.JobClient: Running job: job_201207051104_0001
12/07/05 11:09:19 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 11:09:33 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 11:09:35 INFO mapred.JobClient: Job complete: job_201207051104_0001
12/07/05 11:09:35 INFO mapred.JobClient: Counters: 5
12/07/05 11:09:35 INFO mapred.JobClient:   Job Counters
12/07/05 11:09:35 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 11:09:35 INFO mapred.JobClient:   FileSystemCounters
12/07/05 11:09:35 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=8
12/07/05 11:09:35 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 11:09:35 INFO mapred.JobClient:     Map input records=1
12/07/05 11:09:35 INFO mapred.JobClient:     Spilled Records=0
12/07/05 11:09:35 INFO mapred.JobClient:     Map output records=1
12/07/05 11:09:35 INFO mapreduce.ImportJobBase: Transferred 8 bytes in 17.945 
seconds (0.4458 bytes/sec)
12/07/05 11:09:35 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 11:09:35 INFO hive.HiveImport: Removing temporary files from import 
process: Dummy/_logs
12/07/05 11:09:35 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 11:09:35 INFO manager.SqlManager: Executing SQL statement: SELECT t.* 
FROM `Dummy` AS t LIMIT 1
12/07/05 11:09:37 INFO hive.HiveImport: Logging initialized using configuration 
in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 11:09:37 INFO hive.HiveImport: Hive history 
file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051109_1901926452.txt
12/07/05 11:09:41 INFO hive.HiveImport: OK
12/07/05 11:09:41 INFO hive.HiveImport: Time taken: 3.934 seconds
12/07/05 11:09:41 INFO hive.HiveImport: Loading data to table default.dummyhive
12/07/05 11:09:41 INFO hive.HiveImport: OK
12/07/05 11:09:41 INFO hive.HiveImport: Time taken: 0.262 seconds
12/07/05 11:09:41 INFO hive.HiveImport: Hive import complete.



Why is this so? Please help me out.

Thanks & Regards
Yogesh Kumar
