Hi
I am facing an issue with export to Oracle:
Scenario:
Sqoop export from HDFS to Oracle, where the schema and object names in Oracle
have to be mixed case.
Issue:
I created a schema (TestV500) and a table (Xy) in Oracle 11g using these sqoop
eval commands:
sqoop eval --connect jdbc:oracle:thin:@adhoc-oracle11:1521:orcl -username test
-password password -query "CREATE USER \"TestV500\" identified by \"password\""
sqoop eval --connect jdbc:oracle:thin:@adhoc-oracle11:1521:orcl -username test
-password password -query "GRANT CONNECT TO \"TestV500\""
sqoop eval --connect jdbc:oracle:thin:@adhoc-oracle11:1521:orcl -username test
-password password -query "ALTER USER \"TestV500\" QUOTA UNLIMITED ON USERS"
sqoop eval --connect jdbc:oracle:thin:@adhoc-oracle11:1521:orcl -username test
-password password -query "CREATE TABLE \"TestV500\".\"Xy\"( \"a\"
NVARCHAR2(255) DEFAULT NULL, \"x\" NUMBER(10,0) DEFAULT NULL, \"y\"
NUMBER(10,0) DEFAULT NULL )"
Installations used:
Oracle Version: 11.1.0.6.0
Sqoop Version: 1.4.5-cdh5.4.1
Ojdbc version: ojdbc6.jar
I am now trying to load the table using a tab-delimited file:
/usr/bin/sudo -u hdfs sqoop export --connect
jdbc:oracle:thin:@adhoc-oracle11:1521:orcl -username test -password password
--export-dir "/hdfs_nfs_mount/tmp/oracle/TestV500/Xy.txt"
--table "\"\"TestV500\".\"Xy\"\""
--fields-terminated-by "\t" --input-null-string null -m 1
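(As an aside, for anyone reproducing this: what the shell actually hands to Sqoop as the --table value can be checked with a plain echo. Inside double quotes, each \" becomes a literal ", so the doubled \"\" at each end produces doubled quotes in the result.)

```shell
# Print the exact string the shell passes as the --table argument.
echo "\"\"TestV500\".\"Xy\"\""
# prints: ""TestV500"."Xy""
```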
I get this error:
15/11/12 09:33:03 INFO tool.CodeGenTool: Beginning code generation
15/11/12 09:33:03 DEBUG manager.OracleManager: Using column names query: SELECT
t.* FROM "TestV500"."Xy" t WHERE 1=0
15/11/12 09:33:03 DEBUG manager.SqlManager: Execute getColumnInfoRawQuery :
SELECT t.* FROM "TestV500"."Xy" t WHERE 1=0
15/11/12 09:33:03 DEBUG manager.OracleManager: Creating a new connection for
jdbc:oracle:thin:@adhoc-oracle11:1521:orcl, using username: test
15/11/12 09:33:03 DEBUG manager.OracleManager: No connection paramenters
specified. Using regular API for making connection.
15/11/12 09:33:03 INFO manager.OracleManager: Time zone has been set to GMT
15/11/12 09:33:03 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
15/11/12 09:33:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.*
FROM "TestV500"."Xy" t WHERE 1=0
15/11/12 09:33:04 DEBUG manager.SqlManager: Found column a of type [12, 255, 0]
15/11/12 09:33:04 DEBUG manager.SqlManager: Found column x of type [2, 10, 0]
15/11/12 09:33:04 DEBUG manager.SqlManager: Found column y of type [2, 10, 0]
15/11/12 09:33:04 DEBUG manager.OracleManager$ConnCache: Caching released
connection for jdbc:oracle:thin:@adhoc-oracle11:1521:orcl/test
15/11/12 09:33:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
java.lang.IllegalArgumentException: There is no column found in the target
table "TestV500"."Xy". Please ensure that your table name is correct.
java.lang.IllegalArgumentException: There is no column found in the target
table "TestV500"."Xy". Please ensure that your table name is correct.
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1658)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
This is the input file Xy.txt:
a 0 0
a 1 4
a 2 2
Observations:
1. The schema and table were created successfully in Oracle.
2. If I connect to the database directly as the 'test' user and run this
command, it works fine (it returns zero rows, as nothing is loaded yet):
SELECT t.* FROM "TestV500"."Xy" t WHERE 1=0;
3. If I create a schema called testv500 with a table called xy (i.e. without
mixed case), the exact same Sqoop export command loads the input file into the
table.
Unless I am missing something here, I wonder if I have hit a bug in handling
mixed-case identifiers.
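For comparison, a single level of escaping would yield just one pair of quotes per identifier (shown here with echo; whether that is the form Sqoop expects for --table is exactly what I am unsure about):

```shell
# One escaped quote per identifier boundary yields a single
# pair of quotes around each name.
echo "\"TestV500\".\"Xy\""
# prints: "TestV500"."Xy"
```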
Any help will be great!
Thank You,
RV