[
https://issues.apache.org/jira/browse/SQOOP-3181?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Boglarka Egyed reopened SQOOP-3181:
-----------------------------------
Reopening this JIRA because of a wrong Resolution field.
> Sqoop1 (import + --incremental + --merge-key + --as-parquetfile) fails with (Could not find class <CLASS>.)
> -----------------------------------------------------------------------------------------------------------
>
> Key: SQOOP-3181
> URL: https://issues.apache.org/jira/browse/SQOOP-3181
> Project: Sqoop
> Issue Type: Bug
> Reporter: Markus Kemper
> Assignee: Sandish Kumar HN
> Labels: bug, fix, sqoop
>
> Sqoop1 (import + --incremental + --merge-key + --as-parquetfile) fails with (Could not find class <CLASS>.). See the test case below.
> *Test Case*
> {noformat}
> #################
> # STEP 01 - Create Table and Data
> #################
> export MYCONN=jdbc:oracle:thin:@oracle.sqoop.com:1521/db11g;
> export MYUSER=sqoop
> export MYPSWD=sqoop
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "drop table t1"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "create table t1 (c1 int, c2 date, c3 varchar(10), c4 timestamp)"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "insert into t1 values (1, sysdate, 'NEW ROW 1', sysdate)"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1"
> Output:
> ----------------------------------------------------------------
> | C1 | C2                    | C3        | C4                  |
> ----------------------------------------------------------------
> | 1  | 2017-05-06 06:59:02.0 | NEW ROW 1 | 2017-05-06 06:59:02 |
> ----------------------------------------------------------------
> #################
> # STEP 02 - Import Data into HDFS
> #################
> hdfs dfs -rm -r /user/root/t1
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1 --target-dir /user/root/t1 --incremental lastmodified --check-column C4 --merge-key C1 --last-value '2017-01-01 00:00:00.0' --as-parquetfile --map-column-java C2=String,C4=String --num-mappers 1 --verbose
> hdfs dfs -ls /user/root/t1/*.parquet
> parquet-tools cat --json 'hdfs://namenode/user/root/t1/b65c1ca5-c8f0-44c6-8c60-8ee83161347f.parquet'
> Output:
> 17/05/06 07:01:34 INFO mapreduce.ImportJobBase: Transferred 2.627 KB in 23.6174 seconds (113.8988 bytes/sec)
> 17/05/06 07:01:34 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> 17/05/06 07:01:34 INFO tool.ImportTool: --last-value 2017-05-06 07:01:09.0
> ~~~~~
> -rw-r--r-- 3 root root 1144 2017-05-06 07:01 /user/root/t1/b65c1ca5-c8f0-44c6-8c60-8ee83161347f.parquet
> ~~~~~
> {"C1":"1","C2":"2017-05-06 06:59:02.0","C3":"NEW ROW 1","C4":"2017-05-06 06:59:02"}
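> # A hedged aside, not part of the original repro: a lastmodified import is
> # typically wrapped in a saved Sqoop job so that --last-value is recorded
> # automatically between runs (the job name "t1_incr" is illustrative):
> sqoop job --create t1_incr -- import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1 --target-dir /user/root/t1 --incremental lastmodified --check-column C4 --merge-key C1 --as-parquetfile --map-column-java C2=String,C4=String --num-mappers 1
> sqoop job --exec t1_incr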
> #################
> # STEP 03 - Insert New Row and Update Existing Row
> #################
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "insert into t1 values (2, sysdate, 'NEW ROW 2', sysdate)"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "update t1 set c3 = 'UPDATE 1', c4 = sysdate where c1 = 1"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1 order by c1"
> Output:
> ----------------------------------------------------------------
> | C1 | C2                    | C3        | C4                  |
> ----------------------------------------------------------------
> | 1  | 2017-05-06 06:59:02.0 | UPDATE 1  | 2017-05-06 07:04:40 |
> | 2  | 2017-05-06 07:04:38.0 | NEW ROW 2 | 2017-05-06 07:04:38 |
> ----------------------------------------------------------------
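> # A hedged check, not part of the original repro: with --incremental
> # lastmodified, the next import should pull the rows whose check column is
> # at or past the saved --last-value, i.e. updated row 1 and new row 2:
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1 where c4 >= to_timestamp('2017-05-06 07:01:09', 'YYYY-MM-DD HH24:MI:SS')"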
> #################
> # STEP 04 - Import Data into HDFS and Merge changes
> #################
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1 --target-dir /user/root/t1 --incremental lastmodified --check-column C4 --merge-key C1 --last-value '2017-05-06 07:01:09.0' --as-parquetfile --map-column-java C2=String,C4=String --num-mappers 1 --verbose
> Output:
> 17/05/06 07:06:43 INFO mapreduce.ImportJobBase: Transferred 2.6611 KB in 27.4934 seconds (99.1148 bytes/sec)
> 17/05/06 07:06:43 INFO mapreduce.ImportJobBase: Retrieved 2 records.
> 17/05/06 07:06:43 DEBUG util.ClassLoaderStack: Restoring classloader: java.net.FactoryURLClassLoader@121fdcee
> 17/05/06 07:06:43 INFO tool.ImportTool: Final destination exists, will run merge job.
> 17/05/06 07:06:43 DEBUG tool.ImportTool: Using temporary folder: 4bc6b65cd0194b81938f4660974ee392_T1
> 17/05/06 07:06:43 DEBUG util.ClassLoaderStack: Checking for existing class: T1
> 17/05/06 07:06:43 DEBUG util.ClassLoaderStack: Attempting to load jar through URL: jar:file:/tmp/sqoop-root/compile/6ed24910abcbc6ea38a1963bfce9a92d/codegen_T1.jar!/
> 17/05/06 07:06:43 DEBUG util.ClassLoaderStack: Previous classloader is java.net.FactoryURLClassLoader@121fdcee
> 17/05/06 07:06:43 DEBUG util.ClassLoaderStack: Testing class in jar: T1
> 17/05/06 07:06:43 ERROR tool.ImportTool: Import failed: java.io.IOException: Could not load jar /tmp/sqoop-root/compile/6ed24910abcbc6ea38a1963bfce9a92d/codegen_T1.jar into JVM. (Could not find class T1.)
>         at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:92)
>         at com.cloudera.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:36)
>         at org.apache.sqoop.tool.ImportTool.loadJars(ImportTool.java:120)
>         at org.apache.sqoop.tool.ImportTool.lastModifiedMerge(ImportTool.java:456)
>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:522)
>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
> Caused by: java.lang.ClassNotFoundException: T1
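> # A hedged diagnostic, not part of the original report: the DEBUG lines above
> # show the merge step probing codegen_T1.jar for a class named "T1"; listing
> # the jar contents would confirm what the parquet code path actually named
> # the generated class (jar path copied from the log, it differs per run):
> jar tf /tmp/sqoop-root/compile/6ed24910abcbc6ea38a1963bfce9a92d/codegen_T1.jar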
> {noformat}
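> A hedged control, not part of the original report: repeating STEP 02 and STEP 04 into a fresh text-format target (the /user/root/t1_text path is illustrative) follows the same lastModifiedMerge/loadJars path shown in the stack trace, so it would isolate the failure to the parquet code path:
> {noformat}
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1 --target-dir /user/root/t1_text --incremental lastmodified --check-column C4 --merge-key C1 --last-value '2017-01-01 00:00:00.0' --map-column-java C2=String,C4=String --num-mappers 1
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1 --target-dir /user/root/t1_text --incremental lastmodified --check-column C4 --merge-key C1 --last-value '2017-05-06 07:01:09.0' --map-column-java C2=String,C4=String --num-mappers 1
> {noformat}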
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)