[ https://issues.apache.org/jira/browse/SQOOP-2949?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16667424#comment-16667424 ]
Fero Szabo commented on SQOOP-2949:
-----------------------------------

Hi [~gireeshp],

As we agreed offline, I've developed tests for this fix in SQOOP-3400. I've also posted your change on Review Board, because it's required for the tests. Hope you don't mind! (I mentioned that you developed it; I wouldn't want to steal the credit!)

In any case, please feel free to review the tests if you can find the time!

Bests,
Fero

> SQL Syntax error when split-by column is of character type and min or max value has single quote inside it
> -----------------------------------------------------------------------------------------------------------
>
>                 Key: SQOOP-2949
>                 URL: https://issues.apache.org/jira/browse/SQOOP-2949
>             Project: Sqoop
>          Issue Type: Bug
>    Affects Versions: 1.4.6
>        Environment: Sqoop 1.4.6
>                     Run on Hadoop 2.6.0
>                     On Ubuntu
>            Reporter: Gireesh Puthumana
>            Assignee: Gireesh Puthumana
>            Priority: Major
>
> Did a Sqoop import from the MySQL table "emp", with split-by column "ename", which is of type varchar(100).
>
> +Used the below command:+
> sqoop import --connect jdbc:mysql://localhost/testdb --username root --password ***** --table emp --m 2 --target-dir /sqoopTest/5 --split-by ename;
>
> +ename has the following records:+
> | ename |
> | gireesh |
> | aavesh |
> | shiva' |
> | jamir |
> | balu |
> | santosh |
> | sameer |
>
> The min value is "aavesh" and the max value is "shiva'" (please note the single quote inside the max value).
>
> When run, mapper 2 tried to execute the query below and failed:
> SELECT `ename`, `eid`, `deptid` FROM `emp` AS `emp` WHERE ( `ename` >= 'jd聯聭聪G耀' ) AND ( `ename` <= 'shiva'' )
>
> +Stack trace:+
> {quote}
> 2016-06-05 16:54:06,749 ERROR [main] org.apache.sqoop.mapreduce.db.DBRecordReader: Top level exception:
> com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''shiva'' )' at line 1
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
> at com.mysql.jdbc.Util.getInstance(Util.java:387)
> at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:942)
> at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3966)
> at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3902)
> at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2526)
> at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2673)
> at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2549)
> at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1861)
> at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1962)
> at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
> at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
> at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:553)
> at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
> at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> {quote}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
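A note for readers who hit the same error: the sketch below illustrates the kind of quoting this fix is about. The {{toSqlLiteral}} helper is hypothetical (it is not Sqoop's actual API); doubling an embedded single quote is the standard SQL escape, so the problematic upper bound from the report renders as a literal MySQL can parse.

{code:java}
// Hypothetical sketch: escape a split-boundary value before embedding it
// into a generated WHERE clause as a string literal.
public class SplitLiteralEscaper {

    /** Wrap a value in single quotes, doubling any embedded single quotes. */
    static String toSqlLiteral(String value) {
        return "'" + value.replace("'", "''") + "'";
    }

    public static void main(String[] args) {
        String lo = "aavesh";
        String hi = "shiva'";  // the max value from the report, with an embedded quote

        // Unescaped, the upper bound rendered as 'shiva'' and left the literal
        // unterminated; escaped, it renders as 'shiva''' which MySQL reads as shiva'.
        String where = "( `ename` >= " + toSqlLiteral(lo) + " ) AND "
                     + "( `ename` <= " + toSqlLiteral(hi) + " )";
        System.out.println("SELECT `ename`, `eid`, `deptid` FROM `emp` AS `emp` WHERE " + where);
    }
}
{code}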