-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review57536
-----------------------------------------------------------
Love it, overall a good start. Please see if we can add tests for the initializer/destroyer code as well. There is logic in them, just as in the loader/executor!


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java
<https://reviews.apache.org/r/26963/#comment98267>

    Heads up: this will need to change once 1551 is committed. Jarcec is still reviewing it.


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java
<https://reviews.apache.org/r/26963/#comment98268>

    Please use the code in util for this. Are we sure we don't need any tests for this? I created a ticket, since I don't see tests for the upgrade logic of any of the existing connectors. If you are up for it: https://issues.apache.org/jira/browse/SQOOP-1595


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98269>

    Can we call it KiteToConnector? I saw this convention in HBase and HDFS.


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98270>

    Will the initializer/destroyer be different for FROM? What is the plan for FROM? If I understand correctly, this will not have a FROM, will it? If that is the case, then why not call this KiteToConnector? Also, please add javadoc with notes on what this connector does and does not support:

    * Destination: HDFS
    * File Format: Avro, Parquet and CSV
    * Compression Codec: Use default
    * Partitioner Strategy: Not supported
    * Column Mapping: Not supported


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98272>

    Instead of an exception, we could use the NullConfiguration class in this case, to say that this connector does not support FROM. Just a thought.
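To make the NullConfiguration suggestion above concrete, a rough sketch follows. Note this is purely illustrative: NullConfiguration is the reviewer's proposal rather than a confirmed Sqoop class, and the configuration classes below are stand-ins, not the actual connector code.

```java
// Hypothetical sketch: return an empty marker configuration class for an
// unsupported direction instead of throwing an exception. All class names
// here are illustrative stand-ins.
public class NullConfigSketch {

    public enum Direction { FROM, TO }

    /** Empty marker configuration meaning "this direction is not supported". */
    public static final class NullConfiguration { }

    /** Stand-in for the connector's real TO-side job configuration. */
    public static final class ToJobConfiguration { }

    public static Class<?> getJobConfigurationClass(Direction direction) {
        switch (direction) {
            case TO:
                return ToJobConfiguration.class;
            case FROM:
                // FROM is not supported by this connector; signal that with
                // the empty marker class rather than an exception.
                return NullConfiguration.class;
        }
        throw new IllegalStateException("Unknown direction: " + direction);
    }

    public static void main(String[] args) {
        System.out.println(getJobConfigurationClass(Direction.FROM).getSimpleName());
    }
}
```

Callers could then test for NullConfiguration to discover that a direction is unsupported, instead of wrapping the call in a try/catch.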
connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98271>

    Does a default case even make sense here? We have an enum for direction and it is already typed, so why would the default case ever be hit?

    public enum Direction { FROM, TO }


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98274>

    Heads up: this will change.


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java
<https://reviews.apache.org/r/26963/#comment98276>

    What does "destination" mean? Can we just say TO?


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java
<https://reviews.apache.org/r/26963/#comment98282>

    Can we just make the constants an interface? It is pretty much a standard pattern to use if we need to extend some common constants in a sub-entity. Why do we need a constructor for a class that holds static final strings? Might be another ticket, but it is a one-line change plus two lines to delete:

    public interface Constants {
      /**
       * All job related configuration is prefixed with this:
       * <tt>org.apache.sqoop.job.</tt>
       */
      public static final String PREFIX_CONFIG = "org.apache.sqoop.job.";

      public static final String JOB_ETL_NUMBER_PARTITIONS = PREFIX_CONFIG + "etl.number.partitions";
      public static final String JOB_ETL_FIELD_NAMES = PREFIX_CONFIG + "etl.field.names";
      public static final String JOB_ETL_OUTPUT_DIRECTORY = PREFIX_CONFIG + "etl.output.directory";
      public static final String JOB_ETL_INPUT_DIRECTORY = PREFIX_CONFIG + "etl.input.directory";
    }


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java
<https://reviews.apache.org/r/26963/#comment98296>

    Q: Isn't the target here assumed to always be HDFS? Can this be used to write to anything else?
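The "extend common constants in a sub-entity" part of comment #98282 above might look roughly like the sketch below. The names are illustrative only, not the actual Sqoop or Kite connector classes; the point is that a sub-entity interface inherits the shared constants with no constructor to suppress.

```java
// Sketch of the interface-constants pattern: a sub-entity interface extends
// a common constants interface and adds its own entries. Names are made up
// for illustration.
public class ConstantsPatternSketch {

    public interface CommonConstants {
        // Interface fields are implicitly public static final.
        String PREFIX_CONFIG = "org.apache.sqoop.job.";
    }

    // The sub-entity picks up PREFIX_CONFIG by extension; no private
    // constructor boilerplate is needed anywhere.
    public interface KiteLikeConstants extends CommonConstants {
        String HDFS_URI = PREFIX_CONFIG + "kite.hdfs.uri";
    }

    public static void main(String[] args) {
        System.out.println(KiteLikeConstants.HDFS_URI);
    }
}
```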
connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java
<https://reviews.apache.org/r/26963/#comment98297>

    Can we add a few comments on why this is done in this step? It is unusual to be doing this kind of logic in a destroyer.


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java
<https://reviews.apache.org/r/26963/#comment98298>

    Why can't we return null?


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java
<https://reviews.apache.org/r/26963/#comment98299>

    Can we please move this class to the SDK and reuse it in all connectors? Also, nitpick: please rename link to linkConfig. Link has a very different meaning in Sqoop.


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java
<https://reviews.apache.org/r/26963/#comment98300>

    Can we be more specific about the formats supported for HDFS via the Kite SDK? This will evolve as the Kite SDK evolves with newer formats. Also, why can't we use an enum class from Kite itself, if there is one?


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java
<https://reviews.apache.org/r/26963/#comment98301>

    Please add a ticket for feature TODOs and add the number here, if you intend to address this later.
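One way to read the ToFormat comment (#98300) above is an enum whose constants carry the format name the Kite SDK expects, so that supporting a newer Kite format becomes a one-constant addition. This is a hedged sketch under that assumption; the enum and method names are illustrative, not the connector's actual code.

```java
// Illustrative sketch: tie each TO-side format to the name Kite knows it by,
// keeping the mapping in one place as Kite grows new formats.
public class ToFormatSketch {

    public enum ToFormat {
        AVRO("avro"),
        PARQUET("parquet"),
        CSV("csv");

        private final String kiteName;

        ToFormat(String kiteName) {
            this.kiteName = kiteName;
        }

        /** The name this format goes by when talking to the Kite SDK. */
        public String kiteName() {
            return kiteName;
        }
    }

    public static void main(String[] args) {
        System.out.println(ToFormat.PARQUET.kiteName());
    }
}
```

If Kite itself exposes a suitable format enum, reusing it directly (as the comment suggests) would be simpler still; this sketch is the fallback when the connector needs its own type.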
connector/connector-kite/src/main/resources/kite-connector-config.properties
<https://reviews.apache.org/r/26963/#comment98302>

    Nitpick: rename to connector configs.


connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestExecutor.java
<https://reviews.apache.org/r/26963/#comment98303>

    testKiteExecutor: please rename the test class to match the class it is testing, for consistency.


connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestLoader.java
<https://reviews.apache.org/r/26963/#comment98304>

    Ditto as above.


connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestLoader.java
<https://reviews.apache.org/r/26963/#comment98305>

    Nitpick: please do not use "job" and "link" in cases where it really means configs. They are different.


- Veena Basavaraj


On Oct. 20, 2014, 6:58 p.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 20, 2014, 6:58 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
