> On Feb. 23, 2017, 11:01 p.m., Attila Szabo wrote:
> > Hey Illya,
> > 
> > Thank you for your contribution. This improvement looks great from both 
> > a testing and an implementation POV.
> > I do hope we will see the same great Sqoop patches from you in the future 
> > as well!
> > 
> > Cheers,
> > Attila

Thank you!


- Illya


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/56770/#review166601
-----------------------------------------------------------


On Feb. 22, 2017, 10:01 p.m., Illya Yalovyy wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/56770/
> -----------------------------------------------------------
> 
> (Updated Feb. 22, 2017, 10:01 p.m.)
> 
> 
> Review request for Sqoop, Jarek Cecho, Attila Szabo, and Venkat Ranganathan.
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> -------
> 
> Currently, Sqoop assumes the default file system for all I/O operations. 
> This makes it hard to use other FileSystem implementations as a source or 
> destination.
> 
> https://issues.apache.org/jira/browse/SQOOP-3136
> 
> 
> Diffs
> -----
> 
>   src/java/com/cloudera/sqoop/io/LobReaderCache.java 3394296 
>   src/java/org/apache/sqoop/hive/HiveImport.java 4828375 
>   src/java/org/apache/sqoop/hive/TableDefWriter.java c9962e9 
>   src/java/org/apache/sqoop/io/LobReaderCache.java bd75374 
>   src/java/org/apache/sqoop/io/SplittingOutputStream.java 5f98192 
>   src/java/org/apache/sqoop/lib/LargeObjectLoader.java 70c0f4e 
>   src/java/org/apache/sqoop/manager/oracle/OraOopUtilities.java e81588c 
>   src/java/org/apache/sqoop/mapreduce/CombineFileInputFormat.java e08f997 
>   src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java 260bc29 
>   src/java/org/apache/sqoop/mapreduce/ExportJobBase.java 27f84da 
>   src/java/org/apache/sqoop/mapreduce/HBaseBulkImportJob.java b32cdd1 
>   src/java/org/apache/sqoop/mapreduce/JdbcExportJob.java 626119b 
>   src/java/org/apache/sqoop/mapreduce/JdbcUpdateExportJob.java f911280 
>   src/java/org/apache/sqoop/mapreduce/MergeJob.java 5b6c4df 
>   src/java/org/apache/sqoop/tool/ImportTool.java 258ef79 
>   src/java/org/apache/sqoop/util/FileSystemUtil.java PRE-CREATION 
>   src/java/org/apache/sqoop/util/FileUploader.java 155cffc 
>   src/test/org/apache/sqoop/util/TestFileSystemUtil.java PRE-CREATION 
> 
> Diff: https://reviews.apache.org/r/56770/diff/
> 
> 
> Testing
> -------
> 
> ** Build:
> ant clean package
> ...
> BUILD SUCCESSFUL
> Total time: 51 seconds
> 
> 
> ** Test:
> ant test
> ...
> BUILD SUCCESSFUL
> Total time: 7 minutes 21 seconds
> 
> * On Hadoop Cluster:
> 
> ** original version of sqoop:
> sqoop import --connect <JDBC URL> --table table1 --driver <JDBC DRIVER> 
> --username root --password **** --delete-target-dir --target-dir 
> s3a://some-bucket/tmp/sqoop
> ...
> 17/02/15 19:16:59 ERROR tool.ImportTool: Imported Failed: Wrong FS: 
> s3a://some-bucket/tmp/sqoop, expected: hdfs://<DNS>:8020
> 
> ** updated version of sqoop:
> sqoop import --connect <JDBC URL> --table table1 --driver <JDBC DRIVER> 
> --username root --password **** --delete-target-dir --target-dir 
> s3a://some-bucket/tmp/sqoop
> ...
> 17/02/15 22:24:42 INFO mapreduce.Job: Running job: job_1487183144282_0004
> 17/02/15 22:24:52 INFO mapreduce.Job: Job job_1487183144282_0004 running in 
> uber mode : false
> 17/02/15 22:24:52 INFO mapreduce.Job:  map 0% reduce 0%
> 17/02/15 22:25:04 INFO mapreduce.Job:  map 25% reduce 0%
> 17/02/15 22:25:06 INFO mapreduce.Job:  map 50% reduce 0%
> 17/02/15 22:25:07 INFO mapreduce.Job:  map 75% reduce 0%
> 17/02/15 22:25:08 INFO mapreduce.Job:  map 100% reduce 0%
> 17/02/15 22:25:08 INFO mapreduce.Job: Job job_1487183144282_0004 completed 
> successfully
> 17/02/15 22:25:08 INFO mapreduce.Job: Counters: 36
> ...
> 17/02/15 22:25:08 INFO mapreduce.ImportJobBase: Retrieved 4993 records.
> 
> 
> Thanks,
> 
> Illya Yalovyy
> 
>
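The core of the change described above is to qualify each path against the file system derived from its own URI (in Hadoop terms, `path.getFileSystem(conf)`) rather than the cluster default (`FileSystem.get(conf)`), so that an `s3a://` target no longer trips the "Wrong FS" check. As a minimal, dependency-free sketch of that scheme-dispatch idea — `effectiveScheme` is a hypothetical helper for illustration, not part of the actual FileSystemUtil patch:

```java
import java.net.URI;

public class SchemeCheck {
    // Returns the scheme a path should be resolved against:
    // the path's own scheme when it is fully qualified,
    // otherwise the configured default (e.g. "hdfs").
    static String effectiveScheme(String path, String defaultScheme) {
        String scheme = URI.create(path).getScheme();
        return scheme != null ? scheme : defaultScheme;
    }

    public static void main(String[] args) {
        // A fully qualified s3a path keeps its own scheme...
        System.out.println(effectiveScheme("s3a://some-bucket/tmp/sqoop", "hdfs"));
        // ...while a bare path falls back to the cluster default.
        System.out.println(effectiveScheme("/tmp/sqoop", "hdfs"));
    }
}
```

The pre-patch failure mode corresponds to always using the default scheme; the fix consults the path first.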