Jarek,

This is the second time :-) you are asking me to open a JIRA on Apache
that is already present on Cloudera. If you are saying that the Cloudera
site is no longer used:
https://issues.apache.org/jira/browse/SQOOP-390?focusedCommentId=13631487&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-13631487
then why haven't all of its issues been migrated in batch?

Thanks

On Mon, Apr 15, 2013 at 8:58 AM, Jarek Jarcec Cecho <[email protected]> wrote:

> Hi Ruslan,
> I'm afraid that Sqoop currently does not support arrays natively. The
> import case can be worked around by using the array_to_string function,
> but I'm not sure how to easily work around export. Would you mind
> opening a new JIRA on Apache JIRA [1] for that?
>
> Jarcec
>
> Links:
> 1: https://issues.apache.org/jira/browse/SQOOP
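For reference, a minimal sketch of that import workaround, using the
database, table, and rc_list column that appear in the log further down
this thread; the id split column is hypothetical, so substitute any
unique integer column of your table:

    sqoop import \
      --connect jdbc:postgresql://localhost/hadooppipeline \
      --username postgres -P \
      --query "SELECT id, array_to_string(rc_list, ',') AS rc_list FROM test_table WHERE \$CONDITIONS" \
      --split-by id \
      --target-dir /user/hadoop/test_table

With --query, Sqoop requires the literal $CONDITIONS token (escaped here
so the shell does not expand it) plus --target-dir, and a --split-by
column when running more than one mapper.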
> On Tue, Apr 09, 2013 at 08:40:54PM +0400, Ruslan Al-Fakikh wrote:
> > Hey guys,
> >
> > Sorry for raising this old question, but is there a workaround for
> > uploading data with arrays in fields? I have a table like this:
> >
> >   CREATE TABLE tablename
> >   (
> >     image_urls character varying(300)[]
> >   );
> >
> > and I am uploading from a file on HDFS. Basically I can change the
> > format of the file, but what form should it take for Sqoop to upload
> > it into this unsupported data type? Maybe there is a workaround.
> >
> > Also I saw this issue, but it is still unresolved:
> > https://issues.cloudera.org/browse/SQOOP-160
> >
> > Any help would be appreciated.
> >
> > On Fri, Aug 24, 2012 at 10:07 AM, Jarek Jarcec Cecho <[email protected]> wrote:
> > > You might consider utilizing PostgreSQL's array_to_string function
> > > to "join" the array into one string. You would have to change the
> > > import from --table to --query then, though.
> > >
> > > Jarcec
> > >
> > > On Fri, Aug 24, 2012 at 10:40:46AM +0530, Adarsh Sharma wrote:
> > > > Thanks Jarcec for the update. So Sqoop is not suitable for
> > > > shifting data from a DB to HDFS if some columns have integer[]
> > > > or bigint[] datatypes.
> > > >
> > > > Is there any way I can shift data having bigint[] datatypes from
> > > > a PostgreSQL DB to HDFS using Sqoop, or do I need to test another
> > > > tool like Talend etc.?
> > > >
> > > > Thanks
> > > >
> > > > On Thu, Aug 23, 2012 at 11:45 PM, Jarek Jarcec Cecho <[email protected]> wrote:
> > > > > Hi Adarsh,
> > > > > as far as I know, Sqoop should not have any issues with the
> > > > > bigint data type.
> > > > >
> > > > > Based on the provided log fragment, it seems that you're having
> > > > > issues with SQL type 2003, which should be ARRAY (see 1). I'm
> > > > > afraid that arrays are really not supported in Sqoop at the
> > > > > moment.
> > > > >
> > > > > Jarcec
> > > > >
> > > > > 1: http://docs.oracle.com/javase/1.5.0/docs/api/constant-values.html#java.sql.Types.ARRAY
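A quick way to confirm which columns are arrays (and therefore map to
SQL type 2003) is to ask PostgreSQL directly; information_schema reports
the data_type of such columns simply as ARRAY. The database and table
names here match the log below:

    psql -d hadooppipeline -c "SELECT column_name, data_type
      FROM information_schema.columns WHERE table_name = 'test_table';"

The four bigint columns will show up as bigint and import fine; only the
rows reported as ARRAY need a workaround.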
> > > > >
> > > > > On Thu, Aug 23, 2012 at 08:47:10PM +0530, Adarsh Sharma wrote:
> > > > > > Hi all,
> > > > > >
> > > > > > Please forgive me if I violate any rule by posting to this
> > > > > > mailing list. I am using Sqoop for some testing in my Hadoop
> > > > > > standalone setup.
> > > > > >
> > > > > > Hadoop version: 0.20.2-cdh3u5, 580d1d26c7ad6a7c6ba72950d8605e2c6fbc96cc
> > > > > > Sqoop version: Sqoop 1.4.1-incubating
> > > > > > Also tried: Sqoop 1.4.0-incubating
> > > > > > PostgreSQL version: edb-psql (9.0.4.14)
> > > > > >
> > > > > > I am able to export data from HDFS to PostgreSQL, but when I
> > > > > > try to import data from the DB to HDFS, the problem below
> > > > > > arises:
> > > > > >
> > > > > > hadoop@test123:~/project/sqoop-1.4.1-incubating__hadoop-0.20$ bin/sqoop import --connect jdbc:postgresql://localhost/hadooppipeline --table test_table --username postgres --password postgres
> > > > > > Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> > > > > > Please set $HBASE_HOME to the root of your HBase installation.
> > > > > > 12/08/23 19:25:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> > > > > > 12/08/23 19:25:19 INFO manager.SqlManager: Using default fetchSize of 1000
> > > > > > 12/08/23 19:25:19 INFO tool.CodeGenTool: Beginning code generation
> > > > > > 12/08/23 19:25:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "test_table" AS t LIMIT 1
> > > > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: No Java type for SQL type 2003 for column rc_list
> > > > > >
> > > > > > There are 4 bigint columns in the table. Please guide me on
> > > > > > whether Sqoop supports bigint columns or not.
> > > > > >
> > > > > > I did some R&D and found only one link, but was not able to
> > > > > > solve the issue:
> > > > > > https://issues.cloudera.org/browse/SQOOP-48?page=com.atlassian.jira.plugin.system.issuetabpanels%3Aall-tabpanel
> > > > > >
> > > > > > Thanks
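Since the export direction still has no native support, here is one
workaround sketch, in the spirit of the array_to_string trick on the
import side. All names are hypothetical, modeled on the
tablename/image_urls schema above: export the flat HDFS file into a
staging table with a plain text column, then rebuild the array
server-side with string_to_array:

    # 1) Staging table holding the joined string instead of the array:
    psql -d yourdb -c "CREATE TABLE tablename_staging (image_urls text);"

    # 2) Plain Sqoop export of the flat file, with the URLs joined by a
    #    delimiter such as '|' (commas can appear inside URLs):
    sqoop export \
      --connect jdbc:postgresql://localhost/yourdb \
      --username postgres -P \
      --table tablename_staging \
      --export-dir /user/hadoop/tablename

    # 3) Convert the joined string back into the array column:
    psql -d yourdb -c "INSERT INTO tablename (image_urls)
      SELECT string_to_array(image_urls, '|')::character varying(300)[]
      FROM tablename_staging;"

This is untested, but string_to_array is standard PostgreSQL and is the
exact inverse of array_to_string.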
