I think you will have to write custom code to handle this.
Regards,
Shahab
On Tue, Jul 23, 2013 at 3:50 AM, Fatih Haltas fatih.hal...@nyu.edu wrote:
Those columns use the uint type. I tried to cast them via a Sqoop option, but it still gave the same error.
For other columns of type int, text, etc., I am able to import them, but I have hundreds of uint-type entries that I need.
While looking for solutions, I saw that Sqoop does not support the uint type; is that correct, or is there any update regarding the uint type?
Thank you all, especially Jarcec, you helped me a lot ;)
On Mon, Jul 22, 2013 at 7:04 PM, Jarek Jarcec Cecho jar...@apache.org wrote:
Hi Fatih,
per the JDBC documentation [1], the code stands for type OTHER, which
basically means unknown. As Sqoop does not know the type, it does not know
how to transfer it to Hadoop. Would you mind sharing your table definition?
A possible workaround is to use a query-based import and cast the
problematic columns to known and supported data types.
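As a rough sketch of that workaround: the schema, table, and column names below are taken from the error log in this thread, while the cast target (text), the --target-dir path, the placeholder other_column, and the single-mapper setting are all illustrative assumptions, not values anyone in the thread confirmed. Note that --query replaces import-all-tables, so each table would need its own command, and --direct cannot be combined with a free-form query.

```
# Hypothetical free-form query import: cast the unsupported uint
# column "ip" to text so Sqoop can map it to a Java type.
# "other_column" stands in for the table's remaining columns.
sqoop import \
  --connect jdbc:postgresql://192.168.194.158:5432/IMS \
  --username pgsql -P \
  --query 'SELECT CAST(t.ip AS text) AS ip, t.other_column FROM "LiveIPs"."2013-04-01" AS t WHERE $CONDITIONS' \
  --target-dir /user/hadoop/liveips/2013-04-01 \
  -m 1
```

The WHERE $CONDITIONS token is required by Sqoop's free-form query mode even with a single mapper; with more than one mapper a --split-by column would also be needed.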
Jarcec
Links:
1:
http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.OTHER
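For reference, the unresolved code Sqoop is reporting corresponds to java.sql.Types.OTHER, the JDBC constant for vendor-specific types (such as a PostgreSQL extension type like uint). A one-liner confirms its value:

```java
// Prints the JDBC type code that Sqoop cannot map to a Java type.
public class ShowOtherType {
    public static void main(String[] args) {
        // java.sql.Types.OTHER marks database-specific types that
        // JDBC (and therefore Sqoop's ClassWriter) does not understand.
        System.out.println(java.sql.Types.OTHER); // 1111
    }
}
```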
On Mon, Jul 22, 2013 at 04:03:42PM +0400, Fatih Haltas wrote:
Hi everyone,
I am trying to import data from PostgreSQL to HDFS but unfortunately I am
getting this error. What should I do?
I would be really obliged if you can help; I have been struggling for more
than 3 days.
---
Command that I used
---
[hadoop@ADUAE042-LAP-V ~]$ sqoop import-all-tables --direct --connect jdbc:postgresql://192.168.194.158:5432/IMS --username pgsql -P -- --schema LiveIPs
---
Result
---
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: $HADOOP_HOME is deprecated.
13/07/22 15:01:05 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/07/22 15:01:06 INFO manager.SqlManager: Using default fetchSize of 1000
13/07/22 15:01:06 INFO manager.PostgresqlManager: We will use schema LiveIPs
13/07/22 15:01:06 INFO tool.CodeGenTool: Beginning code generation
13/07/22 15:01:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM LiveIPs.2013-04-01 AS t LIMIT 1
13/07/22 15:01:06 ERROR orm.ClassWriter: Cannot resolve SQL type
13/07/22 15:01:06 ERROR orm.ClassWriter: Cannot resolve SQL type
13/07/22 15:01:06 ERROR orm.ClassWriter: No Java type for SQL type for column ip
13/07/22 15:01:06 ERROR orm.ClassWriter: No Java type for SQL type for column ip
13/07/22 15:01:06 ERROR orm.ClassWriter: No Java type for SQL type for column ip
13/07/22 15:01:06 ERROR orm.ClassWriter: No Java type for SQL type for column ip
13/07/22 15:01:06 ERROR orm.ClassWriter: No Java type for SQL type for column ip
13/07/22 15:01:06 ERROR orm.ClassWriter: No Java type for SQL type for column ip
13/07/22 15:01:06 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
at org.apache.sqoop.orm.ClassWriter.parseNullVal(ClassWriter.java:912)
at org.apache.sqoop.orm.ClassWriter.parseColumn(ClassWriter.java:937)
at org.apache.sqoop.orm.ClassWriter.generateParser(ClassWriter.java:1011)
at org.apache.sqoop.orm.ClassWriter.generateClassForColumns(ClassWriter.java:1342)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1153)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:82)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:390)
at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:64)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)