Hey man,

I'm not sure this will solve your issue, but the code was changed around a
bit in Sqoop 1.4.6. Could you try upgrading?

-Abe

On Fri, Jul 17, 2015 at 3:19 AM, André Pinto <[email protected]>
wrote:

> I think the problem might be in:
>
> org.apache.sqoop.mapreduce.JdbcExportJob#configureInputFormat
>
> as the population of columnTypeInts, and consequently of columnTypes,
> never takes the --columns argument into account.
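> 
> A minimal sketch of the kind of filtering I'd expect there (hypothetical,
> not the actual Sqoop code; filterColumnTypes, 'manager', and 'columnNames'
> are names I made up for the ConnManager and the --columns list):
> 
>   // Hypothetical helper: restrict the table's type map to the columns the
>   // user asked for, so columnTypes ends up matching --columns.
>   // (Uses java.util.Map/LinkedHashMap and org.apache.sqoop.manager.ConnManager.)
>   private Map<String, Integer> filterColumnTypes(ConnManager manager,
>       String tableName, String[] columnNames) throws IOException {
>     Map<String, Integer> allTypes = manager.getColumnTypes(tableName);
>     Map<String, Integer> filtered = new LinkedHashMap<String, Integer>();
>     for (String col : columnNames) {
>       Integer sqlType = allTypes.get(col);
>       if (sqlType == null) {
>         throw new IOException("Column " + col + " not found in " + tableName);
>       }
>       filtered.put(col, sqlType);
>     }
>     return filtered;
>   }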
>
> On Fri, Jul 17, 2015 at 1:31 AM, André Pinto <[email protected]>
> wrote:
>
>> Hi,
>>
>> I'm calling sqoop export with the --columns argument in order to leave
>> out the table's surrogate key (which the database generates
>> automatically), like this:
>>
>> sqoop export --connect CONN_STR --username USER --password PASS
>> --export-dir HDFS_PATH --table TARGET_TABLE --columns ALL_BUT_ID_CSV
>> --validate
>>
>> but I was getting:
>>
>> Error: java.io.IOException: Cannot find field id in Avro schema
>>
>> I didn't understand why Sqoop needed the Avro schema to contain a field I
>> wasn't interested in exporting, just because the table happens to have
>> it. Still, I added a nullable field "id" to the Avro schema and tried
>> again.
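>> 
>> For reference, the field I added looks like this (assuming the surrogate
>> key is a bigint; the Avro type should match your column):
>> 
>>   {"name": "id", "type": ["null", "long"], "default": null}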
>>
>> The problem now occurs while setting the field "id" on the record
>> representation (SqoopRecord) that Sqoop generates (the code is generated
>> into the tmp compile folder and packaged into a jar named after the
>> table). I've checked the generated jar, and its void setField(String
>> paramString, Object paramObject) method unsurprisingly has no support for
>> setting the field "id", since that field is not in the list of columns I
>> pass to Sqoop. So I get this exception and the job breaks:
>>
>> 2015-07-16 15:48:34,602 FATAL [IPC Server handler 2 on 53110] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1436154125539_97746_m_000000_0 - exited : java.lang.RuntimeException: No such field: id
>>         at invalidproduct_tmp.setField(invalidproduct_tmp.java:409)
>>         at org.apache.sqoop.mapreduce.AvroExportMapper.toSqoopRecord(AvroExportMapper.java:120)
>>         at org.apache.sqoop.mapreduce.AvroExportMapper.map(AvroExportMapper.java:104)
>>         at org.apache.sqoop.mapreduce.AvroExportMapper.map(AvroExportMapper.java:49)
>>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
>>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
>>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
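>> 
>> For context, the setField method in the generated class looks roughly
>> like this (my reconstruction, not the actual generated source; the column
>> names here are placeholders). Only the columns passed via --columns get a
>> branch, so any other field name falls through to the exception:
>> 
>>   public void setField(String __name, Object __value) {
>>     if ("name".equals(__name)) {
>>       this.name = (String) __value;
>>     } else if ("price".equals(__name)) {
>>       this.price = (java.math.BigDecimal) __value;
>>     } else {
>>       // "id" is not among the generated branches, hence the failure above.
>>       throw new RuntimeException("No such field: " + __name);
>>     }
>>   }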
>>
>> Is this a bug in Sqoop, or am I doing something wrong here? How can I
>> select which columns to export without running into these problems?
>>
>> I'm using PostgreSQL for the database and
>> Sqoop 1.4.4.2.0.6.0-76
>> git commit id 6dae6dc8ed5473586f650f80816277854c7dd44a
>>
>> Thanks in advance,
>> André.
>>
>
>
