Hello,
I have some BLOB content in an Oracle database which I Sqoop out to HDFS as
external LOBs. When I try to read from the LOB file, it works for about 80% of
the records; for the others I see:

java.io.IOException: Reader has been closed.
        at org.apache.sqoop.io.LobFile$Reader.checkForNull(LobFile.java:330)
        at org.apache.sqoop.io.LobFile$V0Reader.seek(LobFile.java:1245)
        at org.apache.sqoop.lib.LobRef.getDataStream(LobRef.java:199)
        at org.apache.sqoop.lib.LobRef.getDataStream(LobRef.java:157)
        at com.trgr.platform.riptide.mapreduce.MyImporter$MyDBImporter.extractExternalBlobContent(MyImporter.java:210)


This is how I initialize the BlobRef:

String blobRefStr = new String(key.datum().getBODY().array());
BlobRef blobref = BlobRef.parse(blobRefStr);


if (blobref.isExternal()) {
    // line 210:
    try (InputStream is = blobref.getDataStream(context)) {


The generated Avro class has:

java.nio.ByteBuffer BODY;

Is there something I am doing wrong here?

The purpose of this MR job is to decompress the blob content.
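
For completeness, the relevant part of the mapper looks roughly like this
(trimmed for the mail; MyRecord is a stand-in for the Sqoop-generated Avro
class, and GZIPInputStream plus the byte counting are just placeholders for
the actual decompression and output handling):

import java.io.IOException;
import java.io.InputStream;
import java.util.zip.GZIPInputStream;

import org.apache.avro.mapred.AvroKey;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.sqoop.lib.BlobRef;

public class MyImporter {

  // MyRecord is a placeholder for the Sqoop-generated Avro class with the BODY field.
  public static class MyDBImporter
      extends Mapper<AvroKey<MyRecord>, NullWritable, Text, LongWritable> {

    @Override
    protected void map(AvroKey<MyRecord> key, NullWritable value, Context context)
        throws IOException, InterruptedException {
      extractExternalBlobContent(key, context);
    }

    private void extractExternalBlobContent(AvroKey<MyRecord> key, Context context)
        throws IOException, InterruptedException {
      // BODY holds the externalLob(...) pointer string Sqoop wrote for the column.
      String blobRefStr = new String(key.datum().getBODY().array());
      BlobRef blobref = BlobRef.parse(blobRefStr);

      if (blobref.isExternal()) {
        // line 210: open a stream over the external LobFile and decompress it.
        try (InputStream is = blobref.getDataStream(context);
             InputStream unzipped = new GZIPInputStream(is)) {  // codec is a placeholder
          byte[] buf = new byte[8192];
          long total = 0;
          int n;
          while ((n = unzipped.read(buf)) != -1) {
            total += n;  // the real job writes the decompressed bytes out; omitted here
          }
          context.write(new Text(blobRefStr), new LongWritable(total));
        }
      }
    }
  }
}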

Thanks,

Vishal
