[ https://issues.apache.org/jira/browse/DERBY-6884?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Bryan Pendleton updated DERBY-6884:
-----------------------------------
    Attachment: DerbyIssue.java

The problem reproduces for me, just as described, using the
current head of trunk on Windows, with JDK 1.8.0_77-b03.

I attached a re-formatted version of the repro program as
"DerbyIssue.java"; it was easier for me to read and follow.

I also removed the explicit load of the Derby ClientDriver, which
appears to be unnecessary for this repro: the JDBC URL uses the
EmbeddedDriver, so the program can run with just derby.jar.
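
For what it's worth, with derby.jar on the classpath the JDBC 4
DriverManager auto-loads the embedded driver, so the connection can
be opened straight from the URL. A minimal sketch of that, using the
same URL as the repro (the class name here is just illustrative):

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class EmbeddedConnectCheck {
        public static void main(String[] args) throws Exception {
            // No Class.forName() needed: with derby.jar on the classpath the
            // JDBC 4 DriverManager discovers the embedded driver automatically.
            try (Connection con = DriverManager.getConnection("jdbc:derby:testdb;create=true")) {
                System.out.println("Connected to Derby " + con.getMetaData().getDatabaseProductVersion());
            }
        }
    }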

Also, to be clear: to run the repro program, you need to edit
the program text to replace the three dots in the next line with
the name of a valid file in your test directory.

    public static final String BLOB_DATA_FILE = "...";

I used a 75 MB PDF file that I happened to have sitting around.
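
For example, the edited line might end up looking like this (the
path below is purely illustrative; any reasonably large local file
will do):

    public static final String BLOB_DATA_FILE = "C:/temp/sample.pdf"; // illustrative path only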

The program cleverly loops, tracking the total size of the blobs
it has inserted, until it has written more than 2 GB of blob data,
so it doesn't really matter which file you use; you just have to
pick one.

It would be nice to figure out a clever way to build a smaller
repro, as this one takes several minutes to run on my system, but
for the purposes of demonstrating the bug the repro was great -- thanks!
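
For reference, the failure shown in the stack trace below comes down
to the blob offset string exceeding Integer.MAX_VALUE (2147483647),
so Integer.parseInt() throws where a long parse would succeed. A
standalone sketch of just that failure mode, using the value from the
report (this is not Derby code, only an illustration):

    public class OffsetParseDemo {
        public static void main(String[] args) {
            // Offset value taken from the reported exception message.
            String offset = "2147483770";
            // Parsing as long succeeds; the value fits easily in 64 bits.
            System.out.println("as long: " + Long.parseLong(offset));
            // Parsing as int throws java.lang.NumberFormatException, as in the
            // report, because the value is larger than Integer.MAX_VALUE.
            System.out.println("as int: " + Integer.parseInt(offset));
        }
    }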


> SYSCS_IMPORT_TABLE_LOBS_FROM_EXTFILE can't import more than Integer.MAX_VALUE 
> bytes of blob data
> ------------------------------------------------------------------------------------------------
>
>                 Key: DERBY-6884
>                 URL: https://issues.apache.org/jira/browse/DERBY-6884
>             Project: Derby
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 10.11.1.1
>            Reporter: Edward Howe
>         Attachments: DerbyIssue.java
>
>
> After using SYSCS_EXPORT_TABLE_LOBS_TO_EXTFILE to export a table containing a blob
> column, SYSCS_IMPORT_TABLE_LOBS_FROM_EXTFILE will fail with a
> NumberFormatException if the offset for a blob record is > Integer.MAX_VALUE.
> This is because ImportReadData.initExternalLobFile() parses the offset
> as an Integer.
> The stack trace and a program to reproduce the problem are below.
> java.lang.NumberFormatException: For input string: "2147483770"
>       at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65) ~[na:1.8.0_45]
>       at java.lang.Integer.parseInt(Integer.java:583) ~[na:1.8.0_45]
>       at java.lang.Integer.parseInt(Integer.java:615) ~[na:1.8.0_45]
>       at org.apache.derby.impl.load.ImportReadData.initExternalLobFile(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.load.ImportReadData.getBlobColumnFromExtFile(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.load.ImportAbstract.getBlob(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.load.Import.getBlob(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.iapi.types.SQLBlob.setValueFromResultSet(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.sql.execute.VTIResultSet.populateFromResultSet(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.sql.execute.VTIResultSet.getNextRowCore(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.sql.execute.ProjectRestrictResultSet.getNextRowCore(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.sql.execute.NormalizeResultSet.getNextRowCore(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.sql.execute.NoPutResultSetImpl.getNextRowFromRowSource(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.store.access.heap.HeapController.load(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.store.access.heap.Heap.load(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.store.access.RAMTransaction.loadConglomerate(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.store.access.RAMTransaction.recreateAndLoadConglomerate(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.sql.execute.InsertResultSet.bulkInsertCore(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.sql.execute.InsertResultSet.open(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source) ~[derby-10.11.1.1.jar:na]
>       ... 36 common frames omitted
> ==================================
> package blob;
>
> import java.io.BufferedInputStream;
> import java.io.File;
> import java.io.FileInputStream;
> import java.io.IOException;
> import java.sql.*;
>
> public final class DerbyIssue {
>     // derby url
>     public static final String DBURL = "jdbc:derby:testdb;create=true";
>     // any random binary file such as a large image or document
>     public static final String BLOB_DATA_FILE = "...";
>     public static final String EXPORT_TABLE_FILE = "table-data";
>     public static final String EXPORT_BLOB_FILE = "blob-data";
>
>     public static void main(String... args) throws Exception {
>         final DerbyIssue test = new DerbyIssue();
>         test.run();
>     }
>
>     public void run() throws Exception {
>         Class.forName("org.apache.derby.jdbc.ClientDriver").getConstructor().newInstance();
>         try (final Connection con = DriverManager.getConnection(DBURL)) {
>             try (final Statement stmt = con.createStatement()) {
>                 stmt.execute("CREATE TABLE TESTBLOB(id BIGINT, content BLOB)");
>             }
>             System.out.printf("inserting test data%n");
>             try (final PreparedStatement pstmt = con.prepareStatement(
>                     "INSERT INTO TESTBLOB (id, content) VALUES (?, ?)")) {
>                 long id = 1;
>                 long byteCount = 0;
>                 final File content = new File(BLOB_DATA_FILE);
>                 while (byteCount < Integer.MAX_VALUE) {
>                     insertBlob(pstmt, id, content);
>                     id++;
>                     byteCount += content.length();
>                     if (id % 100 == 0) {
>                         System.out.printf("%d%n", byteCount);
>                     }
>                 }
>                 insertBlob(pstmt, id, content);
>                 byteCount += content.length();
>                 System.out.printf("%d bytes written to testblob table%n", byteCount);
>             }
>             final File exportFile = new File(EXPORT_TABLE_FILE);
>             final File blobFile = new File(EXPORT_BLOB_FILE);
>             try (final CallableStatement stmt = con.prepareCall(
>                     "CALL SYSCS_UTIL.SYSCS_EXPORT_TABLE_LOBS_TO_EXTFILE (null, ?, ?, null, null, null, ?)")) {
>                 stmt.setString(1, "TESTBLOB");
>                 stmt.setString(2, exportFile.toString());
>                 stmt.setString(3, blobFile.toString());
>                 stmt.execute();
>             }
>             System.out.printf("testblob table exported%n");
>             try (final Statement stmt = con.createStatement()) {
>                 stmt.execute("TRUNCATE TABLE TESTBLOB");
>             }
>             System.out.printf("testblob table truncated%n");
>             try (final CallableStatement stmt = con.prepareCall(
>                     "CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE_LOBS_FROM_EXTFILE (null, ?, ?, null, null, null, 0)")) {
>                 stmt.setString(1, "TESTBLOB");
>                 stmt.setString(2, exportFile.toString());
>                 stmt.execute();
>             }
>             System.out.printf("testblob data imported%n");
>         }
>     }
>
>     private void insertBlob(PreparedStatement pstmt, long id, File content)
>             throws IOException, SQLException {
>         try (BufferedInputStream contentStream = new BufferedInputStream(new FileInputStream(content))) {
>             pstmt.setLong(1, id);
>             pstmt.setBinaryStream(2, contentStream);
>             pstmt.executeUpdate();
>         }
>     }
> }


