I suspected something like this, though I wasn't sure how it would behave on FB.
Different DBs handle it differently, but the issue is that a CLOB cannot be
"materialized" without access to the Connection that read it. There is
rudimentary transaction support in DFLib that should address this. You can
modify your "load" code as follows:
DataFrame data = Tx.newTransaction(fb).call(c -> {
    DataFrame df = c.tableLoader(currentTable).load();
    return /* convert clob */ df;
});
Here we are wrapping the FB connector invocation and the Clob resolution in a
single transaction, so hopefully the CLOB can be extracted error-free.
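As an aside, the _convertClob helper further down this thread can be tightened a bit: its Connection parameter is never used, and try-with-resources closes the reader even on failure. Here is a minimal, DB-free sketch (the class name, helper names, and the use of the JDK's in-memory SerialClob for the demo are mine, not anything FrontBase- or DFLib-specific):

```java
import java.io.BufferedReader;
import java.sql.Clob;
import javax.sql.rowset.serial.SerialClob;

public class ClobDemo {

    // Reads a CLOB's full character stream into a String. Same behavior as
    // the _convertClob method below, minus the unused Connection parameter.
    static String clobToString(Clob clob) {
        StringBuilder out = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(clob.getCharacterStream())) {
            int c;
            while ((c = reader.read()) != -1) {
                out.append((char) c);
            }
        } catch (Exception e) {
            // mirrors the original method: log and return what was read so far
            e.printStackTrace();
        }
        return out.toString();
    }

    // SerialClob is an in-memory Clob shipped with the JDK, handy for
    // exercising the conversion without any database.
    static Clob demoClob(String s) {
        try {
            return new SerialClob(s.toCharArray());
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(clobToString(demoClob("Nouveau Book"))); // prints "Nouveau Book"
    }
}
```

Whatever the conversion looks like, the important part is to run it inside the Tx.call() lambda, while the connection that produced the CLOB is still open.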
Andrus
> On Apr 20, 2021, at 7:40 AM, Jérémy DE ROYER <[email protected]>
> wrote:
>
> Hi,
>
> I updated my code to use data.convertColumn, but an IOException occurs as soon
> as I try to read a FrontBase CLOB (as if the connection were closed? Even
> though it’s not).
>
> Any idea? I posted the code below in case it helps.
>
> Apart from that, it’s awesome and so simple.
>
> I remember this library from Frankfurt, but I thought it was mainly for
> statistics purposes.
>
> Jérémy
>
>
> Java Code
> ==
>
>
> main.java
> ==
>
> JdbcConnector fb =
> Jdbc.connector("jdbc:FrontBase://localhost/TestIndeXys_v30").driver(FBJDriver.class.getName()).userName("frontbase").build();
> JdbcConnector pg =
> Jdbc.connector("jdbc:postgresql://localhost:5432/TestIndeXys_v30").driver(Driver.class.getName()).userName("postgres").password("postgres").build();
>
> for (String currentTable : _tables)
> {
>     DataFrame data = fb.tableLoader(currentTable).load();
>
>     if (Arrays.asList(data.getColumnsIndex().getLabels()).contains("CONTENT"))
>     {
>         data = data.convertColumn("CONTENT", (Clob c) -> _convertClob(fb.getConnection(), c));
>     }
>
>     pg.tableSaver(currentTable).mergeByPk().save(data);
>
>     System.out.println("La table " + currentTable + " a été recopiée");
> }
>
>
> additional method
> ==
>
> private static String _convertClob(Connection connectionIn, Clob clobIn) {
>
>     StringBuffer targetStringBuffer = new StringBuffer();
>
>     try {
>         System.out.println("clobIn >" + clobIn.length());
>         System.out.println("connectionIn.isClosed() >" + connectionIn.isClosed());
>         // System.out.println("clobIn.getSubString() >" + clobIn.getSubString(1L, (int) clobIn.length()));
>         //
>         // FBJInputStream
>         // FBJClob
>
>         final Reader reader = clobIn.getCharacterStream();
>         final BufferedReader bufferedReader = new BufferedReader(reader);
>
>         int intValueOfChar;
>
>         while (-1 != (intValueOfChar = bufferedReader.read())) {
>             targetStringBuffer.append((char) intValueOfChar);
>         }
>
>         bufferedReader.close();
>     }
>     catch (Exception e) {
>         e.printStackTrace();
>     }
>
>     return targetStringBuffer.toString();
> }
>
>
> Exception
> ==
>
> clobIn >0
> connectionIn.isClosed() >false
> clobIn >170
> connectionIn.isClosed() >false
> java.io.IOException
> at com.frontbase.jdbc.FBJInputStream.read(Unknown Source)
> at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
> at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
> at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
> at java.io.InputStreamReader.read(InputStreamReader.java:184)
> at java.io.BufferedReader.fill(BufferedReader.java:161)
> at java.io.BufferedReader.read(BufferedReader.java:182)
> at your.app.SellAndPepperDatabaseMigration._convertClob(SellAndPepperDatabaseMigration.java:106)
> at your.app.SellAndPepperDatabaseMigration.lambda$0(SellAndPepperDatabaseMigration.java:75)
> at com.nhl.dflib.series.ColumnMappedSeries.doMaterialize(ColumnMappedSeries.java:50)
> at com.nhl.dflib.series.ColumnMappedSeries.materialize(ColumnMappedSeries.java:38)
> at com.nhl.dflib.series.ColumnMappedSeries.get(ColumnMappedSeries.java:25)
> at com.nhl.dflib.series.IndexedSeries.doMaterialize(IndexedSeries.java:62)
> at com.nhl.dflib.series.IndexedSeries.materialize(IndexedSeries.java:44)
> at com.nhl.dflib.series.IndexedSeries.get(IndexedSeries.java:31)
> at com.nhl.dflib.series.ObjectSeries.eq(ObjectSeries.java:260)
> at com.nhl.dflib.ColumnDataFrame.eq(ColumnDataFrame.java:592)
> at com.nhl.dflib.jdbc.connector.saver.SaveViaUpsert.update(SaveViaUpsert.java:122)
> at com.nhl.dflib.jdbc.connector.saver.SaveViaUpsert.doSave(SaveViaUpsert.java:98)
> at com.nhl.dflib.jdbc.connector.saver.SaveViaUpsert.doSave(SaveViaUpsert.java:37)
> at com.nhl.dflib.jdbc.connector.saver.TableSaveStrategy.lambda$save$1(TableSaveStrategy.java:36)
> at com.nhl.dflib.jdbc.connector.tx.Tx.call(Tx.java:56)
> at com.nhl.dflib.jdbc.connector.saver.TableSaveStrategy.save(TableSaveStrategy.java:36)
> at com.nhl.dflib.jdbc.connector.TableSaver.save(TableSaver.java:89)
> at your.app.SellAndPepperDatabaseMigration.main(SellAndPepperDatabaseMigration.java:78)
>
>
> FBJInputStream.read()
> ==
>
> public int read() throws IOException {
> if (this.available <= 0L) {
> return -1;
> } else {
> if (this.buffer == null) {
> this.initBuffer();
> }
>
> if (this.read == 0) {
> try {
> this.loadBuffer(this.nextOffset);
> } catch (Exception var2) {
> throw new IOException(var2.getMessage());
> }
> }
>
> --this.read;
> --this.available;
> return this.buffer[this.position++] & 255;
> }
> }
>
>
> On Apr 19, 2021, at 10:57 PM, Andrus Adamchik <[email protected]> wrote:
>
> The "power of DFLib" should help with this too :) Its main focus is
> manipulating data in DataFrame columns after all, so you can adapt the data
> you got from FB into something that can be saved to PG. E.g. [1]:
>
> data = data.convertColumn("col_name", (Clob c) -> convertClob(c));
>
> // implement this function
> String convertClob(Clob c) { ... }
>
> Andrus
>
> [1] https://nhl.github.io/dflib-docs/#changing-column-type
>
>
>
> On Apr 19, 2021, at 4:37 PM, Jérémy DE ROYER <[email protected]> wrote:
>
> Hi Andrus,
>
> I was just testing the schema migration script, so why not give it a try?
>
> After editing the transfer line to
>
> pg.tableSaver(table).mergeByPk().save(data);
>
> to avoid duplication, as explained in your doc...
>
> It works really great 👍
>
> The only problem (there is always a problem) occurred when converting a BLOB
> (FrontbaseSQL) to TEXT (PostgreSQL). It outputs the error below.
>
> Any simple idea ?
>
> Jérémy
>
> java.lang.RuntimeException: java.lang.RuntimeException: Error updating data in DB: Impossible de convertir une instance de type com.frontbase.jdbc.FBJClob vers le type String
> at com.nhl.dflib.jdbc.connector.tx.Tx.call(Tx.java:63)
> at com.nhl.dflib.jdbc.connector.saver.TableSaveStrategy.save(TableSaveStrategy.java:36)
> at com.nhl.dflib.jdbc.connector.TableSaver.save(TableSaver.java:89)
> at your.app.SellAndPepperDatabaseMigration.main(SellAndPepperDatabaseMigration.java:57)
> Caused by: java.lang.RuntimeException: Error updating data in DB: Impossible de convertir une instance de type com.frontbase.jdbc.FBJClob vers le type String
> at com.nhl.dflib.jdbc.connector.StatementBuilder.update(StatementBuilder.java:110)
> at com.nhl.dflib.jdbc.connector.saver.SaveViaInsert.doSave(SaveViaInsert.java:39)
> at com.nhl.dflib.jdbc.connector.saver.SaveViaUpsert.insert(SaveViaUpsert.java:105)
> at com.nhl.dflib.jdbc.connector.saver.SaveViaUpsert.doSave(SaveViaUpsert.java:52)
> at com.nhl.dflib.jdbc.connector.saver.SaveViaUpsert.doSave(SaveViaUpsert.java:37)
> at com.nhl.dflib.jdbc.connector.saver.TableSaveStrategy.lambda$save$1(TableSaveStrategy.java:36)
> at com.nhl.dflib.jdbc.connector.tx.Tx.call(Tx.java:56)
> ... 3 more
> Caused by: org.postgresql.util.PSQLException: Impossible de convertir une instance de type com.frontbase.jdbc.FBJClob vers le type String
> at org.postgresql.jdbc.PgPreparedStatement.cannotCastException(PgPreparedStatement.java:929)
> at org.postgresql.jdbc.PgPreparedStatement.castToString(PgPreparedStatement.java:918)
> at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:574)
> at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:935)
> at com.nhl.dflib.jdbc.connector.statement.DefaultColumnBinder.bind(DefaultColumnBinder.java:37)
> at com.nhl.dflib.jdbc.connector.statement.StatementBinder.bind(StatementBinder.java:38)
> at com.nhl.dflib.jdbc.connector.statement.UpdateStatementBatch.update(UpdateStatementBatch.java:40)
> at com.nhl.dflib.jdbc.connector.StatementBuilder.update(StatementBuilder.java:108)
> ... 9 more
> Caused by: java.sql.SQLException
> at com.frontbase.jdbc.FBJConnection.readLOB(Unknown Source)
> at com.frontbase.jdbc.FBJClob.getSubString(Unknown Source)
> at org.postgresql.jdbc.PgPreparedStatement.asString(PgPreparedStatement.java:738)
> at org.postgresql.jdbc.PgPreparedStatement.castToString(PgPreparedStatement.java:912)
> ... 15 more
>
>
>
>
> On Apr 19, 2021, at 8:20 PM, Andrus Adamchik <[email protected]> wrote:
>
> LinkMove will definitely work, but it targets ongoing DB synchronization
> rather than a one-off migration. I would personally deal with it in a more
> low-tech way. Since you only need to do it once, the simplest tool is the
> best. I would use DFLib library [1] to read data from one DB and write to
> another. It was also presented in Frankfurt, and has evolved a lot since
> then. Here is an example:
>
> JdbcConnector fb =
> Jdbc.connector("frontbase_url").username(..).password(..).build();
> JdbcConnector pg = Jdbc.connector("pg_url").username(..).password(..).build();
>
> for(String table : tables) {
> DataFrame data = fb.tableLoader(table).load();
> pg.tableSaver(table).save(data);
> }
> As you can see, it can literally be done in a few lines of code and requires
> no model.
> Andrus
> [1] https://nhl.github.io/dflib-docs/#jdbc
>
> On Apr 16, 2021, at 6:00 AM, Michael Gentry <[email protected]> wrote:
>
> Hi Jérémy,
>
> I've never used it, but Andrus (and others) have a project called LinkMove
> which may work:
>
> https://github.com/nhl/link-move
>
> I don't see a way to re-use a Cayenne model as the source AND the target,
> but perhaps that is also an option (instead of the XML source).
>
>
> On Fri, Apr 16, 2021 at 4:14 AM Jérémy DE ROYER <[email protected]> wrote:
>
> Sorry, please read
>
> « Migrate from FrontbaseSQL to PostgreSQL » 😄
>
> Jérémy
>
> On Apr 16, 2021, at 10:04 AM, Jérémy DE ROYER <[email protected]> wrote:
>
> Hi Hugi,
>
> Unfortunately, I think you’re right.
>
> I wanted to leave FrontbaseSQL too, but as Andrus says, the meeting in
> Frankfurt helped me resolve some cases, so in the end we kept our software
> on FrontbaseSQL.
>
> Based on your experience, is there an easy way to migrate/transfer
> from PostgreSQL to FrontBaseSQL?
>
> It could answer developers’ requests for programming tools around the
> database, which FrontbaseSQL does not offer.
>
> Thanks for this insight,
>
> Jérémy
>
> On Apr 16, 2021, at 9:53 AM, Hugi Thordarson <[email protected]> wrote:
>
> Hi Jérémy,
>
> I believe you hit a bug in the FrontBase JDBC driver:
> https://issues.apache.org/jira/browse/CAY-2574
>
> Andrus wrote to FrontBase about this after our Frankfurt meetup:
>
> https://mail-archives.apache.org/mod_mbox/cayenne-dev/201905.mbox/%3C52EC2486-4736-4854-AE49-A7CF77904E52%40objectstyle.org%3E
>
> I haven't seen anything from FB about this since that thread; perhaps
> you should check with their support?
> I also wrote to them last year about their JDBC driver not working on
> more recent Java versions, but don't think they've fixed it. I ended up
> migrating that one last FrontBase app to Postgres.
>
> Cheers,
> - hugi
>
>
>
> On 15 Apr 2021, at 20:59, Jérémy DE ROYER <[email protected]>
> wrote:
>
> Hi John,
>
> I know that FrontbaseSQL is not commonly used, but all our apps are
> based on it.
>
> I’ve just:
> - created the model explained in the doc
> https://cayenne.apache.org/docs/3.0/modeling-single-table-inheritance.html
> - written the code below inside the ‘Application’ class of a simple woapp
>
> But yes, maybe Cayenne is not meant to work with FrontbaseSQL… since it
> works well with PostgreSQL.
>
> Thanks for your answer, and have a nice day,
>
> Jérémy
>
> On Apr 15, 2021, at 10:50 PM, John Huss <[email protected]> wrote:
>
> I don't think FrontBase is very commonly used. I would recommend
> submitting
> a minimal example of the problem and someone should be able to fix the
> issue then.
>
> On Thu, Apr 15, 2021 at 3:44 PM Jérémy DE ROYER <[email protected]> wrote:
>
> Hello,
>
> I switched my model from FrontBase to PostgreSQL and it works great.
>
> Does anyone use Cayenne 4.1 with FrontbaseSQL successfully?
>
> Is there something specific to know?
>
> Thanks for any tip,
>
> Jérémy
>
> On Apr 14, 2021, at 11:05 PM, Jérémy DE ROYER <[email protected]> wrote:
>
> Hi all,
>
> I’m trying to find a solution for the (temporary?) lack of horizontal
> inheritance, so I modeled vertical inheritance following:
>
> https://cayenne.apache.org/docs/3.0/modeling-vertical-inheritance.html
>
> using a Frontbase database.
>
> I did:
> - create the model,
> - generate the database,
> - fix missing jars and other details like username, password, and the JDBC
> driver
>
> The demo code below starts well and connects to the database...
>
> try {
> ServerRuntime cayenneRuntime = ServerRuntime.builder()
> .addConfig("cayenne-CayenneTest.xml")
> .build();
>
> ObjectContext context = cayenneRuntime.newContext();
>
> Book _newBook = context.newObject(Book.class);
> _newBook.setTitle("Nouveau Book");
> _newBook.setCreated(new Date());
>
> context.commitChanges();
>
> EBook _newEBook = context.newObject(EBook.class);
> _newEBook.setTitle("Nouveau EBook");
> _newEBook.setCreated(new Date());
> _newEBook.setDownloadUrl("https://www.google.fr/");
>
> context.commitChanges();
>
> PaperBook _newPaperBook = context.newObject(PaperBook.class);
> _newPaperBook.setTitle("Nouveau PaperBook");
> _newPaperBook.setCreated(new Date());
> _newPaperBook.setShippingWeight(1000);
>
> context.commitChanges();
>
> List<Book> _books = ObjectSelect.query(Book.class).select(context);
>
> System.out.println("** ALL **");
>
> for (Book currentBook : _books)
> {
> System.out.println(currentBook.getTitle() + " created on " + currentBook.getCreated());
> }
> }
> catch (Exception e) {
> System.out.println("An error occured : " + e.getMessage());
> e.printStackTrace();
> }
>
> …but throws this exception
>
> An error occured : [v.4.1 Jul 14 2020 10:40:45] Commit Exception
> org.apache.cayenne.CayenneRuntimeException: [v.4.1 Jul 14 2020 10:40:45] Commit Exception
> at org.apache.cayenne.access.DataContext.flushToParent(DataContext.java:774)
> at org.apache.cayenne.access.DataContext.commitChanges(DataContext.java:691)
> at your.app.Application.<init>(Application.java:50)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at java.lang.Class.newInstance(Class.java:442)
> at com.webobjects.appserver.WOApplication.main(WOApplication.java:547)
> at your.app.Application.main(Application.java:25)
> Caused by: java.lang.NullPointerException
> at org.apache.cayenne.access.jdbc.SQLTemplateAction.execute(SQLTemplateAction.java:242)
> at org.apache.cayenne.access.jdbc.SQLTemplateAction.runWithNamedParametersBatch(SQLTemplateAction.java:179)
> at org.apache.cayenne.access.jdbc.SQLTemplateAction.performAction(SQLTemplateAction.java:111)
> at org.apache.cayenne.access.DataNodeQueryAction.runQuery(DataNodeQueryAction.java:97)
> at org.apache.cayenne.access.DataNode.performQueries(DataNode.java:273)
> at org.apache.cayenne.dba.frontbase.FrontBasePkGenerator.longPkFromDatabase(FrontBasePkGenerator.java:143)
> at org.apache.cayenne.dba.JdbcPkGenerator.generatePk(JdbcPkGenerator.java:220)
> at org.apache.cayenne.access.DataDomainInsertBucket.createPermIds(DataDomainInsertBucket.java:168)
> at org.apache.cayenne.access.DataDomainInsertBucket.appendQueriesInternal(DataDomainInsertBucket.java:76)
> at org.apache.cayenne.access.DataDomainSyncBucket.appendQueries(DataDomainSyncBucket.java:78)
> at org.apache.cayenne.access.DataDomainFlushAction.preprocess(DataDomainFlushAction.java:185)
> at org.apache.cayenne.access.DataDomainFlushAction.flush(DataDomainFlushAction.java:143)
> at org.apache.cayenne.access.DataDomain.onSyncFlush(DataDomain.java:624)
> at org.apache.cayenne.access.DataDomain.onSyncNoFilters(DataDomain.java:594)
> at org.apache.cayenne.access.DataDomain$DataDomainSyncFilterChain.onSync(DataDomain.java:822)
> at org.apache.cayenne.tx.TransactionFilter.lambda$onSync$0(TransactionFilter.java:61)
> at org.apache.cayenne.tx.DefaultTransactionManager$BaseTransactionHandler.performInTransaction(DefaultTransactionManager.java:183)
> at org.apache.cayenne.tx.DefaultTransactionManager$BaseTransactionHandler.performInNewTransaction(DefaultTransactionManager.java:155)
> at org.apache.cayenne.tx.DefaultTransactionManager$NestedTransactionHandler.handle(DefaultTransactionManager.java:98)
> at org.apache.cayenne.tx.DefaultTransactionManager.performInTransaction(DefaultTransactionManager.java:65)
> at org.apache.cayenne.tx.DefaultTransactionManager.performInTransaction(DefaultTransactionManager.java:43)
> at org.apache.cayenne.tx.TransactionFilter.onSync(TransactionFilter.java:61)
> at org.apache.cayenne.access.DataDomain$DataDomainSyncFilterChain.onSync(DataDomain.java:821)
> at org.apache.cayenne.access.DataDomain.onSync(DataDomain.java:581)
> at org.apache.cayenne.access.DataContext.flushToParent(DataContext.java:742)
> ... 9 more
>
> It seems to be related to PK generation, so I tried a SELECT UNIQUE from
> Book in FrontBase and it worked like a charm.
>
> Any idea?
>
> Many thanks,
>
> Jérémy
>