Re: [Firebird-devel] The IBatch Interface and inline blobs

2021-07-16 Thread Alex Peshkoff via Firebird-devel

On 7/16/21 4:54 PM, Tony Whyman wrote:
So I guess that as long as the string is < 32K and you are not using a 
segmented blob then it is OK to use SQL_VARYING and not the inline 
blob facility.


On the subject of limits, IBatch does seem to have a silent limit that 
I am still exploring. I was comparing the time taken to insert 10 
rows using single inserts and the Batch interface. Using the Batch 
interface, only 4061 records were written to the table, even though 
10 were added (IBatch->add).


The number 4061 was confirmed both by a read back after commit and 
by checking the IBatchCompletionState, which reported 4061 for both 
processed and updated.


4061 seems an arbitrary number. My original test table was declared as

Create Table LotsOfData (
    RowID integer not null,
    theDate TimeStamp,
    MyText VarChar(1024),
    Primary Key (RowID)
  );

and on changing this to

Create Table LotsOfData (
    RowID integer not null,
    theDate TimeStamp,
    MyText VarChar(512),
    Primary Key (RowID)
  );

I was able to successfully write 8083 rows. I guess that there is some 
memory limit that is being hit, and the maximum number of rows that can be 
added depends on the size of each buffer added to the batch.


The problem I have is that this is a silent failure. I am checking the 
status vector returned by each IBatch->add, and no problem appears to 
be reported.


Should I report this as a bug?



Definitely yes.




Firebird-Devel mailing list, web interface at 
https://lists.sourceforge.net/lists/listinfo/firebird-devel


Re: [Firebird-devel] The IBatch Interface and inline blobs

2021-07-16 Thread Tony Whyman
So I guess that as long as the string is < 32K and you are not using a 
segmented blob then it is OK to use SQL_VARYING and not the inline blob 
facility.


On the subject of limits, IBatch does seem to have a silent limit that I 
am still exploring. I was comparing the time taken to insert 10 rows 
using single inserts and the Batch interface. Using the Batch interface, 
only 4061 records were written to the table, even though 10 were added 
(IBatch->add).


The number 4061 was confirmed both by a read back after commit and by 
checking the IBatchCompletionState, which reported 4061 for both 
processed and updated.


4061 seems an arbitrary number. My original test table was declared as

Create Table LotsOfData (
    RowID integer not null,
    theDate TimeStamp,
    MyText VarChar(1024),
    Primary Key (RowID)
  );

and on changing this to

Create Table LotsOfData (
    RowID integer not null,
    theDate TimeStamp,
    MyText VarChar(512),
    Primary Key (RowID)
  );

I was able to successfully write 8083 rows. I guess that there is some 
memory limit that is being hit, and the maximum number of rows that can be 
added depends on the size of each buffer added to the batch.


The problem I have is that this is a silent failure. I am checking the 
status vector returned by each IBatch->add, and no problem appears to be 
reported.
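
For reference, the per-message outcome can also be pulled from the completion state after execution; a minimal sketch, assuming the Firebird 4 OO API names from firebird/Interface.h, with batch, transaction, status and errStatus already set up and error handling trimmed:

```cpp
// After the IBatch::add() calls, execute the batch and inspect the outcome.
Firebird::IBatchCompletionState* cs = batch->execute(&status, transaction);

unsigned processed = cs->getSize(&status);    // number of messages processed
unsigned errPos = cs->findError(&status, 0);  // position of first failed message
if (errPos != Firebird::IBatchCompletionState::NO_MORE_ERRORS)
    cs->getStatus(&status, errStatus, errPos); // detailed status for that message
cs->dispose();
```

Of course, if add() drops messages before execute() is ever reached, nothing here would show it, which matches the silent behaviour described above.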


Should I report this as a bug?

On 16/07/2021 13:50, Alex Peshkoff via Firebird-devel wrote:

On 7/16/21 12:39 PM, Tony Whyman wrote:
I have recently added support for the IBatch interface to IBX and the 
good news is that it all appears to work fine. However, I am still 
puzzled by why inline blobs exist.


For as long as I can remember, Firebird has allowed you to override 
the input metadata SQLType to any other type that can be converted to 
the actual column type, for example SQL_BLOB to SQL_VARYING. You can 
then pass a relatively short blob string as an SQL_VARYING (with the 
character set id set to OCTETS for binary data, and otherwise as 
appropriate), with no need to go to all the hard work of actually 
creating a blob, etc. I assume that it is on the server side that the 
string gets written into blob storage and would not expect it to be otherwise.


In all my testing, passing blob text as SQL_VARYING works fine for 
both normal update/insert operations and for batches using IBatch. So 
I am puzzled as to why the IBatch inline blob exists. What am I missing?




A given field may typically hold small blobs, but some of them may be 
bigger than the maximum string size (btw, 32K is not a particularly 
big object these days). If you use segmented blobs, it is also 
problematic to send them to the server in an appropriate form using strings.







Re: [Firebird-devel] The IBatch Interface and inline blobs

2021-07-16 Thread Alex Peshkoff via Firebird-devel

On 7/16/21 12:39 PM, Tony Whyman wrote:
I have recently added support for the IBatch interface to IBX and the 
good news is that it all appears to work fine. However, I am still 
puzzled by why inline blobs exist.


For as long as I can remember, Firebird has allowed you to override 
the input metadata SQLType to any other type that can be converted to 
the actual column type, for example SQL_BLOB to SQL_VARYING. You can 
then pass a relatively short blob string as an SQL_VARYING (with the 
character set id set to OCTETS for binary data, and otherwise as 
appropriate), with no need to go to all the hard work of actually 
creating a blob, etc. I assume that it is on the server side that the 
string gets written into blob storage and would not expect it to be otherwise.


In all my testing, passing blob text as SQL_VARYING works fine for 
both normal update/insert operations and for batches using IBatch. So 
I am puzzled as to why the IBatch inline blob exists. What am I missing?




A given field may typically hold small blobs, but some of them may be 
bigger than the maximum string size (btw, 32K is not a particularly 
big object these days). If you use segmented blobs, it is also 
problematic to send them to the server in an appropriate form using strings.







[Firebird-devel] The IBatch Interface and inline blobs

2021-07-16 Thread Tony Whyman
I have recently added support for the IBatch interface to IBX and the 
good news is that it all appears to work fine. However, I am still 
puzzled by why inline blobs exist.


For as long as I can remember, Firebird has allowed you to override the 
input metadata SQLType to any other type that can be converted to the 
actual column type, for example SQL_BLOB to SQL_VARYING. You can then 
pass a relatively short blob string as an SQL_VARYING (with the 
character set id set to OCTETS for binary data, and otherwise as 
appropriate), with no need to go to all the hard work of actually 
creating a blob, etc. I assume that it is on the server side that the 
string gets written into blob storage and would not expect it to be otherwise.
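
As an illustration of that override, a sketch against the Firebird 3+ OO API (names from firebird/Interface.h and ibase.h; stmt and status are assumed to exist, and error handling is trimmed):

```cpp
// Rebuild the statement's input metadata so that parameter 0, declared
// as a blob, is presented as a varchar and the value can be passed inline.
Firebird::IMessageMetadata* meta = stmt->getInputMetadata(&status);
Firebird::IMetadataBuilder* builder = meta->getBuilder(&status);

builder->setType(&status, 0, SQL_VARYING);
builder->setLength(&status, 0, 1024); // maximum string length for the parameter
builder->setCharSet(&status, 0, 1);   // character set 1 = OCTETS, for binary data

Firebird::IMessageMetadata* overridden = builder->getMetadata(&status);
// 'overridden' can then be passed to IStatement::execute() or createBatch().
```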


In all my testing, passing blob text as SQL_VARYING works fine for both 
normal update/insert operations and for batches using IBatch. So I am 
puzzled as to why the IBatch inline blob exists. What am I missing?


Tony Whyman

MWA


