Alexander,

 

Thank you for the feedback.

 

>>Two more things...  

>> 

>>1.  Your transaction parameters, please.  How do you create them, and with
what settings?

To be honest, I've never really thought about the transaction settings beyond
the defaults. Are there specific settings I should be using for just reading
"stale" data?

>>2.  Did you check your "source-read" logic?  Could you maybe fake data, so
we know if it is the "insert" that is slow?  Maybe the materializing of the
source record is slow, or the fetching of it?

I was doing further testing last night and removed the insert logic to see if
it was the reading of the data that was slowing things down. With the insert
logic removed, just reading and looping for 5000 iterations was still slower
than Delphi's results. For the read logic I was using Dapper and retrieving
the results unbuffered (one row at a time). I then eliminated Dapper and it
was still slower than Delphi. I then tried an OLE DB provider, which was a
little faster than the .NET provider, but still slower than Delphi.
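
In case it helps anyone reproduce this, the stripped-down read-only test looks
roughly like the sketch below (variable names simplified, and the parameters
on the query left out):

    // Sketch of the read-only timing loop: plain ADO.NET, no Dapper, no insert.
    int rowsRead = 0;
    var sw = System.Diagnostics.Stopwatch.StartNew();

    using (var readCmd = new FbCommand(query, db_source, srcTransaction))
    {
        // add the ARCHIVE_SET_ID / ARCHIVE_DATA_TYPE parameters here as needed
        using (FbDataReader reader = readCmd.ExecuteReader())
        {
            while (reader.Read())
            {
                rowsRead++;
                if (rowsRead % 5000 == 0)
                    Console.WriteLine("{0} rows read in {1} ms", rowsRead, sw.ElapsedMilliseconds);
            }
        }
    }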

 

In retrospect I might have jumped the gun in blaming the write performance of
the .NET provider. I think I need to optimize the read logic first, then move
on to the write logic.

 

I read somewhere that looping in .NET is somewhat slower than in Delphi, but
that there are things you can do to optimize .NET loops.
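
One loop-level tweak I plan to try on the read side (general ADO.NET advice,
nothing specific to the Firebird provider) is to resolve column ordinals once
outside the loop and use the typed getters inside it, instead of the string
indexer on every row:

    using (FbDataReader reader = readCmd.ExecuteReader())
    {
        // look the ordinal up once instead of on every row
        int idOrdinal = reader.GetOrdinal("TABLE_A_ID");

        while (reader.Read())
        {
            // the typed getter avoids the boxing that reader["TABLE_A_ID"] would cause
            int id = reader.GetInt32(idOrdinal);
            // ... same pattern for the remaining 85 columns
        }
    }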

 

Thank you,

 

Edward Mendez

 

From: Alexander Muylaert-Gelein [mailto:amuylaert_gel...@hotmail.com] 
Sent: Friday, August 8, 2014 1:54 AM
To: firebird-net-provider@lists.sourceforge.net
Subject: Re: [Firebird-net-provider] Questions with performance metrics
doing large inserts into DB

 

Hi Edward

 

I'm also coming from a Delphi background (using FIB) and we have been porting
our applications to .NET for about five years now.  I have indeed noticed that
Delphi/FIB is faster than the .NET provider, but never by anything like 500%.
It seemed acceptably slower.

 

Your "write" code seems to be correct and pretty optimal.  Usually people
recreate a command each time.  I've also done some profiling in the past and
I've noticed that keeping a reference to the parameter doesn't help much.  A
slightly slower method, but way less code is to "clear" the parameters and
recreate them.   Once again, It is slightly slower, neglectible, but in your
scenario 170 lines of code less.  

 

using (var command = new FbCommand(_updateSQL, db_target, transaction))
{
    var par = command.Parameters;

    while (_iEnumerator.MoveNext())
    {
        // throw the old parameters away and re-add them for the current row
        par.Clear();
        par.Add("ID", Id);
        par.Add("VALUE", Value);
        // ... the remaining columns, then command.ExecuteNonQuery()
    }
}

 

On the other hand, Firebird is an open source database and so is the .NET
provider.  Jiri (the guy supporting it) is getting a few bucks per month to
support this.  We, as a company, sometimes sponsor these things by testing,
benchmarking or, the lazy way, just picking up the bill.  Since you have a
test environment up and running, you might walk the extra mile and help
everybody by profiling a bit deeper?  This would benefit you, me and everybody.

 

Two more things...  

 

1.  Your transaction parameters, please.  How do you create them, and with
what settings?

2.  Did you check your "source-read" logic?  Could you maybe fake data, so we
know if it is the "insert" that is slow?  Maybe the materializing of the
source record is slow, or the fetching of it?
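
Something like the sketch below, purely as an illustration (your TABLE_A_DTO
and its fields will differ): generate the rows in memory and feed them through
the exact same insert loop, so only the insert path is measured.

    // Fake source rows so the timing excludes the read/materialization path.
    static IEnumerable<TABLE_A_DTO> FakeRows(int count)
    {
        for (int i = 0; i < count; i++)
        {
            yield return new TABLE_A_DTO
            {
                TABLE_A_ID = i
                // ... fill the other fields with constants
            };
        }
    }

    // then, instead of the Dapper query:
    // _iEnumerator = FakeRows(5000).GetEnumerator();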

 

Looking forward for tackling this thing.

 

thanks

 

a

 

 

 

 

 

  _____  

From: emendez...@nc.rr.com
To: firebird-net-provider@lists.sourceforge.net
Date: Thu, 7 Aug 2014 18:14:43 -0400
Subject: Re: [Firebird-net-provider] Questions with performance metrics
doing large inserts into DB

Hello All,

 

I have to develop an application that will move "old/stale" data from certain
tables to another FB database. We already have an existing application that
does something similar, but that application is written in Delphi 5. We are a
.NET shop and want to develop newer applications using .NET technologies so we
can reuse our developer resources; little by little we have been migrating
from Delphi 5 to .NET.

 

In our shop we are running various instances of FB 2.1.4 Classic on CentOS
5.6. Our database is larger than 250 GB.

 

In past .NET projects I have used Dapper dot Net and thought that this might
fit the requirements. 

 

We developed a working prototype of what we wanted and I had our testers run
the application to see what they thought. To my dismay, they informed me that
the performance was terrible.

In some cases we need to archive millions of rows to the other database, and
using Dapper did not seem to give us acceptable results. The users said that
the old Delphi application was quicker when archiving data (that Delphi
application also has functionality that archives different data to other
databases).

 

I wanted to have a baseline test so we could compare apples to apples.

 

I trimmed down the prototype to eliminate Dapper and use straight ADO.NET for
the inserts, using the latest Firebird .NET provider on .NET 4.5.1.

Once the data is retrieved, the application reads it one row at a time,
because trying to read over a million rows into memory would cause
out-of-memory issues. A sample of the code I am using is shown below. As I
looped over the results, I reported progress every 1000 rows and calculated
elapsed times every 5000 rows. Running this application from various computers
in our infrastructure against various target DBs, the best performance I got
was 19 seconds for 5000 rows.

 

I then put my Delphi hat on and created a small app that does roughly what the
.NET app does, and the Delphi app's performance blew away the .NET performance
with a consistent 5000 rows in 4 seconds. This is Delphi 5, from 1999! I
should also add that I am using the FIBPlus data components from Devrace.

 

Below is an edited version of my .NET code.

 

query = @"Select *** Here is my Select Query from the Source DB *** ";

 

// This is using dapper.net to retrieve the rows I need.

var en = db_source.Query<TABLE_A_DTO>(query, new { ARCHIVE_SET_ID =
m_archiveSetId, ARCHIVE_DATA_TYPE = _tableName }, srcTransaction, buffered:
false);

int totalUpdated = 0;

JobStartTime = DateTime.Now;

try

      {

            using (FbConnection db_target =
((FbConnection)GetConnection(targetDB)))

            {

                  FbTransaction transaction = db_target.BeginTransaction();

                  m_isInTransaction = true;

 

                  using (FbCommand command = new FbCommand(_updateSQL,
db_target, transaction))

                  {

                  

                        command.Parameters.Add("TABLE_A_ID",
FbDbType.Integer);

                        

                              // There are 86 fields in the result set so i
do this 85 more times

                     

                        command.Prepare();

 

                        StepStartTime = DateTime.Now;

                        _logger.Write("ARCHIVING of " + _tableName + "
Started", 2);

                        // I get the Enumerator of the results so I can
iterate over them

                        _iEnumerator = en.GetEnumerator(); 

                        while (_iEnumerator.MoveNext())

                        {

                              rowsRead++;

                              var rec = ((TABLE_A_DTO)_iEnumerator.Current);

                              command.Parameters[0].Value = rec.TABLE_A_ID;

                                    

                                    // I do this 85 more times for each
parameter

 

                              rowsAffected = command.ExecuteNonQuery();

                              totalUpdated = rowsAffected + totalUpdated;

 

                              if (rowsRead % 1000 == 0)

                                    Console.Write(totalUpdated.ToString() +
"\r");

 

                              if (rowsRead % m_recordBufferCount == 0)

                              {

                                    _logger.Write(string.Format("Archived...
Rows Archived = {0}, Elasped Time {1}", totalUpdated.ToString("N0"),
Utils.GetElapsedTime(DateTime.Now - StepStartTime)), 3);

                                    StepStartTime = DateTime.Now;

                              }

                        }

 

Is the .NET provider really that slow? I am sure the database configuration
has something to do with the performance, but if that were the whole story I
would expect the Delphi application to show results on par with the .NET one.

 

Am I doing something blatantly wrong in my code? I would hate to go back to my
boss and tell them that if performance is a factor we need to continue with
Delphi.

 

If this open source .NET provider is not the "fastest on the block", are there
any other third-party libraries for .NET that you would recommend, even if it
is a commercial product?

 

 

Thanks,

 

Edward Mendez 

 
