Are there any known issues with Sequel::Dataset's #each iterator for large
result sets (~100,000 records) on CentOS?

The problem occurs in this loop (highlighted) from my sequel_plus gem:  http://tinyurl.com/26uk6xl
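In case the link is hard to follow, the loop is essentially of this shape (a simplified sketch only; the method and variable names here are illustrative, the real code is at the link above):

```ruby
require 'stringio'

# Simplified sketch of the export loop: iterate every row in the
# dataset and write one delimited line per row to the output stream.
# `dataset` is anything that yields row hashes from #each (a
# Sequel::Dataset in the real code); returns the number of rows written.
def export_rows(dataset, io)
  count = 0
  dataset.each do |row|
    io.puts(row.values.join(','))
    count += 1
  end
  count
end
```

With the ODBC adapter, #each iterates over the full result set fetched for the query, which is why I suspect result-set size is relevant here.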

It works fine on 32-bit Mac OS X with ruby 1.8.7 (2009-06-12 patchlevel
174) [i686-darwin9]; I have exported over 1 million records in one
test.

It also generally works on 64-bit CentOS with ruby 1.8.7 (2010-01-10
patchlevel 249) [x86_64-linux].

However, larger datasets (around 100k records, give or take) never
finish.  I've tried Ruby 1.8.6 and 1.8.7 and several versions of
Sequel, from 3.9 up to and including the latest release.  I have also
tried upgrading unixODBC and FreeTDS to their latest versions, but
there was no change in behavior.

The Mac and CentOS tests use the same code and the same MSDE SQL
Server 2000 backend and database.

Any ideas on how even to troubleshoot this would be appreciated.
Processing hits about 85% and sticks there for hours until I kill it
with SIGKILL.  It's a nightly export job that ran fine for the prior
six months, but over the last few days it has become stuck
indefinitely (with no security patches or any other updates applied to
this server).  Growing data size is the only thing I can surmise is
the problem here at the moment.
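One thing I'm considering as a troubleshooting step is paging through the table with LIMIT/OFFSET instead of pulling one huge result set, to see whether the hang is tied to result-set size in the driver. A minimal sketch of what I mean (the helper name is mine, and it assumes the rows come back in a stable order, e.g. via an ORDER BY on a key):

```ruby
# Hypothetical workaround sketch: fetch the rows in fixed-size pages
# using Sequel's dataset.limit(count, offset), yielding each row in
# turn, so no single query ever returns the whole 100k+ result set.
def each_page(dataset, page_size = 1000)
  offset = 0
  loop do
    page = dataset.limit(page_size, offset).all
    break if page.empty?
    page.each { |row| yield row }
    offset += page_size
  end
end
```

If the paged version runs to completion where the single #each does not, that would point at unixODBC/FreeTDS choking on the large result set rather than at Sequel itself.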

Michael
-- 
http://codeconnoisseur.org

-- 
You received this message because you are subscribed to the Google Groups 
"sequel-talk" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/sequel-talk?hl=en.
