I'm currently using Thread::Queue::Any to pass hashtables between threads. If I enqueue up to 100 hashtables consecutively onto a queue and have worker threads dequeuing them as they appear, my program works fine. However, if I try to enqueue 200 hashtables consecutively, I get the following error:

    Thread failed to start: Corrupted storable string (v2.6) at Storable.pm (autosplit into thaw.al)

And if I try to enqueue 500 hashtables, I get the error:

    Thread failed to start: Bad hash at Storable.pm (autosplit into _freeze.al)

If I enqueue any more than that, I get a segfault and a core dump. Has anyone seen this problem before? I looked into the Storable module to see how it works and found it to be very complex.

My requirements are such that I may have as many as a few thousand hashtables being put on the queue consecutively. I'm wondering if this is possible with these modules.
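A stripped-down sketch of the pattern I'm using (the field names, item count, and undef stop sentinel here are illustrative; my real hashes are larger):

```perl
use strict;
use warnings;
use threads;
use Thread::Queue::Any;

# Thread::Queue::Any serializes each enqueued item with Storable's
# freeze/thaw so that references (like hashrefs) can cross threads.
my $q = Thread::Queue::Any->new;

# Worker: dequeue hash references until an undef sentinel arrives.
my $worker = threads->create( sub {
    my $n = 0;
    while (1) {
        my ($item) = $q->dequeue;    # blocks until an item is available
        last unless defined $item;   # undef marks end-of-work
        $n++;                        # stand-in for real processing of %$item
    }
    return $n;
} );

# Enqueue a few hundred hashtables consecutively.
for my $i ( 1 .. 200 ) {
    $q->enqueue( { id => $i, payload => "data $i" } );
}
$q->enqueue(undef);                  # stop sentinel

print $worker->join, " items processed\n";
```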
This shouldn't happen. Do you have an example program that demonstrates the problem? Also, please let us know what platform and which version of Perl you're using.
Liz