On 17 October 2005 08:07, Ketil Malde wrote:
> BTW, could one cheat by introducing a write barrier manually in some
> way? Perhaps by (unsafe?) thaw'ing and freeze'ing the arrays when
> they are modified?
Might be worthwhile: freezing is very quick (basically a single write),
thawing is slightly more expensive because the array has to go back on
the GC's mutable list, but both are cheap operations.
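A minimal sketch of the freeze/thaw trick being discussed, assuming the
standard Data.Array interface (unsafeFreeze and unsafeThaw live in
Data.Array.Unsafe in current array versions; older GHCs exported them from
Data.Array.MArray). The function name updateBatch and the element type are
invented for illustration; the key property is that unsafeThaw and
unsafeFreeze do not copy, so the frozen array must be treated as consumed:

import Data.Array (Array)
import Data.Array.IO (IOArray)
import Data.Array.MArray (readArray, writeArray)
import Data.Array.Unsafe (unsafeFreeze, unsafeThaw)

-- Keep the array frozen (immutable) between updates, so the GC does not
-- have to keep re-scanning it as a mutable object; thaw it only for the
-- duration of a batch of writes, then freeze it again.
updateBatch :: Array Int [String] -> [(Int, String)] -> IO (Array Int [String])
updateBatch frozen updates = do
  arr <- unsafeThaw frozen :: IO (IOArray Int [String])
  let bump (i, w) = do
        ws <- readArray arr i
        writeArray arr i (w : ws)
  mapM_ bump updates
  -- Hand back an immutable view; 'frozen' must not be used again after
  -- this call, since no copy was made.
  unsafeFreeze arr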
On 14 October 2005 20:31, Jan-Willem Maessen wrote:
> That "5K" number made me immediately suspicious, so I took a look at
> the source code to Data.HashTable. Sure enough, it's allocating a
> number of large IOArrays, which are filled with pointers. The
> practical upshot is that, for a hash ta
Jan-Willem Maessen <[EMAIL PROTECTED]> writes:
> The practical upshot is that, for a hash table with (say) 24
> entries, the GC must scan an additional 1000 pointers and discover
> that each one is [].
Would a smaller default size help? In my case, I just wanted HTs for
very sparse tables.
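To make the shape of the problem concrete, here is a toy chained hash
table in the same style; this is not the real Data.HashTable code, and the
names (Table, newTable, insert) and the 1024-slot size are invented for
illustration, but it shows why a table holding a couple of dozen entries
still hands the GC on the order of a thousand pointers to walk:

import Data.Array.IO (IOArray)
import Data.Array.MArray (newArray, readArray, writeArray)

-- A toy chained hash table: one mutable array of buckets, each bucket a
-- list of (key, value) pairs.
data Table k v = Table (IOArray Int [(k, v)])

tableSize :: Int
tableSize = 1024   -- illustrative bucket count, not Data.HashTable's real default

newTable :: IO (Table k v)
newTable = Table `fmap` newArray (0, tableSize - 1) []

insert :: (k -> Int) -> Table k v -> k -> v -> IO ()
insert hash (Table arr) k v = do
  let i = hash k `mod` tableSize   -- non-negative, since tableSize > 0
  bucket <- readArray arr i
  writeArray arr i ((k, v) : bucket)

-- Even if only 24 entries are ever inserted, the table still owns a
-- 1024-slot mutable array, and the collector has to look at all 1024
-- slots, almost all of them [], whenever it scans the table.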
On Fri, Oct 14, 2005 at 04:29:37PM +0100, Simon Marlow wrote:
> I'm not certain that this is your problem, but hash tables are by
> definition mutable objects, and mutable objects don't work too well with
> generational garbage collection, GHC's in particular. Basically every
> GC, even the minor ones, has to scan all of the mutable objects.
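The effect is easy to see in isolation with plain IOArrays, no hash tables
involved. A sketch, assuming the GHC behaviour described above (mutable
arrays that survive into the old generation get re-scanned at every
collection; later GHC versions mark unmodified arrays clean, so the numbers
will differ there):

import Control.Monad (forM, forM_)
import Data.Array.IO (IOArray)
import Data.Array.MArray (newArray)

main :: IO ()
main = do
  -- Keep a couple of thousand 1000-slot mutable arrays alive for the
  -- whole run; each one is a mutable object the collector must revisit.
  tables <- forM [1 .. 2000 :: Int] $ \_ ->
    (newArray (0, 999) [] :: IO (IOArray Int [Int]))
  -- Ordinary allocation to trigger plenty of minor collections.
  forM_ [1 .. 200000 :: Int] $ \i ->
    length (replicate (i `mod` 100) i) `seq` return ()
  -- Use the arrays at the end so they stay live throughout.
  print (length tables)

Running this with +RTS -sstderr and comparing the MUT and GC times against
a run where the arrays are not kept alive shows how much of the total goes
into re-scanning them.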
Hi all,
I have a program that uses hash tables to store word counts. It can
use few, large hash tables, or many small ones. The problem is that
it uses an inordinate amount of time in the latter case, and
profiling/-sstderr shows it is GC that is causing it (accounting for
up to 99% of the time).
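For reference, a stripped-down sketch of the "many small tables" shape of
such a program, written against the Data.HashTable module that shipped with
GHC at the time (it has since been removed from base). The grouping of
words by first letter, and the names countWords and mkTable, are invented
purely to get many sparsely filled tables:

import qualified Data.HashTable as HT
import Control.Monad (forM_)
import Data.Char (toLower)

-- One small hash table per first letter: many sparsely populated tables
-- rather than one big one.
countWords :: [String] -> IO [(Char, HT.HashTable String Int)]
countWords ws = do
  let mkTable c = do
        t <- HT.new (==) HT.hashString
        return (c, t)
  tables <- mapM mkTable ['a' .. 'z']
  forM_ (filter (not . null) ws) $ \w ->
    case lookup (toLower (head w)) tables of
      Nothing -> return ()
      Just t  -> do
        old <- HT.lookup t w
        _ <- HT.update t w (maybe 1 (+ 1) old)
        return ()
  return tables

main :: IO ()
main = do
  ws <- fmap words getContents
  tables <- countWords ws
  -- Keep all the tables live to the end, as the real program would.
  counts <- mapM (HT.toList . snd) tables
  print (sum (map length counts))

Running it over a reasonably large input with +RTS -sstderr is the quickest
way to see whether GC time dominates in the many-small-tables case, as
described above.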