> > To make a long story short, here is the library code:
> >
> >     elems arr = case bounds arr of
> >         (_l, _u) -> [unsafeAt arr i | i <- [0 .. numElements arr - 1]]
> >
> > And my version:
> >
> >     boundedElems arr = case bounds arr of
> >         (_l, _u) -> [unsafeAt arr i | i <- [1737 .. 1752]]
> >
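The two variants above can be made self-contained like this (a sketch only; the names `elemsAll`/`elemsRange` and the test array are mine, and the real code presumably runs over a much larger `UArray`):

```haskell
import Data.Array.Base (unsafeAt, numElements)
import Data.Array.Unboxed (UArray, bounds, listArray)

-- Library-style: walk the whole array by flat index, as in elems.
elemsAll :: UArray Int Int -> [Int]
elemsAll arr = case bounds arr of
  (_l, _u) -> [unsafeAt arr i | i <- [0 .. numElements arr - 1]]

-- Sub-range variant, generalized from the quoted boundedElems
-- (the hard-coded 1737 .. 1752 came from the original post).
elemsRange :: Int -> Int -> UArray Int Int -> [Int]
elemsRange lo hi arr = [unsafeAt arr i | i <- [lo .. hi]]

main :: IO ()
main = do
  let arr = listArray (0, 9) [0, 10 .. 90] :: UArray Int Int
  print (elemsAll arr)        -- all ten elements
  print (elemsRange 3 5 arr)  -- [30,40,50]
```

Note that `unsafeAt` takes a zero-based flat index and skips bounds checks, so a hard-coded range is only safe if it is known to lie inside the array.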
"Brettschneider, Matthias" writes:
Thx for your hints, I played around with them and the performance gets slightly
better. But the major boost is still missing :)
I noticed that one real bottleneck seems to be the conversion of the array back
into a list. The interesting part is, if I use the elems function (Data.Array.Base)
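One common way around that bottleneck is to not build the list at all and instead fold strictly over the raw indices. A minimal sketch (the name `foldArray` is mine, not from the thread; compile with `-O2` so the loop unboxes):

```haskell
{-# LANGUAGE BangPatterns #-}
import Data.Array.Base (numElements, unsafeAt)
import Data.Array.Unboxed (UArray, listArray)

-- Strict left fold over flat indices: no intermediate list is
-- materialized, and the bang patterns keep index and accumulator
-- evaluated on every step.
foldArray :: (b -> Int -> b) -> b -> UArray Int Int -> b
foldArray f z arr = go 0 z
  where
    n = numElements arr
    go !i !acc
      | i >= n    = acc
      | otherwise = go (i + 1) (f acc (unsafeAt arr i))

main :: IO ()
main = print (foldArray (+) 0 (listArray (0, 4) [1 .. 5] :: UArray Int Int))  -- 15
```

Whether this helps here depends on what the list is consumed by; if the consumer is itself a fold, fusing it into the loop like this removes the conversion entirely.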
Hello Matthias,
Thursday, March 19, 2009, 2:16:30 PM, you wrote:
1. use ghc -O2
2. use unsafeRead/unsafeWrite
3. check that all calculations (step*, li/ri) are strict
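The three tips above can be combined in a small sketch on a toy state array (illustrative only; the mixing step is made up and is not the poster's hash, whose `step*`/`li`/`ri` calculations are not shown in the thread):

```haskell
{-# LANGUAGE BangPatterns #-}
import Control.Monad.ST (ST, runST)
import Data.Array.Base (unsafeRead, unsafeWrite)
import Data.Array.ST (STUArray, getElems, newListArray)
import Data.Word (Word32)

-- One pass that adds each element into its right neighbour, wrapping
-- around. unsafeRead/unsafeWrite skip bounds checks, and the bang
-- patterns keep the loop index and the read values strict (tip 3);
-- compile with ghc -O2 (tip 1) to get a tight loop.
mixOnce :: [Word32] -> [Word32]
mixOnce initial = runST $ do
  let n = length initial
  arr <- newListArray (0, n - 1) initial :: ST s (STUArray s Int Word32)
  let go !i
        | i >= n    = return ()
        | otherwise = do
            !x <- unsafeRead arr i
            !y <- unsafeRead arr ((i + 1) `mod` n)
            unsafeWrite arr ((i + 1) `mod` n) (x + y)
            go (i + 1)
  go 0
  getElems arr

main :: IO ()
main = print (mixOnce [1, 2, 3, 4])  -- [11,3,6,10]
```

Working in `ST` keeps the mutation local: `mixOnce` is still a pure function from the outside, which is usually what you want for a hash.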
Hey There,
I am trying to write a hash-algorithm that in fact is working, but as you might
have guessed the problem is the performance :) At the moment I am 40 times
worse than the same implementation in C.
My problem is that I need mutable arrays, which are the heart of that hash.
The algorithm i
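For readers following along, a minimal example of what a mutable unboxed array looks like in Haskell (purely illustrative; the array shape and values are mine, not the poster's hash state):

```haskell
import Data.Array.IO (IOUArray, getElems, newArray, readArray, writeArray)
import Data.Word (Word32)

main :: IO ()
main = do
  -- Four-word mutable state, zero-initialized.
  state <- newArray (0, 3) 0 :: IO (IOUArray Int Word32)
  writeArray state 0 0xdeadbeef
  x <- readArray state 0
  writeArray state 1 (x + 1)
  getElems state >>= print  -- [3735928559,3735928560,0,0]
```

`readArray`/`writeArray` are the bounds-checked counterparts of the `unsafeRead`/`unsafeWrite` recommended earlier in the thread; the unsafe versions have the same shape but drop the range check.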