On Wed, Jul 24, 2013 at 10:06:59AM +0200, Andreas Abel wrote:
> For -O1 and greater, ghc seems to see that x is not mentioned in the
> where clauses and apparently lifts them out. Thus, for -O1..
> memoized_fib is also memoizing. (I ran it, this time ;-) !)
Right, I believe this is the "full laziness" transformation.
Sorry I screwed up. The following is indeed memoizing:
fib5 :: Int -> Integer
fib5 = \x -> fibs !! x
  where
    fibs = map fib [0 ..]
    fib 0 = 0
    fib 1 = 1
    fib n = fib5 (n-2) + fib5 (n-1)
Here, the eta-expansion does not matter. But as you say, memoized_fib
below is not memoizing.
On 7/22/13 7:41 PM, David Thomas wrote:
> On Mon, Jul 22, 2013 at 4:04 PM, wren ng thornton wrote:
>> In principle, we could translate between these two forms (the f2 ==> f1
>> direction requires detecting that y does not depend on x). However, in
> >> practice, the compiler has no way to decide which one is intended.
On Mon, Jul 22, 2013 at 04:04:33PM -0700, wren ng thornton wrote:
> Consider rather,
>
> f1 = let y = blah blah in \x -> x + y
>
> f2 x = let y = blah blah in x + y
>
> The former will memoize y and share it across all invocations of f1;
> whereas f2 will recompute y for each invocation
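The f1/f2 contrast above can be made observable; a minimal sketch (the trace messages and the length computation are illustrative stand-ins for "blah blah", and the per-call recomputation in f2 holds when compiled without optimisations, since full laziness at -O1 may float the thunk out anyway):

```haskell
import Debug.Trace (trace)

-- y outside the lambda: computed once, shared by every call to f1.
f1 :: Int -> Int
f1 = let y = trace "computing y for f1" (length [1 .. 1000000 :: Int])
     in \x -> x + y

-- y inside the body: a fresh thunk per call to f2 (at -O0).
f2 :: Int -> Int
f2 x = let y = trace "computing y for f2" (length [1 .. 1000000 :: Int])
       in x + y

main :: IO ()
main = do
  print (f1 1, f1 2)  -- "computing y for f1" should appear once
  print (f2 1, f2 2)  -- "computing y for f2" may appear twice (at -O0)
```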
On 07/22/2013 03:41 PM, David Thomas wrote:
> I, for one, would love to have a compiler do (a) based on (b), my
> specification of (c), and the ability to pin particular things...
>
>
The reason it is a big deal to me is that it is sometimes the more
natural-looking (read: declarative) way of writing a function
I, for one, would love to have a compiler do (a) based on (b), my
specification of (c), and the ability to pin particular things...
On Mon, Jul 22, 2013 at 4:04 PM, wren ng thornton wrote:
> On 7/22/13 9:06 AM, Tom Ellis wrote:
> > On Mon, Jul 22, 2013 at 07:52:06PM +1200, Chris Wong wrote:
> >
On 7/22/13 9:06 AM, Tom Ellis wrote:
> On Mon, Jul 22, 2013 at 07:52:06PM +1200, Chris Wong wrote:
>> A binding is memoized if, ignoring everything after the equals sign,
>> it looks like a constant.
>>
>> In other words, these are memoized:
> [...]
>> f = \x -> x + 1
> [...]
>> and these are not:
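The rule quoted above can be checked directly; a minimal sketch (memoFib/slowFib are illustrative names, and the sharing described holds in GHCi / without optimisations):

```haskell
-- The binding "looks like a constant" (no argument left of '='):
-- the list (map fib [0 ..]) is built once and shared across calls.
memoFib :: Int -> Integer
memoFib = (map fib [0 ..] !!)
  where
    fib 0 = 0
    fib 1 = 1
    fib n = memoFib (n - 2) + memoFib (n - 1)

-- Same function bound with an argument: the list is rebuilt on
-- every call, so nothing is shared between calls (at -O0).
slowFib :: Int -> Integer
slowFib x = map fib [0 ..] !! x
  where
    fib 0 = 0
    fib 1 = 1
    fib n = slowFib (n - 2) + slowFib (n - 1)
```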
On Mon, Jul 22, 2013 at 04:16:19PM +0200, Andreas Abel wrote:
> In general, I would not trust such compiler magic, but just let-bind
> anything I want memoized myself:
>
> memoized_fib :: Int -> Integer
> memoized_fib x = fibs !! x
>   where fibs = map fib [0..]  -- lazily computed infinite list
On 07/22/2013 06:16 AM, Andreas Abel wrote:
> On 22.07.2013 09:52, Chris Wong wrote:
>
> True, but the essential thing to be memoized is not memoized_fib,
> which is a function, but the subexpression
>
> map fib [0..]
>
> which is an infinite list, i.e., a value.
>
> The rule must be like "in
>
>
On 22.07.2013 09:52, Chris Wong wrote:
memoized_fib :: Int -> Integer
memoized_fib = (map fib [0 ..] !!)
  where fib 0 = 0
        fib 1 = 1
        fib n = memoized_fib (n-2) + memoized_fib (n-1)
[.. snipped ..]
Could someone explain the technical details of why this works? Why is "map fib [0 ..]" not recalculated?
In this case, neither of them is memoized, because they don't have any data
structure in their expressions.
"Memoized" is for constants that have a data structure in their expression.
2013/7/22 Tom Ellis
> On Mon, Jul 22, 2013 at 07:52:06PM +1200, Chris Wong wrote:
> > A binding is memoized if, ignoring everything after the equals sign,
> > it looks like a constant.
On Mon, Jul 22, 2013 at 07:52:06PM +1200, Chris Wong wrote:
> A binding is memoized if, ignoring everything after the equals sign,
> it looks like a constant.
>
> In other words, these are memoized:
[...]
> f = \x -> x + 1
[...]
> and these are not:
>
> f x = x + 1
In what sense is the f
On Mon, Jul 22, 2013 at 12:02:54AM -0800, Christopher Howard wrote:
> > A binding is memoized if, ignoring everything after the equals sign,
> > it looks like a constant.
[...]
> Thanks. That's very helpful to know. Yet, it seems rather strange and
> arbitrary that "f x = ..." and "f = \x -> ..." would be treated differently.
On 07/21/2013 11:52 PM, Chris Wong wrote:
> [.. snipped ..]
>
> A binding is memoized if, ignoring everything after the equals sign,
> it looks like a constant.
>
> In other words, these are memoized:
>
> x = 2
>
> Just x = blah
>
> (x, y) = blah
>
> f = \x -> x + 1
> -- f = ...
> memoized_fib :: Int -> Integer
> memoized_fib = (map fib [0 ..] !!)
>   where fib 0 = 0
>         fib 1 = 1
>         fib n = memoized_fib (n-2) + memoized_fib (n-1)
>
[.. snipped ..]
> Could someone explain the technical details of why this works? Why is "map
> fib [0 ..]" not recalculated each time?
On 07/21/2013 11:19 PM, KC wrote:
> Have you tried the compiler?
>
No. Would that work differently somehow?
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
Have you tried the compiler?
On Sun, Jul 21, 2013 at 11:59 PM, Christopher Howard <
christopher.how...@frigidcode.com> wrote:
> When I previously asked about memoization, I got the impression that
> memoization is not something that just happens magically in Haskell. Yet,
> on a Haskell wiki page
On Tue, 06 Sep 2011 15:16:09 -0400
Michael Orlitzky wrote:
> I'm working on a program where I need to compute a gajillion (171442176)
> polynomials and evaluate them more than once. This is the definition of
> the polynomial, and it is expensive to compute:
>
> > polynomial :: Tetrahedron -> (Re
Alex Rozenshteyn schrieb:
> I feel that there is something that I don't understand completely: I
> have been told that Haskell does not memoize function calls, e.g.
>> slowFib 50
> will run just as slowly each time it is called. However, I have read
> that Haskell has call-by-need semantics, which
Alex,
Maybe this pdf can enlighten you a little bit about memoization and lazy
evaluation in Haskell =>
http://www.cs.uu.nl/wiki/pub/USCS2010/CourseMaterials/A5-memo-slides-english.pdf
Cheers.
Roman.-
I feel that there is something that I don't understand completely: I have
> been told that Haskell does not memoize function calls
Alex Rozenshteyn writes:
> I understand that
>> fib50 = slowFib 50
> will take a while to run the first time but be instant each subsequent call;
> does this count as memoization?
I didn't see anybody else answering this in so many words, but I'd say
no, since you only name one particular value
On 9/15/10 10:39 PM, Conal Elliott wrote:
Hi Alex,
In Haskell, data structures cache, while functions do not.
Exactly. What this means is that when you call (slowFib 50) Haskell does
not alter slowFib in any way to track that it maps 50 to $whatever;
however, it does track that that particular expression, once bound to a
name, has been evaluated.
Hi Alex,
In Haskell, data structures cache, while functions do not.
"Memoization" is conversion of functions into data structures (and then
trivially re-wrapping as functions) so as to exploit the caching property of
data structures to get caching functions.
- Conal
On Wed, Sep 15, 2010 at 11
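Conal's slogan ("memoization is conversion of functions into data structures") can be made concrete; a minimal sketch for functions from non-negative Ints, using an infinite lazy list as the caching data structure (memoList and fastFib are illustrative names, not library functions):

```haskell
-- Convert the function into a data structure (a lazy list), then
-- trivially re-wrap it as a function: the list caches, (!!) looks up.
memoList :: (Int -> b) -> (Int -> b)
memoList f = (table !!)
  where table = map f [0 ..]

-- Tie the recursion through the memoized wrapper.
fastFib :: Int -> Integer
fastFib = memoList fib'
  where
    fib' 0 = 0
    fib' 1 = 1
    fib' n = fastFib (n - 1) + fastFib (n - 2)
```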
On Wednesday 15 September 2010 22:38:48, Tim Chevalier wrote:
> On the other hand, if you wrote:
>
> let fib50 = slowFib 50 in
> fib50 + (slowFib 50)
>
> then (slowFib 50) would be evaluated twice, because there's no
> principle requiring the compiler to notice that (slowFib 50) is the
> same expression.
On 9/15/10, Alex Rozenshteyn wrote:
> I feel that there is something that I don't understand completely: I have
> been told that Haskell does not memoize function calls, e.g.
> > slowFib 50
> will run just as slowly each time it is called. However, I have read that
> Haskell has call-by-need semantics, which
Very true. I was executing the large Int-based examples on a 64 bit machine.
You can of course flip over to Integer on either 32 or 64 bit machines and
alleviate the problem with undetected overflow. Of course that doesn't help
with negative initial inputs ;)
I do agree it is still probably a good idea.
begin Edward Kmett quotation:
> The result is considerably faster:
>
> *Main> fastest_f 12380192300
> 67652175206
>
> *Main> fastest_f 12793129379123
> 120695231674999
I just thought I'd point out that running with these particular values
on a machine with a 32 bit Int will cause overflow.
On Friday 09 July 2010 01:03:48, Luke Palmer wrote:
> On Thu, Jul 8, 2010 at 4:23 PM, Daniel Fischer
wrote:
> > On Friday 09 July 2010 00:10:24, Daniel Fischer wrote:
> >> You can also use a library (e.g.
> >> http://hackage.haskell.org/package/data-memocombinators) to do the
> >> memoisation for you.
Hi,
thanks for all the replies. I'm off now to try all the suggestions...
Cheers,
Ángel de Vicente
--
http://www.iac.es/galeria/angelv/
High Performance Computing Support PostDoc
Instituto de Astrofísica de Canarias
On Thu, Jul 8, 2010 at 5:30 PM, Angel de Vicente wrote:
> Hi,
>
> I'm going through the first chapters of the Real World Haskell book,
> so I'm still a complete newbie, but today I was hoping I could solve
> the following function in Haskell, for large numbers (n > 10^8)
>
> f(n) = max(n,f(n/2)+f(
You're correct in pointing out that f uses memoization inside of
itself to cache the intermediate values that it computes, but those
values don't get shared between invocations of f; thus, if you call f
with the same value of n several times then the memo table might get
reconstructed redundantly.
Thanks, okay the next question is: how does the memoization work? Each
call to memo seems to construct a new array, if the same f(n) is
encountered several times in the recursive branching, it would be
computed several times. Am I wrong?
Thanks,
Mike
Gregory Crosswhite wrote:
On 7/8/10 9:17
On 7/8/10 9:17 PM, Michael Mossey wrote:
Daniel Fischer wrote:
If f has the appropriate type and the base case is f 0 = 0,
module Memo where
import Data.Array
f :: (Integral a, Ord a, Ix a) => a -> a
f n = memo ! n
  where
    memo = array (0,n) $ (0,0) : [(i, max i (memo!(i
Daniel Fischer wrote:
If f has the appropriate type and the base case is f 0 = 0,
module Memo where
import Data.Array
f :: (Integral a, Ord a, Ix a) => a -> a
f n = memo ! n
  where
    memo = array (0,n) $ (0,0) :
      [(i, max i (memo!(i `quot` 2) + memo!(i `quot` 3)
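The archive cuts Daniel's code off after the `quot` 3 term. Here is a runnable reconstruction; the body assumes the classic recurrence f n = max n (f (n div 2) + f (n div 3) + f (n div 4)), which the truncated problem statement "f(n) = max(n, f(n/2)+f(..." in this thread resembles, so treat the exact recurrence as an assumption:

```haskell
import Data.Array

-- Self-referential boxed array: elements are lazy thunks that index
-- back into memo, so each subproblem is computed at most once.
f :: (Integral a, Ix a) => a -> a
f n = memo ! n
  where
    memo = array (0, n) $
      (0, 0) : [ (i, max i ( memo ! (i `quot` 2)
                           + memo ! (i `quot` 3)
                           + memo ! (i `quot` 4) ))
               | i <- [1 .. n] ]
```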
On Thu, Jul 8, 2010 at 4:23 PM, Daniel Fischer wrote:
> On Friday 09 July 2010 00:10:24, Daniel Fischer wrote:
>> You can also use a library (e.g.
>> http://hackage.haskell.org/package/data-memocombinators) to do the
>> memoisation for you.
>
> Well, actually, I think http://hackage.haskell.org/package/MemoTrie would be
On Friday 09 July 2010 00:10:24, Daniel Fischer wrote:
> You can also use a library (e.g.
> http://hackage.haskell.org/package/data-memocombinators) to do the
> memoisation for you.
Well, actually, I think http://hackage.haskell.org/package/MemoTrie would be
the better choice for the moment, data
On Thursday 08 July 2010 23:30:05, Angel de Vicente wrote:
> Hi,
>
> I'm going through the first chapters of the Real World Haskell book,
> so I'm still a complete newbie, but today I was hoping I could solve
> the following function in Haskell, for large numbers (n > 10^8)
>
> f(n) = max(n,f(n/2)+f
Hi,
Investigating memoization inspired by replies from this thread. I
encountered something strange in the behavior of ghci. Small chance it's a
bug; it probably is a feature, but I certainly don't understand it :)
The interpreter session went as follows
GHCi, version 6.10.4: http://www.haskell.org/ghc/  :? for help
Thanks to reactions!
What do you think about such a function? This function is
still a bit dangerous (I think). I don't know how to make
sure the compiler does not lift cache to something global.
But on the other hand this use of unsafePerformIO is legit
because it doesn't alter the referential transparency.
On Sat, Sep 05, 2009 at 02:52:50AM -0700, staafmeister wrote:
> How would experienced haskellers solve this problem?
You could just memoize using an array, like in C.
import Data.Array
occurrences :: Num a => String -> String -> a
occurrences key buf = grid ! (0, 0) -- grid ! (i, j) = occurrences
Am Samstag 05 September 2009 11:52:50 schrieb staafmeister:
> Hi,
>
> I am participating in the Google Code Jam this year and I want to try to
> use Haskell. The following simple problem
> (http://code.google.com/codejam/contest/dashboard?c=90101#s=p2)
> would have a beautiful Haskell solution.
>
>
Thanks for all the hints and code provided; nevertheless, they prompted
further questions:
1) Am I right that MemoCombinators can hardly ever be used with Hugs? If
not, which guidelines should be used for installation...
2) Is there any paper/tutorial/wiki that describes which local
definitions/expressions
On Wed, 25 Feb 2009, Luke Palmer wrote:
On Wed, Feb 25, 2009 at 10:38 AM, Dusan Kolar wrote:
I have a function whose computation is quite expensive; it is recursively
dependent on itself with respect to some other function values - we can
roughly model its behaviour with the fib function
On Wed, Feb 25, 2009 at 10:38 AM, Dusan Kolar wrote:
> I have a function a computation of which is quite expensive, it is
> recursively dependent on itself with respect to some other function values -
> we can roughly model its behaviour with fib function (returns n-th number of
> Fibonacci's se
Dusan Kolar wrote:
> Nevertheless, local version does not
> work.
Restructure your code like this:
> fibL m =
>   let allfib = 0:1:[allfib!!n + allfib!!(n+1) | n <- [0..]]
>   in allfib !! m

fibL = let allfib = 0:1:[allfib!!n + allfib!!(n+1) | n <- [0..]]
       in \m -> allfib !! m
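The restructuring above matters because of where allfib is bound; a runnable contrast (fibLocal/fibShared are illustrative names, and the per-call rebuilding in the first form holds at -O0, since GHC's full laziness may float the list out at -O1):

```haskell
-- allfib is bound inside the body: a fresh list per call, so results
-- are shared only within a single call to fibLocal.
fibLocal :: Int -> Integer
fibLocal m =
  let allfib = 0 : 1 : [allfib !! n + allfib !! (n + 1) | n <- [0 ..]]
  in allfib !! m

-- allfib is bound outside the lambda: one list, shared by all calls.
fibShared :: Int -> Integer
fibShared =
  let allfib = 0 : 1 : [allfib !! n + allfib !! (n + 1) | n <- [0 ..]]
  in \m -> allfib !! m
```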
On Fri, 2008-12-12 at 15:47 +0100, Bertram Felgenhauer wrote:
> GHC does "opportunistic CSE", when optimizations are enabled. [...]
I see. Thank you!
Mattias
Mattias Bengtsson wrote:
> The program below computes (f 27) almost instantly but if i replace the
> definition of (f n) below with (f n = f (n - 1) * f (n -1)) then it
> takes around 12s to terminate. I realize this is because the original
> version caches results and only has to calculate, for example
Am Donnerstag, 11. Dezember 2008 21:56 schrieb Bulat Ziganshin:
> Hello Daniel,
>
> Thursday, December 11, 2008, 11:49:40 PM, you wrote:
>
> sorry for noise, it seems that i know ghc worse than you
If only that were true.
I just know that GHC's optimiser can do some rather impressive stuff when
g
Am Donnerstag, 11. Dezember 2008 21:11 schrieb Bulat Ziganshin:
> Hello Daniel,
>
> Thursday, December 11, 2008, 11:09:46 PM, you wrote:
>
> You are almost right, but GHC doesn't share results of function calls
> regardless of their type. It just assumes that a value of any type may use a
> lot of memory even
Am Donnerstag, 11. Dezember 2008 16:18 schrieb Mattias Bengtsson:
> The program below computes (f 27) almost instantly but if i replace the
> definition of (f n) below with (f n = f (n - 1) * f (n -1)) then it
> takes around 12s to terminate. I realize this is because the original
> version caches
Hello Mattias,
I think you will find this thread from the haskell-cafe mailing list quite
helpful.
Re: [Haskell-cafe] Memoization
http://www.mail-archive.com/haskell-cafe@haskell.org/msg09924.html
Also, the Haskell wiki contains comments about techniques for memoization
along with references
On Sun, 13 Jul 2008, Logesh Pillay wrote:
I know we can perform memoization in Haskell. The well known recursive
Fibonacci example works very well: f(1) returns a virtually instant answer
which would not be possible otherwise.
My (probably naive) function to give the number of partitions
Logesh Pillay <[EMAIL PROTECTED]> writes:
> Why? Its as if memoization is being ignored in the haskell version.
> How to fix?
Shouldn't the definition of p' call (the memoized) p somewhere? In
other words, I can't see any memoization, you seem to just map a
normal, expensive, recursive function
On Sun, 2008-07-13 at 20:24 +0200, Logesh Pillay wrote:
> I know we can perform memoization in Haskell. The well known recursive
> Fibonacci example works v. well. f(1) returns a virtually instant
> answer which would not be possible otherwise.
>
> My (probably naive) function to give the
On 5/30/07, Isaac Dupree <[EMAIL PROTECTED]> wrote:
Creighton Hogg wrote:
> Now maybe I'm being dense here, but would you really *want* a way in
> Haskell
> to do something like
> memo :: (a->b) -> a->b
> since it changes the semantics of the function?
Creighton Hogg wrote:
> Now maybe I'm being dense here, but would you really *want* a way in
> Haskell
> to do something like
> memo :: (a->b) -> a->b
> since it changes the semantics of the function?
> It seems like a better abstraction would be to have
On 5/26/07, Mark Engelberg <[EMAIL PROTECTED]> wrote:
I'd like to write a memoization utility. Ideally, it would look
something like this:
memoize :: (a->b) -> (a->b)
memoize f gives you back a function that maintains a cache of
previously computed values, so that subsequent calls with the same
argument can reuse the cached result.
You may want to use a container like Array or Map.
Most times I use an Array myself to speed things up like this.
With Map it will either be a bit tricky or you'll need to use an unsafeIO hack.
Here are some functions that may help you; my favorites are Array and MapMealey.
- marc
memoizeArrayUns
sorear pointed me to this paper a while ago:
http://citeseer.ist.psu.edu/peytonjones99stretching.html
I never tried any of the code in the end, but it will probably be useful?
On 27/05/07, Mark Engelberg <[EMAIL PROTECTED]> wrote:
I'd like to write a memoization utility. Ideally, it would look
On 5/27/07, Stefan O'Rear <[EMAIL PROTECTED]> wrote:
memofix :: ((a -> b) -> (a -> b)) -> a -> b
memofix ff = let g = memoize (ff g) in g
fib = memofix $ \fib k -> case k of
  0 -> 0
  1 -> 1
  n -> fib (n-1) + fib (n-2)
But this way you miss pattern matching and guards? How
Stefan O'Rear wrote:
>
> memofix :: ((a -> b) -> (a -> b)) -> a -> b
> memofix ff = let g = memoize (ff g) in g
>
> fib = memofix $ \fib k -> case k of
>   0 -> 0
>   1 -> 1
>   n -> fib (n-1) + fib (n-2)
Stefan, there is something missing here. Where is memoize
defined?
Erik
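memoize is indeed left undefined in Stefan's message. One minimal stand-in that makes the memofix example run, restricted to non-negative Int arguments (the original signature was fully polymorphic; a real library such as MemoTrie or data-memocombinators generalises this):

```haskell
-- List-backed memoize: sufficient for memofix over Int arguments.
memoize :: (Int -> b) -> (Int -> b)
memoize f = (map f [0 ..] !!)

-- Tie the knot: g is the memoized fixed point of ff.
memofix :: ((Int -> b) -> (Int -> b)) -> Int -> b
memofix ff = let g = memoize (ff g) in g

fib :: Int -> Integer
fib = memofix $ \fib' k -> case k of
  0 -> 0
  1 -> 1
  n -> fib' (n - 1) + fib' (n - 2)
```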
On Sat, May 26, 2007 at 11:41:28PM -0300, Felipe Almeida Lessa wrote:
> On 5/26/07, Mark Engelberg <[EMAIL PROTECTED]> wrote:
> >I don't see any elegant way to do this in Haskell, and I'm doubting
> >it's possible. Can someone prove me wrong?
>
> Provided some sort of memoize :: (a->b) -> (a->b),
On 5/26/07, Mark Engelberg <[EMAIL PROTECTED]> wrote:
I don't see any elegant way to do this in Haskell, and I'm doubting
it's possible. Can someone prove me wrong?
Provided some sort of memoize :: (a->b) -> (a->b), I'd do something like
f = memoize g where
g = recursive call to f, not g
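Spelled out with a list-backed memoize as a stand-in (the message leaves memoize abstract; this version assumes non-negative Int arguments), the pattern "f = memoize g where g recurses through f" looks like:

```haskell
-- Assumed stand-in memoizer for Int arguments.
memoize :: (Int -> b) -> (Int -> b)
memoize g = (map g [0 ..] !!)

f :: Int -> Integer
f = memoize g
  where
    g 0 = 0
    g 1 = 1
    g n = f (n - 1) + f (n - 2)   -- recursive call to f, not g
```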
Thanks to everyone for the answers, I'm getting a picture now. Maybe I will
give the memoizing Y combinator a try later.
On 2005-10-07 at 22:42- "Gerd M" wrote:
> >As (memory) is a function, it
> >cannot be memoized (the function can be, but not its result, which is
> >what you're after).
> How can a function be memoized but not its result (what does this mean)!?
> Since there are no side effects in Haskell why
There are too many issues with memoisation to make it completely
automatic. There are however, ways to construct tools to turn
functions into memoising functions selectively.
This paper should be useful to you:
http://research.microsoft.com/Users/simonpj/Papers/weak.htm
It contains a number of imp
This works, thanks a lot, you saved my day/night! :-)
As (memory) is a function, it
cannot be memoized (the function can be, but not its result, which is
what you're after).
How can a function be memoized but not its result (what does this mean)!?
Since there are no side effects in Haskell why
Gerd M wrote:
> I still don't get it. I changed my code to structurally match your example
> (see below) but the result is still the same...
>
> f 1 s (HMM s0 _ sts) = s ??? sts s0
> f t s hmm = memory hmm Map.! (t,s)
>
> memory hmm@(HMM s0 sss sts)
> = Map.fromList [ ((t,s),
From: David Roundy <[EMAIL PROTECTED]>
To: haskell-cafe@haskell.org
Subject: Re: [Haskell-cafe] Memoization
Date: Fri, 7 Oct 2005 14:12:39 -0400
On Fri, Oct 07, 2005 at 06:08:48PM +, Gerd M wrote:
> I still don't get it. I changed my c
On Fri, Oct 07, 2005 at 06:08:48PM +, Gerd M wrote:
> I still don't get it. I changed my code to structurally match your
> example (see below) but the result is still the same...
How are you timing your function?
--
David Roundy
I still don't get it. I changed my code to structurally match your example
(see below) but the result is still the same...
f 1 s (HMM s0 _ sts) = s ??? sts s0
f t s hmm = memory hmm Map.! (t,s)
memory hmm@(HMM s0 sss sts)
= Map.fromList [ ((t,s),f' t s hmm) | t <- [1..100],
On 10/7/05, Gerd M <[EMAIL PROTECTED]> wrote:
> Hello,
> I'm trying to use Data.Map to memoize computations. Unfortunately this
> didn't improve the runtime of f at all, so there must be something wrong
> with my implementation. Thanks in advance!
>
> f 1 s (HMM s0 _ sts) = s ??? sts s0
> f