Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-04 Thread Jim Ursetto
On Feb 4, 2013, at 2:28 PM, Felix wrote: > > Perhaps, but I really don't see a problem with allowing a limit > on heap allocation in the runtime system. I think a segfault is an appropriate response to OOM, but I wonder if it's possible to panic() instead if the heap size can't be increased as nee
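
A minimal sketch in C of the panic-instead-of-segfault idea under discussion. This is not CHICKEN's runtime code; try_grow_heap() and panic_oom() are hypothetical names. The point is only that a failed heap-growth request can be detected and turned into a deliberate abort() with a message, rather than a later segfault:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical helper: attempt to grow the heap, NULL means failure. */
    static void *try_grow_heap(void *heap, size_t new_size)
    {
        return realloc(heap, new_size);
    }

    /* Hypothetical helper: report the failure and terminate deliberately. */
    static void panic_oom(size_t requested)
    {
        fprintf(stderr, "[panic] cannot grow heap to %zu bytes - out of memory\n",
                requested);
        abort();                        /* controlled stop, not a segfault */
    }

    int main(void)
    {
        size_t want = (size_t)1 << 30;  /* pretend we need 1 GiB */
        void *heap = try_grow_heap(NULL, want);
        if (heap == NULL)
            panic_oom(want);            /* fail loudly instead of crashing later */
        memset(heap, 0, want);          /* safe: the allocation succeeded */
        free(heap);
        return 0;
    }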

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-04 Thread Felix
From: Peter Bex Subject: Re: [Chicken-users] Segfault with large data-structures (bug) Date: Mon, 4 Feb 2013 00:16:47 +0100 > On Mon, Feb 04, 2013 at 12:10:16AM +0100, Felix wrote: >> > But why not just use ulimit? It can be set per process, so I don't see >> > the ne

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread John Cowan
Peter Bex scripsit: > > Not everybody uses UNIX, you know. > > I keep forgetting not everybody is "lucky" enough to use it. > > More seriously, do "modern" OSes not have some sort of sane limiting > system? ulimit must be several decades old... Windows System Resource Manager is our friend her
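
For what it's worth, at the API level Windows can impose a per-process memory cap with a job object, which is roughly the counterpart of the ulimit mechanism being discussed. The sketch below is a generic illustration of that mechanism, not something proposed in this thread; it caps the current process at 2 GB:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        JOBOBJECT_EXTENDED_LIMIT_INFORMATION info;
        HANDLE job = CreateJobObjectW(NULL, NULL);   /* anonymous job object */
        if (job == NULL)
            return 1;

        ZeroMemory(&info, sizeof(info));
        info.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_PROCESS_MEMORY;
        info.ProcessMemoryLimit = (SIZE_T)2048 * 1024 * 1024;   /* 2 GB cap */

        if (!SetInformationJobObject(job, JobObjectExtendedLimitInformation,
                                     &info, sizeof(info)))
            return 1;
        if (!AssignProcessToJobObject(job, GetCurrentProcess()))
            return 1;

        /* From here on, commits beyond 2 GB fail instead of growing unbounded. */
        printf("process memory capped at 2 GB\n");
        return 0;
    }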

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Peter Bex
On Mon, Feb 04, 2013 at 12:10:16AM +0100, Felix wrote: > > But why not just use ulimit? It can be set per process, so I don't see > > the need to have a second ulimit-like limit inside each process. > > Not everybody uses UNIX, you know. I keep forgetting not everybody is "lucky" enough to use i

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Felix
From: Peter Bex Subject: Re: [Chicken-users] Segfault with large data-structures (bug) Date: Sun, 3 Feb 2013 23:47:39 +0100 > On Sun, Feb 03, 2013 at 11:37:42PM +0100, Felix wrote: >> The intention is to provide some sort of soft "ulimit" at the >> application level,

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread John Cowan
Peter Bex scripsit: > But why not just use ulimit? It can be set per process, so I don't see > the need to have a second ulimit-like limit inside each process. +1 -- John Cowan co...@ccil.org http://www.ccil.org/~cowan Dievas dave dantis; Dievas duos duonos (God gave teeth; God will give bread) --Lithuanian proverb Deu
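
For reference, the per-process limit that ulimit applies can also be set from inside a program with setrlimit(). A generic POSIX sketch, not CHICKEN code, capping the address space at 2 GiB so that later allocations fail instead of the process growing without bound:

    #include <stdio.h>
    #include <sys/resource.h>

    int main(void)
    {
        struct rlimit rl;
        rl.rlim_cur = (rlim_t)2048 * 1024 * 1024;   /* soft limit: 2 GiB */
        rl.rlim_max = (rlim_t)2048 * 1024 * 1024;   /* hard limit: 2 GiB */

        if (setrlimit(RLIMIT_AS, &rl) != 0) {       /* cap total address space */
            perror("setrlimit");
            return 1;
        }
        /* From here on malloc()/mmap() return failure once the process
           exceeds 2 GiB, so out-of-memory can be handled explicitly. */
        return 0;
    }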

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Peter Bex
On Sun, Feb 03, 2013 at 11:37:42PM +0100, Felix wrote: > The intention is to provide some sort of soft "ulimit" at the > application level, in case you want to make sure a certain maximum > amount of memory is not exceeded. Or if you want to benchmark memory > consumption, or do other whacky things

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Felix
From: Peter Bex Subject: Re: [Chicken-users] Segfault with large data-structures (bug) Date: Sun, 3 Feb 2013 12:53:16 +0100 > On Sat, Feb 02, 2013 at 08:06:41PM -0600, Jim Ursetto wrote: >> (bug found -- tl;dr see end of message) >> >> Figured it out: you're exceedin

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread John Cowan
Blunderingly I wrote: > On a 32-bit system, you can't by any means get more than 4G of memory > for any single process, short of heroic measures in the kernel that > allow you to assign the same virtual addresses to different physical > addresses at the same time. I meant, of course, "at different
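
The arithmetic behind the 4G ceiling John describes: a 32-bit pointer can name at most 2^32 distinct byte addresses, i.e. 4 GiB of virtual address space per process, regardless of how much physical RAM the machine has. A trivial check:

    #include <stdio.h>

    int main(void)
    {
        unsigned bits = (unsigned)(sizeof(void *) * 8);   /* pointer width in bits */
        printf("pointer width: %u bits\n", bits);
        printf("addressable bytes per process: 2^%u\n", bits);
        /* 32 bits -> 2^32 bytes = 4 GiB; 64 bits removes that ceiling in practice. */
        return 0;
    }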

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Peter Bex
On Sun, Feb 03, 2013 at 11:15:12AM -0500, John Cowan wrote: > Peter Bex scripsit: > > > Speaking of which, I wondered about this before: why do we even _have_ a > > maximum heap size? This is arbitrary and awkward. For instance, on my > > trusty old G4 iBook, 2G was way more than I actually had

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread John Cowan
Peter Bex scripsit: > Speaking of which, I wondered about this before: why do we even _have_ a > maximum heap size? This is arbitrary and awkward. For instance, on my > trusty old G4 iBook, 2G was way more than I actually had (512 MB), while > at work and on my new laptop it's a relatively small

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Christian Kellermann
* Arthur Maciel [130203 14:11]: > Oh, and just to add info from another language > > #include <iostream> > #include <boost/graph/adjacency_list.hpp> > > using namespace std; > using namespace boost; > > typedef adjacency_list Graph; > > int main() > { > const int VERTEXES = 25; > const int EDGES = 1000; > > Graph g(VERTEXES

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Arthur Maciel
Oh, and just to add info from another language #include <iostream> #include <boost/graph/adjacency_list.hpp> using namespace std; using namespace boost; typedef adjacency_list Graph; int main() { const int VERTEXES = 25; const int EDGES = 1000; Graph g(VERTEXES); cout << " Boost Graph Library - Inserting edges" << endl;

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Arthur Maciel
Jim, that's great! Thank you so much! I've read that Facebook has reached billions of users. As I'm testing graph implementations to create a graph database, do you believe this code could handle billions of nodes, or would I need a lot more RAM to run it? I'm not experienced in programming so I don'
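
Back-of-envelope arithmetic on the question, under the assumption that each node costs on the order of a few machine words: even at an optimistic 16 bytes per node, 1 billion nodes is 16 GB before any edges are stored, so a billion-node graph would not fit in the few gigabytes of heap discussed in this thread; it would need either substantially more RAM or on-disk storage.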

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Peter Bex
On Sat, Feb 02, 2013 at 08:06:41PM -0600, Jim Ursetto wrote: > (bug found -- tl;dr see end of message) > > Figured it out: you're exceeding the default maximal heap size, which is 2GB. Speaking of which, I wondered about this before: why do we even _have_ a maximum heap size? This is arbitrary a

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-02 Thread Jim Ursetto
OK, I patched the core and the program runs to completion. Patch forthcoming. $ ./list-partials -:d -:hm16G [debug] application startup... [debug] heap resized to 1048576 bytes [debug] stack bottom is 0x7fff6f80f4b0. [debug] entering toplevel toplevel... [debug] entering toplevel library_toplevel

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-02 Thread Jim Ursetto
On Feb 2, 2013, at 8:06 PM, Jim Ursetto wrote: > Uh oh, we've hit an actual bug now. Although we can get nodes up to > 85000 by increasing max heap size from 2GB to 8GB, it appears to bomb > after the heap exceeds 4GB, maybe indicating some 32-bit sizes > left lying around in the code. Hmm, co
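
A generic illustration of the class of bug Jim suspects, not the actual runtime code: a byte count kept in a 32-bit integer silently wraps once the heap passes 4 GB, while a size_t on a 64-bit platform does not:

    #include <stdio.h>
    #include <stddef.h>

    int main(void)
    {
        unsigned int bad_size  = 0;   /* 32 bits on common platforms        */
        size_t       good_size = 0;   /* 64 bits on an LP64/LLP64 platform  */

        /* Pretend the heap grows by 1 GiB five times. */
        for (int i = 0; i < 5; i++) {
            bad_size  += 1024u * 1024u * 1024u;
            good_size += (size_t)1024 * 1024 * 1024;
        }

        printf("unsigned int total: %u bytes (wrapped past 4 GiB)\n", bad_size);
        printf("size_t       total: %zu bytes\n", good_size);
        return 0;
    }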

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-02 Thread Jim Ursetto
(bug found -- tl;dr see end of message) Figured it out: you're exceeding the default maximal heap size, which is 2GB. For whatever reason, Chicken doesn't reliably terminate with an error in this situation; it just tries to continue. Simply run your program with -:d to see: $ ./list-parti
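
As Jim's follow-up above shows, the relevant CHICKEN runtime options here are -:d, which prints the debug trace, and -:hmSIZE, which raises the maximal heap size, e.g.:

    $ ./list-partials -:d -:hm16G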