I'm currently using the standard Microsoft tools to generate the XML.
Maybe I can reduce the memory footprint of the save/open operations by
writing to the file whenever a complete XML chunk is finished, instead
of building the entire document in memory first.
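
Something along these lines might work with the streaming XmlWriter in
System.Xml (a rough sketch only; the Chunk type and SaveChunks method
below are hypothetical stand-ins, not the actual Grasshopper archive
types):

    using System.Collections.Generic;
    using System.Xml;

    // Hypothetical chunk type standing in for whatever the archive
    // actually serializes.
    class Chunk
    {
        public string Name;
        public string Data;
    }

    class StreamingArchiveWriter
    {
        // Write the archive one chunk at a time, so the full XML
        // document never has to be assembled in memory.
        public static void SaveChunks(string path, IEnumerable<Chunk> chunks)
        {
            var settings = new XmlWriterSettings { Indent = true };
            using (XmlWriter writer = XmlWriter.Create(path, settings))
            {
                writer.WriteStartDocument();
                writer.WriteStartElement("Archive");
                foreach (Chunk chunk in chunks)
                {
                    writer.WriteStartElement("Chunk");
                    writer.WriteAttributeString("name", chunk.Name);
                    writer.WriteString(chunk.Data);
                    writer.WriteEndElement();
                    writer.Flush(); // push the completed chunk to disk
                }
                writer.WriteEndElement();
                writer.WriteEndDocument();
            } // disposing the writer flushes and closes the file
        }
    }

Because XmlWriter streams straight to the underlying file, peak memory
stays proportional to a single chunk rather than to the whole document.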

--
David Rutten
[email protected]
Robert McNeel & Associates


On Feb 9, 9:03 pm, Stan <[email protected]> wrote:
> It's very likely that it ran out of memory. Usually, when Rhino
> reaches a certain memory level (the exact ceiling depends on whether
> you turned on the 3GB switch), it crashes altogether. Here, however,
> there was only an error when I tried to save the Grasshopper file.
> I've replicated it a couple of times, but can't at the moment. The
> definition itself kept working; it was just hard to recover, because
> deleting the clusters would take hours - they take a while to get
> erased.
>
> I doubt it's my computer's fault - it's a brand-new Dell with a
> quad-core processor and 4GB of RAM.
>
> On Feb 9, 2:32 pm, vectore <[email protected]> wrote:
>
> > Hi David and Stan
>
> > I had the same problem a while back. It occurred with a file that
> > was about 42 MB.
>
> > I checked my task list and saw the computer was reaching its
> > maximum RAM level. (Running XP on a MacBook Pro 2.33, with 2GB of
> > RAM.)
>
> > It was possible to save in .gh, which reduced the file size for a
> > while. :)
>
> > What computer are you using, Stan?
>
> > Tore Banke
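
A note on the 3GB switch mentioned above: on 32-bit Windows XP it
raises the user-mode address space of large-address-aware programs
from 2GB to 3GB. It is enabled by adding /3GB to the boot entry in
boot.ini; the entry below is only a sketch, since the disk/partition
path varies per machine:

    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB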
