Chris asked me to post the following, so others may have a chance to
review it. He's having problems posting to the mailing list.
-----------------------
I have two different projects that use series objects, and both are
giving me enormous trouble with both OpenDX 4.2.0 and 4.2.1 on WinXP.
The same problems are there after Service pack 1 as before.
Just today, I located a friendly user who runs XFree86 on XP on a
Dell laptop with 512 MB, and we reproduced the problem exactly, which
appears to eliminate Exceed as the source. I also ran the same stuff
on another user's Dell laptop, this one with XP/SP1, Exceed, and 1 GB
of RAM. Naturally, both laptops are single processor. My desktop is dual
processor, XP/SP1, Exceed 7.1.0.0 (3D), 2 GB of RAM. The 512 MB user
purchased your CD, so he has 4.2.0; the other guy (Gerd Heber) is the
one I'm working with on compiling new modules using your package and
your 4.2.1 build. However, I rolled back to 4.2.0, deleted the custom
module we created, and the exact same symptoms are still there; see
below.
What happens is this, with some differences in the actual numbers:
We start DX.
On the 512 MB laptop, DX says it is using about 448 MB (the normal
7/8 of RAM, or whatever).
On the 1 GB laptop, DX says it is using 896 MB.
On my 2 GB desktop, it says it is using 164 MB. (!!)
In the last case, that's probably not enough for most stuff I do, so
I disconnect and restart after upping the memory in the Connect to
Server Options dialog. Oddly, the largest number I am allowed to enter
is 1256 MB. At 1257 or higher, the server declares it is "queued" but
does not start. I have no idea where that magic number comes from. I
will point out that it is not 7/8 of 2 GB.
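For reference, the "7/8 of RAM" figures above work out exactly for the two laptops; here is a quick sketch (the 7/8 sizing rule itself is my guess at what DX is doing, not something verified against the source):

```python
# "7/8 of RAM" arithmetic, in MB (my assumption about how DX sizes its arena):
def seven_eighths_mb(ram_mb):
    return ram_mb * 7 // 8

print(seven_eighths_mb(512))   # 448, matches the 512 MB laptop
print(seven_eighths_mb(1024))  # 896, matches the 1 GB laptop
print(seven_eighths_mb(2048))  # 1792, what the 2 GB desktop should report
```

Neither 164 nor 1256 is 7/8 of anything relevant here, which is what makes the desktop numbers look broken.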
Anyway, after starting with 448/896/1256:
We open the net and run it while monitoring dxexec process in Task Manager.
The net attempts to Import a series. The test data has about 1800
members (the REAL data has 5000, but I'm working with a subset for
prototyping). Each member is a 1D line of 148 scalar data items;
thus, small, simple fields. (For your interest, it is airborne LIDAR
used for bathymetry; this is a Cornell research job, not a private
consulting thing.)
On all machines, dxexec starts out at about 6 MB of memory (as
reported by Task Manager); then, as the Import begins, it jumps to
about 7, about 8, about 10, then WHAM. Dies. On my machine, I can
monitor the files as the Message Window prints them, and sometimes it
gets quite far along. Sometimes, if I reboot, I can even load the
whole thing, but then when the macro that processes the series runs
(it clips off the topmost point and makes a new series of those
points), it invariably dies.
So, we cut the series down to 500 files. Same behavior.
100 members: same.
10: works on the 1 GB laptop; on my machine it dies.
5: yay! the series imports, the images are rendered, the macro runs
to completion. Pretty damn worthless!
This appears to relate to the problem I mentioned a few weeks ago,
when I was trying to Print a series and it kept killing the server
(it was probably the same project). I was never able to debug it,
because Print would fail and take out dxexec in the process. I could
Select a small subset from the series and Print that, but I never had
this sort of problem back in the SGI days.
So, I think I'm ready to declare that there is some horrible memory
allocation problem regarding series files on Windows.
I know you have a pile of things going, so if you ask, I can send you
the zip file with the net/macro/data to test on your end. Otherwise,
if you get a chance, take a look around in the code and see if
anything appears weird.
I should add that we have run other projects on the 2 GB desktop and
the 1 GB laptop here in which dxexec happily crunches away and gets up
to at least 200 MB on each machine, so it's not an inherently bad
dxexec or something (I'm sure we could use more memory, but that's all
the other job requires). It's just that those other projects don't
involve series. There may be some other common denominator that I'm
missing, but at the moment, the fact that a series doesn't even load
makes me suspicious.
It's almost like the malloc is asking for too much too fast. I can't
explain it any other way.
I also would really like to know why starting DX on my 2 GB machine
appears to generate totally incorrect memory requests. The 164 MB
mentioned above is what it says today, but last week it was 28 MB
sometimes, 32 MB other times: essentially random, but never very high.
Is it possible the memory requester is rolling over at 2 GB and
thinking my machine has only a small amount of memory?
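If that guess is right, a signed 32-bit field would explain it: 2 GB in bytes is exactly the first value that wraps negative in a signed 32-bit integer, while 1 GB and 512 MB come through intact. A sketch (the assumption that DX stores the physical-memory total in a signed 32-bit int is mine, not checked against the source):

```python
import struct

def as_signed32(nbytes):
    # Reinterpret the low 32 bits of nbytes as a signed 32-bit integer,
    # the way a C "int" would hold it.
    return struct.unpack('<i', struct.pack('<I', nbytes & 0xFFFFFFFF))[0]

GB = 1024 ** 3
print(as_signed32(2 * GB))  # -2147483648: 2 GB wraps negative
print(as_signed32(1 * GB))  # 1073741824: 1 GB is unaffected
```

A negative or garbage total fed into the 7/8 calculation would produce exactly the kind of small, erratic startup numbers described above.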
Thanks for any thoughts, as always. Repost this to the dev list if
you like as I can't at the present time. In particular, Greg might be
interested.
_______________________________
Chris Pelkie
Scientific Visualization Producer
618 Rhodes Hall, Cornell Theory Center
Ithaca, NY 14853
--
.............................................................................
David L. Thompson Visualization and Imagery Solutions, Inc.
mailto:[EMAIL PROTECTED] 5515 Skyway Drive, Missoula, MT 59804
Phone : (406)257-8530