Hi,
I've got a very strange bug here that appeared suddenly in code that had already been working for some weeks (caustics, remember?) and now leaks quite a bit of memory (about 150 kB/sec) on the server side. Even stranger: not all servers are leaking, only some of them (2-8 out of 10, randomly distributed). Only node one, where both the client and the server are running, never leaks. Another very disturbing fact is that I don't see any increase in my application's memory usage in top or ps: my server starts and stays at ~65% memory usage, while a watch on free shows a constant memory decrease. That's why I first thought this might be a kernel / driver problem.

This is definitely related to changing my images in the TextureChunk (the leak disappears if I comment out the setImage line), so I wonder if there is some undefined behaviour in here?

Some code:

I've got 32 images that I load into a member array of my class in the first function:

void TexGenSceneModifier::load()
{
    int size = _imgSeqMax - _imgSeqMin + 1;
    _images = new ImagePtr[size];

    int j = 0;
    for(int i = _imgSeqMin; i <= _imgSeqMax; i++)
    {
        _images[j] = OSG::Image::create();
        beginEditCP(_images[j]);
            _images[j]->read(filename); // simplified for readability
        endEditCP(_images[j]);

        addRefCP(_images[j]); // hold our own reference to the image
        j++;
    }

    (...)

    // _txAdd is a TextureChunk
    beginEditCP(_txAdd);
        _txAdd->setImage(_images[_currentImage]); // _currentImage is 0 here
        _txAdd->setMinFilter(GL_LINEAR_MIPMAP_LINEAR);
        _txAdd->setMagFilter(GL_LINEAR);
        _txAdd->setEnvMode(GL_MODULATE);
        _txAdd->setWrapS(GL_REPEAT);
        _txAdd->setWrapT(GL_REPEAT);
    endEditCP(_txAdd);
}

I was never really sure about that addRefCP() here: without it, the program crashes after some unpredictable time (one minute to two hours), but it also doesn't leak memory while running.

Now I want to change this image every few frames, so in my apply() (which is called in the display func) I do the following:

const int TexGenSceneModifier::apply(double ttime, double etime)
{
    _frameCount++;
    _elapsedTime += etime;

    if(_frameCount >= _imgChangeInterval)
    {
        _frameCount  = 0;
        _elapsedTime = 0;

        // advance to the next image in the sequence, wrapping around
        _currentImage = (_currentImage + 1) % (_imgSeqMax - _imgSeqMin + 1);

        beginEditCP(_txAdd, OSG::TextureChunk::ImageFieldMask);
            _txAdd->setImage(_images[_currentImage]);
        endEditCP(_txAdd, OSG::TextureChunk::ImageFieldMask);
    }

    (...)

    return 0;
}

I'm really sure that this worked for quite some time - why would it start leaking now?

System:
Debian (sid) 2.6.9 with Nvidia 6111 Drivers
Last OpenSG CVS checkout about 8 weeks ago.

This is definitely the last pending problem in my project, so please help me to get my (really needed) vacation.

Thanks & regards,
Dominik








_______________________________________________
Opensg-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/opensg-users
