I'll be trying to narrow it down even more tomorrow, but I was hoping someone here might have some insight into this weird issue I'm having.

I have a dynamic array of shorts that I append to while capturing sound data. I kept getting a segfault on the append, and I've narrowed it down: the segfault happens whenever the length of the array reaches 1024 or more.

It looks wonky because of my many tests, but this is the code that was causing the segfault:

override bool onProcessSamples(const(short)[] samples)
{
    import std.stdio;

    for (int i = 0; i < samples.length; ++i)
    {
        writeln(m_samples.length); // prints 0, 1, 2, ... ending with 1023
        m_samples.length += 1;     // the resize that takes the length to 1024 segfaults
    }
    return true;
}

It prints all the lengths, ending with 1023, and then shows "Segmentation fault". If I comment out the writeln line, the only output is the "Segmentation fault" line. Similar code that pushes the length to 1024 or higher also segfaults.

The class this method belongs to is wrapped up in a C++ interface, and the method gets called from the C++ side.
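I don't have the full wrapper in front of me, but the shape is roughly this; the names and exact signatures below are illustrative, not my actual code:

extern(C++) interface AudioCallback
{
    // C++ passes a raw pointer and a count; D rebuilds the slice.
    bool onProcessSamples(const(short)* samples, size_t sampleCount);
}

class Recorder : AudioCallback
{
    short[] m_samples;

    extern(C++) bool onProcessSamples(const(short)* samples, size_t sampleCount)
    {
        // Forward to the D-style method shown above.
        return processSamples(samples[0 .. sampleCount]);
    }

    bool processSamples(const(short)[] samples)
    {
        foreach (i; 0 .. samples.length)
            m_samples.length += 1; // same growth that segfaults past 1023
        return true;
    }
}

The C++ side holds the AudioCallback pointer and invokes onProcessSamples whenever a buffer of captured samples is ready.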

Also, if I set the length to 1024 or higher in the class constructor, or if I create a different array elsewhere and set its length to 1024 or higher, I don't get the segfault anymore.

As far as I can tell, this only happens because the method is called from C++ code; I can't reproduce it at all on my system outside of my C++ interop. So far my fix is to set the length of the array to 1024 in the class's constructor and then immediately set it back to 0, but I'd like to get this working without any strange hacks.
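Concretely, the hack looks like this in the constructor (m_samples is the member array from above):

this()
{
    // Hack: pre-grow the array past the 1024 threshold once, then shrink it back.
    // After this, the appends in onProcessSamples no longer segfault.
    m_samples.length = 1024;
    m_samples.length = 0;
}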
