H. S. Teoh wrote:
The problem is that quite often, the compiler will not be able to prove much about the data structure, esp. when you have non-trivial manipulation of pointers -- it requires solving the halting problem to achieve that level of analysis on arbitrary code -- so it will probably fall back to the GC quite often, depending on what the code does.

Well, I misspoke about exactly what I meant by "falling back to ref-counting". I actually don't think you'd need anything like a typical GC or ref-counting system: since this memory model is so rigid, it gives stronger guarantees about when something is allocated and deleted, so the cleanup of references can be optimized quite a bit more (at least I think so).

For local variables, most of the cleanup code can be statically injected (like I illustrated above). Really, the only place you need to dynamically track and "nullify" references is for dynamically allocated objects, like DynamicArrays. For those we can fall back on a sort of "garbage collection", though I think it can be better than a typical GC (it doesn't need cycle detection, it's deterministic, etc.). Let me illustrate...

In the compiler, our DynamicArray type might look like:

    type DynamicArray(T) # template type
    {
        ptr pointer : T        # storage for the items
        var length : type(ptr) # element count

        ptr refs = MemoryPool(ref T[]) # per-item lists of back-references

        func => (index, r:ref T) # runs when a ref is bound to the item at 'index'
        {
            refs[index] += r # record r against that item's slot
        }

        func -= (item) # runs when an item is removed
        {
            ...remove item...

            var rfs = refs[getIndex(item)]

            each r in rfs # null out every ref that still points at the item
            {
                if r ==> item
                {
                    r => null
                }
            }

            refs -= rfs # discard the item's back-reference list
        }
    }

Basically, syntax like:

    var foos = Foo[]

is just sugar for:

    var foos = DynamicArray(Foo)


You'll notice that the DynamicArray type has a reference 'refs' to a MemoryPool of type 'ref T[]'. This MemoryPool object would probably be shared across many (or all) DynamicArrays as an optimization, but for now just think of it as a simple dynamic array of arrays itself.
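
As a rough, runnable sketch of that idea (in Python rather than the hypothetical language above; the names Ref, BackRefArray, bind, and remove_at are made up for illustration, and the shared MemoryPool / slice-reference optimization isn't modelled), each slot just carries a list of the references bound to it:

    class Ref:
        # A nullable handle the container can clear when its target is removed.
        def __init__(self):
            self.target = None

        def assign(self, value):  # stands in for the '=>' operator
            self.target = value

    class BackRefArray:
        # Dynamic array that keeps, per slot, a list of the Refs bound to it.
        def __init__(self):
            self.items = []  # the stored values
            self.refs = []   # per-slot back-reference lists (the 'refs' pool)

        def append(self, value):
            self.items.append(value)
            self.refs.append([])

        def bind(self, index, ref):  # like 'a => nums[index]'
            ref.assign(self.items[index])
            self.refs[index].append(ref)

        def remove_at(self, index):  # like 'nums -= item': null every back-reference
            for r in self.refs[index]:
                r.target = None
            del self.items[index]
            del self.refs[index]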

The functions '=>' and '-=' are operators; they run whenever a ref variable is pointed at an item, and whenever an item is removed, respectively. To illustrate:

    var nums = int[] # DynamicArray(int)
    ref a, b : int

    func main
    {
        nums += 1 # allocate
        nums += 2 # allocate
        nums += 3 # allocate

        a => nums[2] # nums.=>(2, a)
        b => a       # nums.=>(2, b)

        each i in nums
        {
            if runtime_condition
            {
                nums -= i # nums.-=(i)
            }
        }
    }

So while this would search through a list of refs every time an item was removed, it wouldn't have to do any complex cycle detection or anything like that. The MemoryPools could be a universally managed heap of references that each DynamicArray only holds a slice-reference into, so allocating arrays would still be cheap.
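
For instance, continuing the hypothetical Python sketch from above, removing an item clears every outstanding reference to it, which is all the "collection" that's needed:

    nums = BackRefArray()
    for v in (1, 2, 3):
        nums.append(v)

    a = Ref()
    b = Ref()
    nums.bind(2, a)  # a => nums[2]
    nums.bind(2, b)  # b => a (also registered against slot 2)

    nums.remove_at(2)            # nums -= nums[2]
    print(a.target, b.target)    # None None -- no dangling references left behind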

That's more of what I meant by falling back to a "ref-counting" (really more like reverse-referencing) type of system in the cases where you need it. Combined with all the optimizations that could be applied to local variables, I think a system like this has a lot of potential as an alternative to a thread-locking GC. Still... I have this feeling that there's some obvious flaw in the design I just can't see yet...
