I'm implementing a value encoder similar to gob. While benchmarking my 
code against gob, I noticed some unexpected memory allocations with a 
particular kind of input data.

See this minimal code example on the Go Playground: 
https://go.dev/play/p/4rm-kCtD274

There is a simple function foo() that receives a reflect.Value wrapping a 
[]uint. It calls the Interface() method to recover the concrete []uint so 
it can use it.

When the reflect.Value was obtained directly from a []uint, the 
Interface() method doesn't allocate. 

When it was obtained as an element of a [][]uint (a matrix), the 
Interface() method allocates the slice header on the heap.

I was wondering whether this is just a limitation of the optimizer, or 
whether there is a constraint that imposes the allocation.

In my code, and in gob's code (see the helper functions in 
enc_helpers.go), the slice returned by Interface() is only used locally.


-- 
You received this message because you are subscribed to the Google Groups 
"golang-nuts" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/golang-nuts/526f41b9-c4dc-41ea-a86e-8f665c5b18b0n%40googlegroups.com.