Hey,

I'm developing a pipeline that takes as input XML files with a reasonably 
complex structure and considerable size: the nesting can run several layers 
deep, and the files range from 900 MB to 3 GB. I can't change that, since 
how they are produced is not under my control; I just need to read them.

Right now I have the pipeline working by mapping a nested group of structs 
onto the XML file, something like this:

(the code already works; this is just to serve as an example of how I'm doing it)
```
xmlFile, e := os.Open(f)
if e != nil {
    return e
}
defer xmlFile.Close()

// Read the entire file into memory in one go.
b, e := ioutil.ReadAll(xmlFile)
if e != nil {
    return e
}

var xmlHead XmlHeadStruct

// Decode the whole document into a single nested struct tree.
reader := bytes.NewReader(b)
decoder := xml.NewDecoder(reader)
decoder.CharsetReader = charset.NewReader

if e = decoder.Decode(&xmlHead); e != nil {
    return e
}
```

Now, everything works perfectly with one exception: the memory usage! A 
file of approximately 900 MB consumes about 4 GB of RAM, and one of 3.3 GB 
can consume almost 14 GB. Not cool!
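From what I understand, ioutil.ReadAll keeps the raw bytes in memory and 
Decode then builds the whole struct tree on top of that, so the document is 
effectively held twice. One direction I've been considering is streaming the 
decode with the token API instead of one big Decode call. Here's a minimal 
sketch of what I mean, assuming the file is mostly a long run of repeated 
<record> elements (the Record struct, the element name, and process are all 
made up for illustration):

```
package main

import (
    "encoding/xml"
    "fmt"
    "io"
    "os"
)

// Record is a placeholder for one repeated element in the real schema.
type Record struct {
    ID   string `xml:"id,attr"`
    Name string `xml:"name"`
}

func process(path string) error {
    f, err := os.Open(path)
    if err != nil {
        return err
    }
    defer f.Close()

    // Decode straight from the file: no ioutil.ReadAll, so the raw
    // bytes are never all in memory at once.
    decoder := xml.NewDecoder(f)
    // decoder.CharsetReader = charset.NewReader // as in my current code, if needed

    for {
        tok, err := decoder.Token()
        if err == io.EOF {
            break
        }
        if err != nil {
            return err
        }
        // Unmarshal one <record> element at a time; only that
        // element's subtree is live during each iteration.
        if se, ok := tok.(xml.StartElement); ok && se.Name.Local == "record" {
            var r Record
            if err := decoder.DecodeElement(&r, &se); err != nil {
                return err
            }
            fmt.Println(r.ID, r.Name) // stand-in for the real per-record work
        }
    }
    return nil
}

func main() {
    if err := process("big.xml"); err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
}
```

With DecodeElement, only one record's subtree should be live at a time, so 
peak memory would track the largest single element rather than the whole 
file. I haven't measured whether that actually holds for my schema, though.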

Do you have any suggestions on how to work around this?

Thanks
