It may have more to do with the difference between `for` and `map`. How do these
versions compare in your benchmark:

(defn read-to-maps-partial [rows]
  ;; headers: keywordize the first row, up to the first blank cell
  (let [headers (->>
                  rows
                  first
                  (take-while (complement #{""}))
                  (map keyword))]
    (map (partial zipmap headers) (rest rows))))

(defn read-to-maps-fn [rows]
  (let [headers (->>
                  rows
                  first
                  (take-while (complement #{""}))
                  (map keyword))
        ;; close over headers with a plain fn instead of partial
        mapper (fn [row] (zipmap headers row))]
    (map mapper (rest rows))))
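For a quick sanity check, both versions above should agree on a small sample
(hypothetical data, just for illustration):

```clojure
;; hypothetical sample: header row with a trailing blank cell, then data rows
(def sample-rows
  [["name" "age" "" "ignored"]
   ["Rich" "42"]
   ["Stu" "35"]])

;; both should yield ({:name "Rich", :age "42"} {:name "Stu", :age "35"})
(= (read-to-maps-partial sample-rows)
   (read-to-maps-fn sample-rows))
;; => true
```

If you're timing these, criterium's `bench` will give you much more stable
numbers than `time` -- and remember to `doall` the result, since `map` is lazy
and otherwise you'd mostly be measuring seq construction, not the work.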

Sean

On Oct 10, 2014, at 11:42 AM, Michael Blume <blume.m...@gmail.com> wrote:
> So I'm reading a bunch of rows from a huge csv file and marshalling those 
> rows into maps using the first row as keys. I wrote the function two ways: 
> https://gist.github.com/MichaelBlume/c67d22df0ff9c225d956 and the version 
> with eval is twice as fast and I'm kind of curious about why. Presumably the 
> eval'd function still implicitly contains a list of keys, it's still 
> implicitly treating each row as a seq and walking it, so I'm wondering what 
> the seq-destructuring and the map literal are doing under the hood that's 
> faster.
