> In my personal experience I cannot get within 10X the throughput, or
> latency, of mutable data models when using persistent data models.
>

Hi Martin,
Thanks for finding this thread :-). Let me ask the reverse question. Given
that you come from a persistent data model, where the code remains
reasonably simple: how much effort does it really take to make an
imperative model work well, with a relatively low number of defects? How
deep do you go to make sure that your data structures fit the CPU
architecture, in terms of topology as well as cache sizes? And how does it
scale in terms of writing the code itself (I mean, are code alterations
easy and straightforward, or do you have to rewrite from scratch)?
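To make the contrast concrete, here is a toy Java sketch (my illustration, not anything from your code) of the two update styles: the mutable model updates in place, so every holder of the structure sees the change, while the persistent model returns a new version and leaves the old one untouched. Note the copy-on-write here is deliberately naive; real persistent structures like Clojure's vectors share structure instead of copying, but the contract is the same.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class UpdateStyles {
    // Mutable model: update in place; all references observe the change.
    static void mutableUpdate(List<Integer> xs, int i, int v) {
        xs.set(i, v);
    }

    // Persistent model (naive copy-on-write sketch): return a new version
    // and leave the original intact. Clojure's persistent vectors achieve
    // this with structural sharing rather than a full copy.
    static List<Integer> persistentUpdate(List<Integer> xs, int i, int v) {
        List<Integer> copy = new ArrayList<>(xs);
        copy.set(i, v);
        return Collections.unmodifiableList(copy);
    }

    public static void main(String[] args) {
        List<Integer> a = new ArrayList<>(List.of(1, 2, 3));
        mutableUpdate(a, 0, 99);
        System.out.println(a);   // [99, 2, 3]

        List<Integer> b = List.of(1, 2, 3);
        List<Integer> b2 = persistentUpdate(b, 0, 99);
        System.out.println(b);   // [1, 2, 3]  (old version unchanged)
        System.out.println(b2);  // [99, 2, 3]
    }
}
```

The full copy is exactly where the raw throughput gap can come from; structural sharing narrows it, which is what my questions above are trying to weigh against the defect and maintenance cost of the in-place style.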

I do not mean to troll, just sincere curiosity ...

Andy

-- 
You received this message because you are subscribed to the Google
Groups "Clojure" group.
To post to this group, send email to clojure@googlegroups.com
Note that posts from new members are moderated - please be patient with your 
first post.
To unsubscribe from this group, send email to
clojure+unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/clojure?hl=en