We will go with a TIM (Terracotta Integration Module). I just looked at the doc and yes, that would simplify our work a lot.
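For anyone curious, here is the rough shape of the lock-based stand-in for AtomicReference mentioned below. This is only an illustrative sketch with made-up names, not the actual code from our branch:

    // Illustrative sketch only -- class and method names are made up.
    // Plain monitor locks are something Terracotta can instrument and
    // cluster today, whereas AtomicReference's CAS machinery is not
    // instrumented by default yet.
    public class SharedReference<T> {
        private T value;

        public SharedReference(T initial) {
            this.value = initial;
        }

        public synchronized T get() {
            return value;
        }

        public synchronized void set(T newValue) {
            this.value = newValue;
        }

        // Emulates AtomicReference.compareAndSet under the monitor;
        // reference equality, just like the original.
        public synchronized boolean compareAndSet(T expect, T update) {
            if (value == expect) {
                value = update;
                return true;
            }
            return false;
        }
    }

In "local" mode the stock AtomicReference stays untouched; only "shared" mode would swap in something like this.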
Thank you,

Luc

On Sat, 2009-02-28 at 18:48 -0800, Nabib El-Rahman wrote:
> Hi guys,
>
> I work for Terracotta (on the server side) and find this work with
> Clojure + Terracotta very exciting. Writing a TIM is definitely the
> way to go; it's a place to hide the glue until Terracotta and Clojure
> catch up with each other. If you have any questions, feel free to
> post on our forums:
> http://forums.terracotta.org/forums/forums/list.page
>
> If you check out our trunk version, there's also an effort to make a
> common API that will make writing a TIM easier for you guys.
>
> Good luck!
>
> -Nabib
>
> On Sat, Feb 28, 2009 at 12:02 PM, Luc Prefontaine
> <lprefonta...@softaddicts.ca> wrote:
> > We think the same way. Our first implementation of an alternative
> > to AtomicReference is straightforward; we will look at improving it
> > if the need arises.
> >
> > It will be easier to do so once we get stats from Terracotta after
> > running some benchmarks. There's much to do before getting there.
> >
> > Luc
> >
> > On Sat, 2009-02-28 at 14:33 -0500, Paul Stadig wrote:
> > > In the Namespace case, it might be premature optimization to
> > > worry about AtomicReference being replaced. If there is a way to
> > > rewrite that code with, say, synchronized blocks, and it will
> > > work better with Terracotta, I think it would be worth doing. I
> > > don't think it would be normal usage to be updating the mappings
> > > and aliases in a namespace 1,000 times a second.
> > >
> > > AtomicReference is also used in Atom and Agent. Those cases may
> > > not be as straightforward.
> > >
> > > Paul
> > >
> > > On Sat, Feb 28, 2009 at 11:51 AM, Luc Prefontaine
> > > <lprefonta...@softaddicts.ca> wrote:
> > > > 1) AtomicReference is used in several places. Instead of
> > > > changing it, we think we can keep it when Clojure runs
> > > > "locally" and provide an alternative when running in "shared"
> > > > mode.
> > > >
> > > > AtomicReference is optimized to be efficient in a standalone
> > > > JVM, and we would like to keep it that way. Eventually
> > > > Terracotta will provide instrumentation for this class by
> > > > default, so the "shared" implementation could be thrown away in
> > > > the near future. We see the two implementations as a
> > > > transitional arrangement until Terracotta supports it directly.
> > > >
> > > > 2) Noted.
> > > >
> > > > Shared versus local mode:
> > > >
> > > > That's what we have in mind: getting Clojure to work in a
> > > > "shared" mode versus a local/standalone mode. We want zero
> > > > impact on user code. Eventually we could use metadata to
> > > > provide hints that would let us fine-tune shared interactions
> > > > from user code; this would not affect "local" mode behaviour.
> > > > We're not there yet, but we know the possibility exists, and
> > > > that's reassuring for the future.
> > > >
> > > > Integration is pretty simple once the common code base
> > > > integrates the necessary changes. We need a shell script, a
> > > > Terracotta configuration maintained as part of the Clojure code
> > > > base, and some documentation.
> > > >
> > > > As of now we use a system property to toggle the modes; we will
> > > > implement a transparent way (most probably testing for the
> > > > presence of a Terracotta property).
> > > >
> > > > Luc
> >
> > --
> > Luc Préfontaine
> > Off.: (514) 993-0320
> > Fax.: (514) 993-0325
> >
> > Armageddon was yesterday, today we have a real problem...
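Regarding the mode toggle quoted above, here is roughly what we are experimenting with. The property names are placeholders, not what we will actually ship, and "tc.install-root" is only our assumption about a property Terracotta's boot scripts define (to be verified against the Terracotta docs):

    // Placeholder sketch of transparent mode detection.
    public final class ClusterMode {
        private ClusterMode() {}

        public static boolean shared() {
            // Explicit override, e.g. -Dclojure.terracotta.shared=true
            if (Boolean.getBoolean("clojure.terracotta.shared")) {
                return true;
            }
            // Heuristic: look for a property that Terracotta itself sets
            // at boot. "tc.install-root" is one candidate (an assumption
            // on our part, not confirmed).
            return System.getProperty("tc.install-root") != null;
        }
    }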
--
Luc Préfontaine
Off.: (514) 993-0320
Fax.: (514) 993-0325

Armageddon was yesterday, today we have a real problem...