Why don't you just use wget in a loop to time the entire Rails
application?  Something like this will hit your server with 1000
requests and tell you how long they took.

$ time for (( cnt = 0 ; cnt < 1000 ; cnt++ )) ; do /usr/bin/wget -nd -q -O /dev/null http://localhost:3000/whateverpage ; done
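
That loop runs the requests one at a time, so it only measures serial
throughput.  If you have ApacheBench installed, something along these
lines (same URL as above) fires the same 1000 requests with 10 in
flight at once and works out requests/second for you:

$ ab -n 1000 -c 10 http://localhost:3000/whateverpage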

Just remember to run in Production mode if you want any sort of
realism.
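
If it's a Rails 2.x app (I'm assuming it is), that's just:

$ ruby script/server -e production

Otherwise set RAILS_ENV=production before starting the server however
you normally do.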

Brendon.

On Apr 22, 2:09 pm, SpringFlowers AutumnMoon <rails-mailing-
l...@andreas-s.net> wrote:
> Frederick Cheung wrote:
> > On 22 Apr 2009, at 21:39, SpringFlowers AutumnMoon wrote:
>
> >> how is my test flawed?
>
> > Exactly as I explained previously: you are benchmarking how long it
> > takes to render an erb template, but there's a lot of other bits of
> > overhead that will go into your overall requests/second.
>
> > Fred
>
> and that's why I said it is just a ballpark figure (just for the
> templating part).  For example, if my web pages are dynamic with very
> little processing, then maybe it can serve 1800 pages per second?  (I'll
> have to try when I get home.)  Also, maybe I can remove all the code,
> put some static content there, and see how long it takes to
> serve that content.
>
> --
> Posted via http://www.ruby-forum.com/.