Hey Cam,

About a year ago, ActiveRecord::Batches was introduced to avoid that exact problem. You can use either find_each() or find_in_batches() to iterate over large collections without loading the entire set of records into memory. Both methods default to batches of 1000 records, but the size can be configured with the :batch_size option.
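
A rough sketch of both, assuming a User model and an arbitrary batch size of 500 (purely for illustration):

  # find_each yields one record at a time, fetching them from the
  # database in batches behind the scenes.
  User.find_each(:batch_size => 500) do |user|
    # work on a single user here
  end

  # find_in_batches yields each batch as an array instead.
  User.find_in_batches(:batch_size => 500) do |users|
    # work on up to 500 users here
  end

Either way, memory use stays bounded to one batch's worth of ActiveRecord objects at a time.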


Cheers,

Nathan



On 22/04/2010, at 11:31 AM, Cameron Barrie wrote:
One simple thing to ensure is that you're not doing a
Class.find(:all)

If you've got even a semi-large dataset, you'll hose your memory footprint by loading up thousands of ActiveRecord objects. I know, it's a mistake I've made once. User.find(:all) in a Facebook app was a stupid thing for me to do, and I've never done it again, or anything like it...
You should limit those finds to a certain number.
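
Something along these lines (a rough sketch; the limit of 100 and the column list are arbitrary choices for illustration):

  # Cap how many records come back.
  users = User.find(:all, :limit => 100)

  # Selecting only the columns you need also keeps each object small.
  users = User.find(:all, :select => "id, name", :limit => 100)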

There are tons of other things to look for, but this is a simple starting point.

-Cam


On 22/04/2010, at 11:18 AM, Joshua Partogi wrote:
Hi all,


Currently I am running a Rails application online and it is running dog
slow. The current memory that I have on the VPS right now is 360MB,
which I thought was more than enough. How much would you recommend,
based on your experience, I should allocate for a Rails app? Is there
anything else I should look out for to make the application perform
much better? I know that this is quite an open-ended question, but as
I am still new to this, any suggestion is most welcome.


Thank you very much for the insights all. Really appreciate it.


Kind regards,
Joshua

--
http://twitter.com/scrum8
