Putting it in cron doesn't guarantee it will always find something to
delete. It just means you now have to maintain a cron entry external
to your actual app.
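(For what it's worth, the cron entry itself is a one-liner. The path, schedule, and session class below are assumptions — this sketch presumes the Rails 2.3-era ActiveRecord session store:

```
# crontab entry: every night at 03:00, sweep sessions idle for 30+ days
0 3 * * * cd /path/to/app && script/runner -e production 'ActiveRecord::SessionStore::Session.delete_all(["updated_at < ?", 30.days.ago])'
```
)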
That's true. So just use the rufus-scheduler? ;-)
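(With the gem itself the call is something along the lines of `scheduler.every('24h') { ... }` — check the rufus-scheduler README for the current API. The core mechanism it wraps is just a background thread in the app process; a minimal sketch:

```ruby
# A miniature version of what an in-process scheduler such as
# rufus-scheduler does: a background thread that runs a block every
# `interval` seconds. The gem layers cron-style syntax, error
# handling, and drift correction on top of this.
def every(interval, &block)
  Thread.new do
    loop do
      sleep interval
      block.call
    end
  end
end

ticks = Queue.new
timer = every(0.05) { ticks << :tick }
sleep 0.18
timer.kill
puts ticks.size  # roughly 3 ticks at 0.05s intervals over 0.18s
```

The trade-off versus cron is the one discussed above: no external entry to maintain, but the sweep only happens while the app process is alive.)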
On Fri, Sep 18, 2009 at 4:45 PM, Peter De Berdt
peter.de.be...@pandora.be wrote:
No it won't. It has randomization code that causes it to not run
_most of the time_. This is exactly how session gc should be handled.
It will ramp up proportionally with traffic.
Actually that could be never or always. Relying on random numbers to
make decisions on whether to do something most of the time is a bad
idea.
yeah, my main point was that the method would be run for every
request. Probably not that many milliseconds in the grand scheme of
things, but why add any extra processing to your requests when you can
externalize it?
On Sep 21, 9:16 am, Greg Donald gdon...@gmail.com wrote:
On Mon, Sep 21, 2009 at 2:26 PM, sax s...@livinginthepast.org wrote:
yeah, my main point was that the method would be run for every
request.
Just like your before_filter for user authentication.
Probably not that many milliseconds in the grand scheme of
things, but why add any extra processing to your requests when you can
externalize it?
Thanks for the input guys. Yeah, I'm also not sure why I would want to run
the session removal on a random basis, so it seems like using script/
runner is the way I'm gonna go. I can't put it into the authorization
filter, fwiw, because the application doesn't have any authorization
layer. I'm gonna look into the
2009/9/18 Peter De Berdt peter.de.be...@pandora.be:
On 18 Sep 2009, at 21:01, Greg Donald wrote:
That will be processed for every request,
No it won't. It has randomization code that causes it to not run
_most of the time_. This is exactly how session gc should be handled.
It will ramp up proportionally with traffic.
On 20 Sep 2009, at 17:09, Colin Law wrote:
Actually that could be never or always. Relying on random numbers to
make decisions on whether to do something most of the time is a bad
idea.
Since quantum physics works entirely by probabilities (that is, random
numbers) and microprocessors
I feel like I'm missing a major point here. Assuming the table is correctly
range partitioned and indexed, most databases should be able to handle
relatively large table sizes. I agree that it is a best practice to archive
old, unused data, but that can likely be done on a monthly basis, or less
often.
On Thu, Sep 17, 2009 at 7:46 PM, TRBNGR david.scott.or...@gmail.com wrote:
Ok, I tried searching for the answer to this but couldn't find a
thread about it, so here goes...
I'm using sessions that I am storing in the database. Is there an
accepted Rails way of deleting the old sessions?
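(On the range partitioning suggested in the reply above: a hypothetical migration for MySQL, with assumed partition names and dates, and two real MySQL caveats noted in the comments:

```ruby
# Hypothetical migration partitioning the sessions table by month on
# MySQL. Caveats: MySQL requires every unique key (including the
# primary key) to contain the partition column, hence the composite
# key below; partitioned tables also cannot carry foreign keys.
class PartitionSessionsByMonth < ActiveRecord::Migration
  def self.up
    execute "ALTER TABLE sessions DROP PRIMARY KEY, " \
            "ADD PRIMARY KEY (id, updated_at)"
    execute <<-SQL
      ALTER TABLE sessions
      PARTITION BY RANGE (TO_DAYS(updated_at)) (
        PARTITION p200908 VALUES LESS THAN (TO_DAYS('2009-09-01')),
        PARTITION p200909 VALUES LESS THAN (TO_DAYS('2009-10-01')),
        PARTITION pmax    VALUES LESS THAN MAXVALUE
      )
    SQL
  end

  def self.down
    execute "ALTER TABLE sessions REMOVE PARTITIONING"
  end
end
```

Purging a month of dead sessions then becomes `ALTER TABLE sessions DROP PARTITION p200908`, which drops the data file instead of deleting row by row.)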
That will be processed for every request, which isn't really
necessary. It probably won't add THAT much overhead, but if you have a
high volume site you'd want to offload the session clearing into
something else.
* you could make a rake task that deletes sessions older than a
certain offset, and run it from a cron job
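(Such a rake task reduces to a one-line delete against the sessions table. A runnable sketch of the cutoff logic, using an in-memory stand-in for the table — the `Session` struct and the 30-day offset here are illustrative:

```ruby
# In-memory stand-in for rows in the `sessions` table; an
# ActiveRecord-backed session store keeps `session_id` and
# `updated_at` columns, among others.
Session = Struct.new(:session_id, :updated_at)

# Return only the sessions touched within the last `max_age` seconds.
# In a real rake task this whole method collapses to one SQL delete,
# roughly: Session.delete_all(["updated_at < ?", cutoff])
def sweep_stale_sessions(sessions, max_age, now = Time.now)
  cutoff = now - max_age
  sessions.reject { |s| s.updated_at < cutoff }
end

now = Time.now
sessions = [
  Session.new("a", now - 86_400 * 40), # 40 days old -> swept
  Session.new("b", now - 3_600),       # 1 hour old  -> kept
]
live = sweep_stale_sessions(sessions, 86_400 * 30, now)
puts live.map(&:session_id).inspect  # => ["b"]
```
)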
On Fri, Sep 18, 2009 at 11:01 AM, sax s...@livinginthepast.org wrote:
That will be processed for every request,
No it won't. It has randomization code that causes it to not run
_most of the time_. This is exactly how session gc should be handled.
It will ramp up proportionally with traffic.
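(What that randomization amounts to, assuming the common "run on roughly 1 in N requests" scheme — `ODDS = 100` is an illustrative value, not the exact figure any particular plugin used:

```ruby
# Probabilistic session GC: each request has a 1-in-ODDS chance of
# paying for the sweep, so total sweeps scale with traffic volume
# instead of running on a fixed clock.
ODDS = 100

def sweep_this_request?(rng = Random.new)
  rng.rand(ODDS).zero?  # true about 1% of the time
end

rng = Random.new(42)  # seeded only to make the demo reproducible
fires = 100_000.times.count { sweep_this_request?(rng) }
puts fires  # near 1_000: the sweep ran on ~1% of "requests"
```
)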