On 9/20/05, Robby Russell <[EMAIL PROTECTED]> wrote:
> [...]
> I started running the feeder on the dev site and noticed that as it
> finished sleeping for 300 seconds, it would appear to delete the entries
> in the db that were already there and then redownload them again. Is
> this the way you intended it to work?

Yeah, the feeder is a pretty bad hack -- rather than muck about
with anything complicated to determine whether the entries in a feed
were new or old, it was easier to just trash everything and refresh it
on each fetch.
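
For reference, the loop amounts to roughly this (FEEDS and the Entry
model are placeholder names here, not necessarily what's actually in
the repo):

  require 'open-uri'
  require 'simple-rss'

  FEEDS = %w( http://example.com/feed.xml )   # placeholder feed list

  loop do
    Entry.delete_all                  # trash everything that's there...
    FEEDS.each do |url|
      rss = SimpleRSS.parse(open(url))
      rss.items.each do |item|        # ...and re-create it from the feed
        Entry.create(:title => item.title, :link => item.link)
      end
    end
    sleep 300
  end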

I'll probably spend a little time working on a better algorithm, if
for no other reason than to avoid churning through row ids so quickly.
In the meantime, it'd probably be safe to increase the loop delay to a
much longer value, like 30-60 minutes.
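
Something along these lines is probably what I'm picturing -- keying
on the item link so existing rows get updated in place instead of
being deleted and re-inserted (again, the model and column names are
just placeholders):

  # Same placeholder FEEDS list and Entry model as above.
  loop do
    FEEDS.each do |url|
      rss = SimpleRSS.parse(open(url))
      rss.items.each do |item|
        # Re-use the existing row for this link if we already have one,
        # so row ids stay stable across fetches.
        entry = Entry.find(:first, :conditions => ['link = ?', item.link])
        entry ||= Entry.new(:link => item.link)
        entry.title = item.title
        entry.save
      end
    end
    sleep 45 * 60   # somewhere in the 30-60 minute range
  end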

> It also doesn't seem to be getting the modified date from the RSS feeds
> right, so modified is always set to Time.now.

I'll check out the SimpleRSS docs and see if I can get things working
a bit more reliably. If I commit a better feeder script, will it get
pushed to the dev site?
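
On the dates: if I'm reading SimpleRSS right, RSS 2.0 items expose
their timestamp as pubDate (Atom items use updated), so the fix is
probably just to prefer that and only fall back to Time.now when the
feed doesn't supply one. Untested, with a made-up feed URL:

  require 'open-uri'
  require 'simple-rss'

  rss = SimpleRSS.parse(open('http://example.com/feed.xml'))
  rss.items.each do |item|
    # Use the timestamp carried by the item; Time.now is a last resort.
    modified = item.pubDate || Time.now   # item.updated for Atom feeds
    puts "#{item.title}: #{modified}"
  end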

> I also re-added the feeder file with chmod +x script/feeder so that it
> will be executable by default. :-) (couldn't figure out how to do it
> otherwise)

That works; I had just been running it with the ruby bin as the first argument.

> -Robby

Thanks for the feedback,

Lennon
_______________________________________________
PdxRuby-dev mailing list
[email protected]
http://lists.pdxruby.org/mailman/listinfo/pdxruby-dev
