php-general Digest 11 Sep 2011 00:16:35 -0000 Issue 7473

Topics (messages 314768 through 314771):

Re: PHP cron job optimization
        314768 by: Sean Greenslade
        314769 by: Gary Golden
        314770 by: MUAD SHIBANI
        314771 by: Eric Butera

Administrivia:

To subscribe to the digest, e-mail:
        php-general-digest-subscr...@lists.php.net

To unsubscribe from the digest, e-mail:
        php-general-digest-unsubscr...@lists.php.net

To post to the list, e-mail:
        php-gene...@lists.php.net


----------------------------------------------------------------------
--- Begin Message ---
On Sat, Sep 10, 2011 at 4:35 AM, muad shibani <muad.shib...@gmail.com> wrote:

> I want to design an application that reads news from RSS sources.
> I have about 1,000 RSS feeds to collect from.
>
> I will also use cron jobs every 15 minutes to collect the data.
> The question is: is there a clever way to collect all those feed items
> without exhausting the server?
> Any ideas? Thank you in advance.
>

Do them one at a time. Fetching web pages isn't a particularly taxing job
for any decent server.

My advice is just to try it. Even if your solution isn't 'elegant' or
'clever,' if it works and doesn't bog down the server, that's a win in my
book.
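
For illustration, a minimal one-at-a-time loop might look like this (the
feed list and the store_item() helper are placeholder inventions, not
anything from the original post):

<?php
// Sketch of the sequential approach. $feedUrls and store_item() are
// hypothetical placeholders; error handling is deliberately minimal.
$feedUrls = array(
    'http://example.com/feed-one.rss',
    'http://example.com/feed-two.rss',
    // ... up to ~1000 entries, e.g. loaded from a database
);

foreach ($feedUrls as $url) {
    $xml = @file_get_contents($url);   // blocking fetch, one feed at a time
    if ($xml === false) {
        continue;                      // skip failures; retry on the next cron run
    }
    $feed = @simplexml_load_string($xml);
    if ($feed === false) {
        continue;                      // skip malformed XML
    }
    foreach ($feed->channel->item as $item) {
        store_item((string) $item->title, (string) $item->link);
    }
    unset($xml, $feed);                // free memory before the next feed
}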

-- 
--Zootboy

Sent from my PC.

--- End Message ---
--- Begin Message ---
> I want to design an application that reads news from RSS sources.
>> I have about 1,000 RSS feeds to collect from.
>>
>> I will also use cron jobs every 15 minutes to collect the data.
>> The question is: is there a clever way to collect all those feed items
>> without exhausting the server?
>> Any ideas? Thank you in advance.
Just watch your memory and you'll be fine.
As was stated, fetching an RSS feed is fast and cheap, so contrary to the
advice above, I would think about parallelizing if I were you.
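
For example, a long-running collector can watch its own footprint between
feeds (a sketch; the 128 MB threshold is an arbitrary example value):

// Sketch: check memory between feeds and stop early rather than crash.
if (memory_get_usage(true) > 128 * 1024 * 1024) {
    error_log('Feed collector close to its memory limit; stopping early.');
    exit(0); // the next cron run picks up the remaining feeds
}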



--- End Message ---
--- Begin Message ---
thanks a lot after I test it I will share the code .. Thanks again

On Sat, Sep 10, 2011 at 11:20 AM, Gary Golden <m...@garygolden.me> wrote:

> > I want to design an application that reads news from RSS sources.
> >> I have about 1,000 RSS feeds to collect from.
> >>
> >> I will also use cron jobs every 15 minutes to collect the data.
> >> The question is: is there a clever way to collect all those feed items
> >> without exhausting the server?
> >> Any ideas? Thank you in advance.
> Just watch your memory and you'll be fine.
> As was stated, fetching an RSS feed is fast and cheap, so contrary to the
> advice above, I would think about parallelizing if I were you.
>
>


-- 
Muad Shibani
Aden, Yemen
Mobile: 00967 733045678
Mobile: 00967 711811232

www.muadshibani.com

--- End Message ---
--- Begin Message ---
On Sat, Sep 10, 2011 at 1:47 PM, Sean Greenslade <zootboys...@gmail.com> wrote:
> On Sat, Sep 10, 2011 at 4:35 AM, muad shibani <muad.shib...@gmail.com> wrote:
>
>> I want to design an application that reads news from RSS sources.
>> I have about 1,000 RSS feeds to collect from.
>>
>> I will also use cron jobs every 15 minutes to collect the data.
>> The question is: is there a clever way to collect all those feed items
>> without exhausting the server?
>> Any ideas? Thank you in advance.
>>
>
> Do them one at a time. Fetching web pages isn't a particularly taxing job
> for any decent server.
>
> My advice is just to try it. Even if your solution isn't 'elegant' or
> 'clever,' if it works and doesn't bog down the server, that's a win in my
> book.
>
> --
> --Zootboy
>
> Sent from my PC.
>

Traversing the internet is one of the most expensive I/O operations
that can be performed. The OP should try to split the work into
several processes and use async connections via a mechanism like
curl_multi [1]. Doing 1,000 connections one at a time is abysmal if
you're blocked for even 3 seconds per connection.
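
A rough sketch of that batched curl_multi approach (the batch size,
timeout, and URL handling are illustrative choices, not prescriptive):

<?php
// Sketch: fetch feeds in parallel batches with curl_multi.
// Returns an array of url => raw response body ('' on failure).
function fetch_feeds_parallel(array $urls, $batchSize = 50)
{
    $results = array();
    foreach (array_chunk($urls, $batchSize) as $batch) {
        $mh = curl_multi_init();
        $handles = array();
        foreach ($batch as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10); // cap time lost to slow hosts
            curl_multi_add_handle($mh, $ch);
            $handles[$url] = $ch;
        }
        // Drive all transfers in this batch to completion.
        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh);
        } while ($running > 0);
        foreach ($handles as $url => $ch) {
            $results[$url] = curl_multi_getcontent($ch); // raw feed XML
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }
    return $results;
}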

I'd make this a two-step process: one cron job constantly
fetches/updates feeds into a store, and a second process constantly
checks that store for new items.

[1] http://us3.php.net/manual/en/function.curl-multi-add-handle.php
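
In crontab terms, the split described above could look something like this
(the script paths and schedules are placeholders):

# Sketch: two independent cron jobs with placeholder paths/schedules.
# Job 1: pull raw feeds into the store (database, files, etc.).
*/15 * * * * /usr/bin/php /path/to/fetch_feeds.php
# Job 2: scan the store for newly fetched items and process them.
*/5 * * * * /usr/bin/php /path/to/process_new_items.php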

--- End Message ---
