In my humble opinion, if this is something that can be farmed out to
cron, I wouldn't even involve Django/Satchmo at all. Write an
external Python script (or use some other language if you're more
efficient in it) and query the database directly.
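A minimal sketch of what such a script could look like, assuming the feed only needs a few columns. sqlite3 stands in here for the real Postgres connection (you'd use psycopg2 in practice), and the table and column names are hypothetical, not the actual Satchmo schema:

```python
import csv
import sqlite3


def write_feed(conn, path):
    """Dump active products to a tab-separated feed with one SQL query.

    The query and column names are illustrative -- adapt them to the
    real schema before using this.
    """
    rows = conn.execute(
        "SELECT p.slug, p.name, p.unit_price, c.name "
        "FROM product p JOIN category c ON p.category_id = c.id "
        "WHERE p.active = 1"
    )
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh, delimiter="\t")
        writer.writerow(["id", "title", "price", "category"])
        writer.writerows(rows)


if __name__ == "__main__":
    # Demo with an in-memory database; a cron job would connect to Postgres.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE category (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE product (id INTEGER PRIMARY KEY, slug TEXT, name TEXT,
                              unit_price REAL, active INTEGER, category_id INTEGER);
        INSERT INTO category VALUES (1, 'Books');
        INSERT INTO product VALUES (1, 'dj-book', 'Django Book', 19.99, 1, 1);
    """)
    write_feed(conn, "feed.tsv")
```

Run it from crontab on whatever schedule Google Base needs; the web server then just serves the static file.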

Ryan

On Mon, Dec 13, 2010 at 4:59 AM, Jeff Cook <[email protected]> wrote:
> On Sun, Dec 12, 2010 at 11:04 PM, Jeff Cook <[email protected]> 
> wrote:
>> 8,000 products doesn't seem like enough to make it choke. Is there
>> some low-hanging fruit for optimization there?
>
> I looked into this some, and it seems that even if I modify views.py
> to fetch the QuerySet with select_related(), which recurses down and
> fetches all marked relationships, things still insist on making
> individual queries. I presume this is because the template calls
> methods on the model like product.get_category(), which performs a
> database query (and several other methods do the same), when
> theoretically, assuming the FK relationships are correct and that
> select_related() works correctly, all of this information should
> already be attached to the product from the select_related() call.
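What the quoted paragraph describes is the classic N+1 query pattern, and select_related() fixes it by emitting one JOIN instead of one query per row. A stdlib sketch of the difference, with sqlite3 standing in for Postgres and a made-up two-table schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE category (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, category_id INTEGER);
    INSERT INTO category VALUES (1, 'Books'), (2, 'Music');
    INSERT INTO product VALUES (1, 'A', 1), (2, 'B', 2), (3, 'C', 1);
""")

# N+1 pattern: one query for the products, then one more per product for
# its category -- this is what calling product.get_category() inside a
# template loop amounts to.
queries = 0
products = conn.execute("SELECT id, name, category_id FROM product").fetchall()
queries += 1
for _pid, _name, cat_id in products:
    conn.execute("SELECT name FROM category WHERE id = ?", (cat_id,)).fetchone()
    queries += 1
print(queries)  # 4 queries for 3 products; 8001 for 8000 products

# JOIN pattern (what select_related() emits): one round trip total.
joined = conn.execute(
    "SELECT p.name, c.name FROM product p "
    "JOIN category c ON p.category_id = c.id"
).fetchall()
print(len(joined))  # 3 rows from a single query
```

The catch, as the thread notes, is that select_related() only helps if the code consuming the objects reads the prefetched attributes instead of running its own queries.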
>
> Is there a way to update these functions so that they don't force a
> query every time? It's really absurd that things can be this slow
> for just 8,000 products; Postgres and Python should cut through those
> like butter, taking maybe 10 seconds at most. I can't set a timeout
> high enough for the task to finish as it stands.
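If patching those model methods is on the table, one low-effort option is to memoize the result per instance, so a template that calls the method repeatedly pays for at most one query per object. The get_category name comes from the thread; everything else below is an illustrative stand-in, not Satchmo code:

```python
class Product:
    """Stand-in for the model; _fetch_category simulates the DB hit."""

    db_hits = 0  # class-wide counter so we can see how many queries ran

    def get_category(self):
        # Cache on the instance so repeated template calls reuse the
        # first call's result instead of hitting the database again.
        if not hasattr(self, "_category_cache"):
            self._category_cache = self._fetch_category()
        return self._category_cache

    def _fetch_category(self):
        Product.db_hits += 1  # pretend this ran a SELECT
        return "Books"


p = Product()
p.get_category()
p.get_category()
print(Product.db_hits)  # 1: the second call was served from the cache
```

On Python 2.6-era Django this is the manual version of what functools.cached_property does in modern Python; either way the database sees one query per object instead of one per template access.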
>
> The database side of things really needs some help. I might end up
> just rewriting this functionality with hand-written queries so that
> the Google Base feed is usable for me. It's silly that it has to go
> through the template (which I'm sure is slow anyway), ask the
> database for something several times on each iteration, wait for a
> response, put that in the template, and so on, even after it's been
> fed a large queryset of all active products. At that rate it
> probably takes 0.5 sec per item just waiting for round trips to and
> from the database, which is ridiculous.
>
> One large query can gather it all in under a second in my case, and
> then it's just a matter of Python processing it. Even if the query
> took a long time to complete, it should be possible to save its
> results somewhere, set up a cronjob, and then use that to generate
> this feed.
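That cache-and-regenerate idea is cheap to sketch: run the big query on a schedule, write the rendered feed to disk, and serve the file until it goes stale. The helpers below are hypothetical (the path, the max age, and the render() callback are all made up for illustration):

```python
import os
import time

FEED_PATH = "/tmp/googlebase_feed.tsv"   # hypothetical location
MAX_AGE = 60 * 60                        # regenerate at most hourly


def feed_is_fresh(path=FEED_PATH, max_age=MAX_AGE):
    """True if the cached feed exists and is younger than max_age seconds."""
    if not os.path.exists(path):
        return False
    return time.time() - os.path.getmtime(path) < max_age


def get_feed(render, path=FEED_PATH):
    """Serve the cached feed, regenerating with render() only when stale.

    render() stands in for the expensive one-big-query feed builder.
    """
    if not feed_is_fresh(path):
        with open(path, "w") as fh:
            fh.write(render())
    with open(path) as fh:
        return fh.read()
```

A cron entry can simply call get_feed() (or delete the file to force a rebuild), and every request in between reads the cached copy instead of touching the database.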
>
> --
> You received this message because you are subscribed to the Google Groups 
> "Satchmo users" group.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to 
> [email protected].
> For more options, visit this group at 
> http://groups.google.com/group/satchmo-users?hl=en.
>
>



-- 
http://www.sudovi.com/
http://www.twitter.com/lifewithryan
http://www.thecommontongue.com
http://www.lifewithryan.com/

