Agreed, I'm not sure my suggestion will help you. You could have a look
at turning off transactions; that might make your code faster, but I'm
really not sure.
(Check out these:
from django.db import transaction
@transaction.commit_manually
)
- Benjamin
On Thu, Aug 20, 2009 at 9:58 PM, Doug wrote:
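Benjamin's tip above is about batching commits: by default each `save()` runs its own INSERT and COMMIT, and the per-commit disk sync is what makes bulk loads slow. Here is a minimal sketch of that idea using sqlite3 from the Python standard library (the `item` table and row data are invented for illustration; in the Django of that era the analogous move was wrapping the loop with `@transaction.commit_manually` and calling `transaction.commit()` once at the end):

```python
import sqlite3

def load_rows_per_commit(conn, rows):
    # Slow pattern: one INSERT and one COMMIT per row, which is
    # roughly what calling .save() in a loop does by default.
    cur = conn.cursor()
    for name, value in rows:
        cur.execute("INSERT INTO item (name, value) VALUES (?, ?)", (name, value))
        conn.commit()  # a commit (disk sync) per row is the expensive part

def load_rows_one_commit(conn, rows):
    # Faster pattern: all INSERTs inside a single transaction, one COMMIT.
    cur = conn.cursor()
    cur.executemany("INSERT INTO item (name, value) VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (name TEXT, value INTEGER)")
rows = [("row%d" % i, i) for i in range(1000)]
load_rows_one_commit(conn, rows)
count = conn.execute("SELECT COUNT(*) FROM item").fetchone()[0]
```

The same batching applies to any backend: the win comes from paying for one commit instead of one per row, not from anything sqlite-specific.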
On Thu, Aug 20, 2009 at 10:40 PM, Benjamin Sergeant wrote:
>
> (the answer was already sent, raw SQL, anyway)
>
> (with postgresql) I would try to load just a small bit of data with
> the slow create-django-object / save-it process, then do a pg_dump,
> and look at the SQL that pg_dump generated.
(small update: if you try what I just suggested and use postgres (lots
of ifs), you want to generate SQL and load it with psql, not pg_restore)
- Benjamin
On Thu, Aug 20, 2009 at 7:40 PM, Benjamin Sergeant wrote:
> (the answer was already sent, raw SQL, anyway)
(the answer was already sent, raw SQL, anyway)
(with postgresql) I would try to load just a small bit of data with
the slow create-django-object / save-it process, then do a pg_dump,
and look at the SQL that pg_dump generated. Then code a Python
script that generates that.
And then do a pg_restore of it.
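The suggestion above (pg_dump a small sample loaded the slow way, then write a script that emits the same kind of SQL) could be sketched like this. The function below produces a COPY block in the plain-text format pg_dump emits by default; the table and column names are invented for the example, and real data would additionally need escaping of tabs, newlines, and NULLs:

```python
def rows_to_copy_sql(table, columns, rows):
    # Emit a COPY block like the ones in a plain-text pg_dump:
    # a COPY ... FROM stdin; header, tab-separated rows, and a
    # terminating "\." line. Names here are illustrative only.
    lines = ["COPY %s (%s) FROM stdin;" % (table, ", ".join(columns))]
    for row in rows:
        lines.append("\t".join(str(v) for v in row))
    lines.append("\\.")  # end-of-data marker used by COPY
    return "\n".join(lines) + "\n"

sql = rows_to_copy_sql("myapp_item", ["name", "value"], [("a", 1), ("b", 2)])
```

The resulting file would then be loaded with something like `psql -f dump.sql mydb`, which matches the follow-up advice in the thread to use psql rather than pg_restore (pg_restore only reads pg_dump's custom/archive formats, not plain SQL).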
On Aug 20, 2:50 pm, Alex Gaynor wrote:
> On Thu, Aug 20, 2009 at 1:46 PM, Abiel wrote:
>
> > Is there an efficient way to use Django models to load a large number
> > of records into a database without falling back on raw SQL? Creating a
> > large number of model objects and then saving each one individually is
> > very slow.
Thanks for the feedback, I will go ahead and use raw SQL. Hopefully
bulk inserts will be a feature in a future version of Django,
especially given the rising importance of sites oriented towards large
volumes of raw data.
Abiel
On Aug 20, 2:50 pm, Alex Gaynor wrote:
> On Thu, Aug 20, 2009 at 1:46 PM, Abiel wrote:
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Abiel declared:
> Is there an efficient way to use Django models to load a large number
> of records into a database without falling back on raw SQL? Creating a
> large number of model objects and then saving each one individually is
> very slow (I imagine Django is running an INSERT and a COMMIT each time?).
On Thu, Aug 20, 2009 at 1:46 PM, Abiel wrote:
>
> Is there an efficient way to use Django models to load a large number
> of records into a database without falling back on raw SQL? Creating a
> large number of model objects and then saving each one individually is
> very slow (I imagine Django is running an INSERT and a COMMIT each time?).
Is there an efficient way to use Django models to load a large number
of records into a database without falling back on raw SQL? Creating a
large number of model objects and then saving each one individually is
very slow (I imagine Django is running an INSERT and a COMMIT each time?).
Thanks very much.