ct" (which allows to access the appropriate table) and
"book_id" (which allows to access the table specified in the book
"book_subject").
Is a good solution to avoid to query a table with 4 milions of rows ?
Is there an alternative solution ?
Thanks ^__^
--
Carl Zmo
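[Editor's note: the dispatcher scheme asked about above (a small table whose rows name a subject table plus a row id inside it) can be sketched with the stdlib sqlite3 module. All table and column contents here are illustrative, not from the thread:]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One small table per subject, plus a dispatcher table whose rows hold the
# subject table's name ("book_subject") and the row id inside it ("book_id").
cur.execute("CREATE TABLE books_fantasy (id INTEGER PRIMARY KEY, title TEXT)")
cur.execute("CREATE TABLE books_horror  (id INTEGER PRIMARY KEY, title TEXT)")
cur.execute("CREATE TABLE book_index (book_subject TEXT, book_id INTEGER)")

cur.execute("INSERT INTO books_fantasy (title) VALUES ('The Hobbit')")
cur.execute("INSERT INTO book_index VALUES ('books_fantasy', 1)")

def fetch_title(subject_table, book_id):
    # The subject name comes from our own dispatcher table, but it is still
    # interpolated into SQL, so whitelist it against the known tables.
    assert subject_table in ("books_fantasy", "books_horror")
    cur.execute(f"SELECT title FROM {subject_table} WHERE id = ?", (book_id,))
    row = cur.fetchone()
    return row[0] if row else None

subject, book_id = cur.execute("SELECT * FROM book_index").fetchone()
print(fetch_title(subject, book_id))  # The Hobbit
```

Every lookup costs two queries instead of one, which is part of why the replies below question whether the split is needed at all.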
Hi Nameless,
On Jan 12, 9:20 pm, nameless wrote:
> The table is queried from ajax using an autocomplete field with this
> query in the views.py:
>
> books.objects.filter(book_title__istartswith=request.GET['q'])[:100]
This should not be a problem in MySQL either if you are using an index
for the
> > DB ORDER
> > BY) it will slow you down considerably. If you're selecting one row from the
> > database by an indexed column, then the performance will be very fast and
> > there's no need to prematurely optimize.
> >
> > On Tue, Jan 12, 2010 at 2:25 PM, nameless wrote:
> > > My table with 4 million rows is queried often by ajax.
> > > So I think there are performance problems (I am also using an index).
> > > Ok, now take a look at the contenttypes :)
ll be very fast and
there's no need to prematurely optimize.
On Tue, Jan 12, 2010 at 2:25 PM, nameless wrote:
> My table with 4 million rows is queried often by ajax.
> So I think there are performance problems (I am also using an index).
> Ok, now take a look at the contenttypes :)
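[Editor's note: the point above, that a prefix lookup on an indexed column stays fast, can be sketched with the stdlib sqlite3 module. In Django the index would typically be declared with `db_index=True` on the `book_title` field; the data here is illustrative:]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE books (id INTEGER PRIMARY KEY, book_title TEXT)")
# The index that keeps prefix (istartswith-style) lookups cheap even on a
# large table; in Django: book_title = models.CharField(..., db_index=True).
cur.execute("CREATE INDEX idx_book_title ON books (book_title)")
cur.executemany(
    "INSERT INTO books (book_title) VALUES (?)",
    [("Dune",), ("Dracula",), ("Emma",)],
)

def autocomplete(prefix, limit=100):
    # Equivalent of books.objects.filter(book_title__istartswith=prefix)[:100]
    cur.execute(
        "SELECT book_title FROM books WHERE book_title LIKE ?"
        " ORDER BY book_title LIMIT ?",
        (prefix + "%", limit),
    )
    return [row[0] for row in cur.fetchall()]

print(autocomplete("D"))  # ['Dracula', 'Dune']
```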
On Jan 12, 2010, at 8:25 PM, nameless wrote:
> My table with 4 million rows is queried often by ajax.
> So I think there are performance problems (I am also using an index).
> Ok, now take a look at the contenttypes :)
I don't know what overhead Django will put on the queries, but
Cache the contents of the table, maybe using memcached, and only go to the DB
if needed.
Victor Lima
2010/1/12 nameless
> My table with 4 million rows is queried often by ajax.
> So I think there are performance problems (I am also using an index).
> Ok, now take a look at the con
My table with 4 million rows is queried often by ajax.
So I think there are performance problems (I am also using an index).
Ok, now take a look at the contenttypes :)
On Jan 12, 8:15 pm, Tim wrote:
> As far as needing to split up the table into other tables, I'm not
> sure.
As far as needing to split up the table into other tables, I'm not
sure. What's wrong with having 4 million records in one table?
But if you're going to do it, take a look at the contenttypes
framework and generic relations. It basically does what you're
describing:
http://docs.djangoproject.com/e
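[Editor's note: the core idea behind the contenttypes framework mentioned above can be shown in plain Python. A generic reference is a (model label, primary key) pair rather than a foreign key into one fixed table; the labels and data below are illustrative, not Django API:]

```python
# Two "tables" keyed by primary key.
fantasy_books = {1: "The Hobbit"}
horror_books = {1: "Dracula"}

# Maps a model label to its table, much as django.contrib.contenttypes
# maps a ContentType row to a model class.
content_types = {"fantasy": fantasy_books, "horror": horror_books}

def resolve(content_type, object_id):
    # The generic-relation lookup: first pick the table, then the row.
    return content_types[content_type][object_id]

print(resolve("horror", 1))  # Dracula
```

This is the same two-step lookup as the "book_subject"/"book_id" scheme described earlier in the thread, which is why generic relations were suggested as the ready-made version of it.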
Hi all. I have a project with two applications (books and reader).
The books application has a table with 4 million rows with these fields:
book_title = models.CharField(max_length=40)
book_description = models.CharField(max_length=400)
To avoid querying the database with 4 million rows