Bruno,
It is difficult to advise based on the information provided.  I'm not sure
exactly what you are concerned with: Postgres database size? Query response
time? Network transfer time? All of the above?  Each of these can be dealt
with differently.  Perhaps if you provided some detail on exactly what your
concerns are, you would get better recommendations.  I am not a database
expert, and I did not sleep in a Holiday Inn Express last night, but I have
done a bit of AJAX-enabled web development, and if network transfer time is
high on your list, async calls can be really helpful for getting large
amounts of data from the server to the client in small, efficient chunks.
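
For what it's worth, here is a rough sketch of what I mean, in plain Python
(names and the page size are mine, just for illustration).  The idea is that
each AJAX call fetches one page-sized JSON chunk rather than the whole result
set; with Django you'd do the same by slicing a queryset, which turns into
LIMIT/OFFSET in the SQL so Postgres only ships one chunk at a time:

```python
import json

PAGE_SIZE = 200  # illustrative; tune to your rows and network


def get_page(records, page):
    """Return one page of records, plus paging info, as the JSON
    payload an async call would fetch.  `records` stands in for a
    queryset; slicing it mirrors queryset[offset:offset + PAGE_SIZE]."""
    offset = page * PAGE_SIZE
    chunk = records[offset:offset + PAGE_SIZE]
    return json.dumps({
        "page": page,
        "count": len(chunk),
        "has_more": offset + PAGE_SIZE < len(records),
        "results": chunk,
    })
```

The client keeps requesting the next page while "has_more" is true, so the
browser never has to wait on one huge transfer.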
-richard


On 9/25/07, Bruno Tikami <[EMAIL PROTECTED]> wrote:
>
> Hi fellows,
>
> I'm developing a Django project that is going to handle big sets of
> data, and I would like your advice. I have 10 internal bureaus, and each of
> them has a database of 1.5 million records that really looks like it will
> keep growing in size on and on.  I intend to use Postgres.
>
> The question:  what's the best way to handle and store this data? I thought
> about breaking the app model into 10 smaller ones (Bureau_1, Bureau_2,
> Bureau_3, etc.) because the main reports are split by Bureau. Response time
> matters. What do you think?
>
> Thanks!!
>
> Tkm
>

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---