Assuming that the clients will NOT change the original database, I can
think of two methods:
1. Whole DB: drop all indices, then VACUUM, then zip, then propagate. On
the clients (subscribers), unzip and re-create all indices.
2. Incremental: add a field to stamp the modification date on all your
records, then export the changed records to a separate database (and zip
it). On the clients, apply the changes.
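
A minimal sketch of the incremental method, using Python's sqlite3 module.
The table name `records` and the column `last_modified` are placeholders,
not your actual schema - adapt them:

```python
import sqlite3

def export_changes(main_db, diff_db, since):
    """Copy records modified after `since` into a small diff database.

    `records` and `last_modified` are placeholder names - adapt them
    to your real schema.
    """
    con = sqlite3.connect(main_db)
    con.execute("ATTACH DATABASE ? AS diff", (diff_db,))
    # Recreate the table structure (only) in the diff database...
    con.execute("CREATE TABLE IF NOT EXISTS diff.records AS "
                "SELECT * FROM main.records WHERE 0")
    # ...then copy just the rows stamped after the last export.
    con.execute("INSERT INTO diff.records "
                "SELECT * FROM main.records WHERE last_modified > ?",
                (since,))
    con.commit()
    con.close()
```

The clients would ATTACH the downloaded diff file the same way and merge
its rows into their local copy.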
500k records doesn't say much - how big (in MB) is your DB? I could also
suggest simply moving the FTP site to another ISP that offers unlimited
(or a very large) monthly transfer allowance.
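
Paul's journal-table idea quoted below could be sketched like this on the
client side - the `journal(serial, sql)` layout and the function name are
my assumptions, not his actual implementation:

```python
import sqlite3

def apply_journal(client_db, server_con, last_serial):
    """Fetch journal entries newer than `last_serial` and replay them.

    `server_con` stands in for however the client fetches the entries
    (here it is simply another sqlite3 connection).
    """
    rows = server_con.execute(
        "SELECT serial, sql FROM journal WHERE serial > ? ORDER BY serial",
        (last_serial,)).fetchall()
    con = sqlite3.connect(client_db)
    with con:  # one transaction: all entries apply, or none do
        for serial, sql in rows:
            con.execute(sql)
    con.close()
    return rows[-1][0] if rows else last_serial  # new high-water mark
```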
jp.
Juan Perez wrote:
> (excuse me for the other mail, I clicked the button accidentally...)
>
> Hi:
>
> I think I explained myself badly.
> Until now, my process at work has been:
>
> Phase 1: I automatically generate a database from a CRM.
> Phase 2: I put the database on the FTP server for the company's sales
> reps. They are located in different parts of the country.
> Phase 3: The sales reps use my application with the database to work.
>
> The problem is that now the database is too big (and we pay the ISP
> for the FTP storage used and the bandwidth consumed). So I now want to
> change the process to:
>
> Phase 1: I automatically generate a database from a CRM.
> Phase 2: As I already have the previous database, I will generate a
> diff file in the format I explained in the previous mail.
> Phase 3: I put the small diff file on the FTP server and the sales
> reps download it.
> Phase 4: The sales reps bring the database up to date using my
> application (it needs to be changed to do this).
> Phase 5: The sales reps can use my application with the new database
> to work.
>
> So, the question is: what is the best way to implement the new Phase 2?
>
> 2006/9/4, Juan Perez <[EMAIL PROTECTED]>:
>> Hi:
>>
>> I think I explained myself badly.
>> At work my process is the following:
>>
>> I automatically generate a database from a CRM
>>
>> 2006/9/4, Paul Smith <[EMAIL PROTECTED]>:
>> > At 16:48 04/09/2006, you wrote:
>> > >Hi all:
>> > >
>> > > I have developed a program that uses a sqlite database.
>> > > Until now the users downloaded an entire new version of the
>> > >database weekly from the FTP server.
>> > > But now the database is too big (about 500,000 records) and I
>> > >want to make a database update system.
>> > > So, what is the best way (having the old database and the new
>> > >one) to obtain a file with the differences? Something like this:
>> >
>> > Hmm, I don't think I'd do it that way. If you do that, then you need
>> > to have a copy of the old & new database to compare.
>> >
>> > One way around it is to have a 'journal' table which just contains
>> > all the SQL queries which have been actioned (you have to take care
>> > if you use transactions) along with an incrementing serial number.
>> > Then, the user's software can say 'I have all journal entries up to
>> > 252376', and then you can just give them all the journal entries
>> > after that number, and they can run the SQL on their end.
>> >
>> > You can make your routine which modifies the database just keep a
>> > copy of the SQL used whenever the action succeeds, and store that in
>> > the Journal table.
>> >
>> >
>> >
>> > Paul                            VPOP3 - Internet Email Server/Gateway
>> > [EMAIL PROTECTED]               http://www.pscs.co.uk/
-----------------------------------------------------------------------------
To unsubscribe, send email to [EMAIL PROTECTED]
-----------------------------------------------------------------------------