yes, we have auto-increment on that table. how do i invalidate the caches? i
know there is an invalidateObjects method, but is this the way to do it?
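[For readers of the archive: a minimal sketch of the invalidation being asked about, assuming Cayenne 2.0 package names (org.apache.cayenne; 1.x releases used org.objectstyle.cayenne) and a list of previously fetched user objects held by the application. The class, method, and variable names are illustrative, not from the thread.]

```java
import java.util.List;

import org.apache.cayenne.access.DataContext;

public class UserCacheRefresh {

    // Hedged sketch: after the external TRUNCATE + LOAD DATA INFILE,
    // drop the cached snapshots of the objects this context holds.
    // invalidateObjects() keeps the objects registered but turns them
    // hollow, so their data is refetched from the database on next access.
    public static void invalidateAfterLoad(DataContext context, List cachedUsers) {
        context.invalidateObjects(cachedUsers);
    }
}
```

[One caveat: since the table is truncated and reloaded, some primary keys may no longer exist, and a hollow object whose row is gone will fail on the next access. If the auto-increment PKs are not stable across loads, running a fresh SelectQuery after invalidation is likely the safer pattern.]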

-----Original Message-----
From: Michael Gentry [mailto:[EMAIL PROTECTED]
Sent: Tuesday, December 5, 2006 4:30 PM
To: [email protected]
Subject: Re: AW: updating large number of data

You know, sometimes simple and fast is a good way to do things.  Do
you have an auto-increment PK in that table? That would be helpful.  As for
Cayenne, can you flush (invalidate) any active DataContexts (at
least the objects for that table) when the load occurs?

/dev/mrg


On 12/5/06, Peter Schröder <[EMAIL PROTECTED]> wrote:
> we are deleting all rows with truncate table first, then loading the csv with
> load data infile.
>
> i would prefer not to use this method, but it is simple and fast.
>
> -----Original Message-----
> From: Michael Gentry [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, December 5, 2006 2:38 PM
> To: [email protected]
> Subject: Re: updating large number of data
>
> Are you deleting all of the original data and then doing inserts or
> are you doing updates?
>
> Thanks,
>
> /dev/mrg
>
>
> On 12/5/06, Peter Schröder <[EMAIL PROTECTED]> wrote:
> > hi,
> >
> > we get a csv-file with a large number of user-data every hour, and we want
> > to replace the existing data in our database with that. is there a
> > best-practice for something like this?
> > currently we are doing that with php, using mysql load data infile with
> > the csv-file. but i think that doing this with cayenne would leave the
> > context in a bad state.
> >
> > have a nice day,
> > peter
> >
>
