Yes, invalidateObjects() should do it.  The trickier part is knowing
when to do it and finding everything to invalidate.  A good portion of
that will depend on your application.  The basic steps, though, will
be:

* Get notified of table update
* Get a list of your DataContext objects (you might have to track
those yourself)
* Iterate over each DataContext, and for each one iterate over the
objects inside it (dataContext.getObjectStore().getObjectIterator()),
build up a collection of the objects belonging to the refreshed table,
then call invalidateObjects() on that collection (see the sketch below)
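
Roughly, that last step could look something like this.  It is only a
sketch, assuming the Cayenne 2.0 package names (swap in
org.objectstyle.cayenne for 1.2); TableRefreshHandler, activeContexts
and tableRefreshed() are your own code, and the only Cayenne calls are
getObjectStore().getObjectIterator() and invalidateObjects():

import java.util.ArrayList;
import java.util.Collection;
import java.util.Iterator;
import java.util.List;

import org.apache.cayenne.DataObject;
import org.apache.cayenne.access.DataContext;

public class TableRefreshHandler {

    // Cayenne doesn't track contexts for you, so this list is something
    // your application has to maintain itself.
    private final List<DataContext> activeContexts = new ArrayList<DataContext>();

    // Call this once the external TRUNCATE + LOAD DATA INFILE run is done.
    // "entityName" is whatever ObjEntity is mapped to the refreshed table.
    public void tableRefreshed(String entityName) {
        for (DataContext context : activeContexts) {
            Collection<DataObject> stale = new ArrayList<DataObject>();

            // Walk every object registered in this context and collect the
            // ones that belong to the refreshed entity.
            Iterator<?> it = context.getObjectStore().getObjectIterator();
            while (it.hasNext()) {
                DataObject object = (DataObject) it.next();
                if (entityName.equals(object.getObjectId().getEntityName())) {
                    stale.add(object);
                }
            }

            // Throw away the cached snapshots; the objects turn hollow and
            // get refetched from the database the next time they are used.
            context.invalidateObjects(stale);
        }
    }
}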

This might be a lot of overkill, though, depending on your application.
For example, can your application be shut down while the refresh is
happening?  How much data is in those tables?  It sounds like they are
read-only, so you might just be able to fetch everything again
(something like the second sketch below).
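
If it really is read-only reference data, the blunt version might be
enough: don't bother collecting individual objects, just refetch after
the reload.  A rough sketch, where "User" stands in for whatever
persistent class maps to the refreshed table:

import java.util.List;

import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.query.SelectQuery;

public class RefetchAfterReload {

    // "User" is a placeholder for the persistent class mapped to the
    // reloaded table.
    public List refetchUsers(DataContext context) {
        // A plain SelectQuery goes to the database, and by default Cayenne
        // refreshes the snapshots of unmodified objects it already has
        // registered in the context.
        SelectQuery query = new SelectQuery(User.class);
        return context.performQuery(query);
    }
}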

/dev/mrg


On 12/5/06, Peter Schröder <[EMAIL PROTECTED]> wrote:
Yes, we have auto-increment on that table. How do I invalidate the caches? I
know there is an invalidateObjects() method, but is this the way to do it?

-----Original Message-----
From: Michael Gentry [mailto:[EMAIL PROTECTED]
Sent: Tuesday, December 5, 2006 16:30
To: [email protected]
Subject: Re: AW: updating large number of data

You know, sometimes simple and fast is a good way to do things.  Do
you have an auto-increment PK in that table?  That would be helpful.  As
for Cayenne, can you flush (invalidate) any active DataContexts (at
least the objects for that table) when the load occurs?

/dev/mrg


On 12/5/06, Peter Schröder <[EMAIL PROTECTED]> wrote:
> We are deleting all rows with TRUNCATE TABLE first, then loading the CSV
> with LOAD DATA INFILE.
>
> I would prefer not to use this method, but it is simple and fast.
>
> -----Original Message-----
> From: Michael Gentry [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, December 5, 2006 14:38
> To: [email protected]
> Subject: Re: updating large number of data
>
> Are you deleting all of the original data and then doing inserts or
> are you doing updates?
>
> Thanks,
>
> /dev/mrg
>
>
> On 12/5/06, Peter Schröder <[EMAIL PROTECTED]> wrote:
> > Hi,
> >
> > We get a CSV file with a large number of user data every hour, and we want
> > to replace the existing data in our database with it. Is there a best
> > practice for doing something like this?
> > Currently we are doing that with PHP, using MySQL's LOAD DATA INFILE with
> > the CSV file. But I think that doing this with Cayenne would leave the
> > context in a bad state.
> >
> > Have a nice day,
> > Peter
> >
>
