I'd just wrap the whole thing in a database transaction. Then commit your
ObjectContexts as often as you want to, but the real DB commit won't happen
until the end.
import org.apache.cayenne.configuration.CayenneRuntime;
import org.apache.cayenne.tx.TransactionManager;
import org.apache.cayenne.tx.TransactionalOperation;

TransactionManager transactionManager = CayenneRuntime.getThreadInjector()
        .getInstance(TransactionManager.class);

transactionManager.performInTransaction(new TransactionalOperation<Void>() {
    @Override
    public Void perform() {
        // do the work here; the surrounding DB transaction commits
        // only after perform() returns
        return null;
    }
});
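In your case the 500-row batching would then go inside perform(). Here's a
rough sketch only; "runtime", "lines", and "MyEntity" are placeholders for
your ServerRuntime, the parsed file rows, and your mapped entity, none of
which are from your mail:

import org.apache.cayenne.ObjectContext;

transactionManager.performInTransaction(new TransactionalOperation<Void>() {
    @Override
    public Void perform() {
        ObjectContext context = runtime.newContext();
        int count = 0;
        for (String line : lines) {
            MyEntity row = context.newObject(MyEntity.class);
            // ... populate row from the parsed line ...
            if (++count % 500 == 0) {
                // flushes SQL to the DB, but the surrounding transaction
                // is still open, so nothing is permanently committed yet
                context.commitChanges();
            }
        }
        context.commitChanges(); // flush the remainder
        return null;
    }
});

One caveat: if a duplicate key does sneak through, the PSQLException will
roll back the whole transaction, so you'd be restarting the load from
scratch rather than from the last batch.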
On Thu, Sep 27, 2018 at 2:36 PM Tony Giaccone <[email protected]> wrote:
> I'm processing a large number of rows, over 600,000. The key value
> should be unique in this file, but I'd like to ensure that. I also want
> this to happen with some rapidity. To speed the process up, I'm going to
> read lines from the file, create objects, and commit the changes after
> 500 have been created.
>
> The problem with this is that if I have a duplicate value, I won't catch
> it until I do the commit.
>
> When I insert a duplicate key value, the first exception is a db-level
> org.postgresql.util.PSQLException.
>
> Eventually this gets wrapped in a Cayenne commit exception.
>
> So I'd like to get a sense of what folks think, given that I want to
> balance the conflicting goals of speed and accuracy.
>
> Can I easily figure out which object or objects caused the error, and
> can I exclude them from the context and redo the commit?
>
> Is this a reasonable path to follow?
>
>
>
> Tony Giaccone
>