Update - the whole time I was researching this, I kept thinking a "keys-only
query" meant using the low-level API to fetch an object by its key. The
confusion was on my part: what everyone meant was to write a query that
requests only the id of the objects, and that is much faster/cheaper. Here is
the final version of my delete code; it does a keys-only query and cycles
through 1000 results at a time.

Thanks!


d.add(Calendar.DATE, (expDays * -1));
Query q = pm.newQuery("select id from " + RecordedValue.class.getName());
q.setFilter("pointFK == k && timestamp < d");
q.declareImports("import java.util.Date");
q.setRange(0, 1000);
q.setOrdering("timestamp ascending");
Map<String, Object> args = new HashMap<String, Object>();
args.put("k", pointId);
args.put("d", d.getTime());

List<RecordedValue> v = (List<RecordedValue>) q.executeWithMap(args);
long count = v.size();
if (count > 0)
{
        pm.deletePersistentAll(v);
        DataServiceImpl.startDeleteDataTask(pointId, true, expDays);
}
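
For anyone curious, here is roughly what the low-level-API version Peter
suggested would look like. This is an untested sketch on my part, not what I
deployed: it assumes the kind is named "RecordedValue", that pointFK is stored
as a plain long property, and it wraps one batch in a hypothetical purgeBatch()
helper.

import java.util.ArrayList;
import java.util.Date;
import java.util.List;

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.FetchOptions;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.Query;

// Hypothetical helper (sketch only): keys-only low-level query, then one batch delete.
static int purgeBatch(long pointId, Date cutoff) {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();

        Query q = new Query("RecordedValue")
                        .addFilter("pointFK", Query.FilterOperator.EQUAL, pointId)
                        .addFilter("timestamp", Query.FilterOperator.LESS_THAN, cutoff)
                        .setKeysOnly();   // only keys come back, no entity payloads

        List<Key> keys = new ArrayList<Key>();
        for (Entity e : ds.prepare(q).asIterable(FetchOptions.Builder.withLimit(1000))) {
                keys.add(e.getKey());
        }

        if (!keys.isEmpty()) {
                ds.delete(keys);          // one batch delete by key
        }
        return keys.size();               // caller re-queues the task while this is > 0
}

Either way, the point is the same as the JDO version above: never pull the full
entities back just to delete them.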

On Feb 13, 5:23 am, Benjamin <bsaut...@gmail.com> wrote:
> Thanks Peter - I think I'm forced to use a query like the one above
> because the only criteria I have for deleting these objects are a
> timestamp and the foreign key. So I would have to make a separate
> query to get the objects' keys. Please let me know if I'm missing some
> trick here or if it makes sense to do this another way.
>
> It looks like the only solution that isn't causing errors is to delete
> 1000 at a time and then restart the delete task until the result is
> zero. This is all very expensive, though - it looks like it costs about
> $2.00 to delete a couple hundred thousand records this way.
>
> The big question is: is it the deletePersistentAll that's costing the
> CPU time? Should I get the query results, iterate through them, and
> use the low-level API to delete each key?
>
> Here is what I currently have, which is costing a lot of $ to purge a
> couple hundred thousand records from the database:
>
> Query q = pm.newQuery(RecordedValue.class, "pointFK == k && timestamp < d");
> q.declareImports("import java.util.Date");
> q.setRange(0, 1000);
> q.setOrdering("timestamp ascending");
> Map<String, Object> args = new HashMap<String, Object>();
> args.put("k", pointId);
> args.put("d", d.getTime());
>
> List<RecordedValue> v = (List<RecordedValue>) q.executeWithMap(args);
> long count = v.size();
> if (count > 0)
> {
>         pm.deletePersistentAll(v);
>         DataServiceImpl.startDeleteDataTask(pointId, true, expDays);
> }
>
> On Feb 12, 1:57 pm, Peter Liu <tinyee...@gmail.com> wrote:
>
> > Try using the low-level API and do a keys-only query with a limit of 1000
> > (and delete them) repeatedly, instead of retrieving whole objects.
>
> > I'm guessing the out-of-memory error is due to the large number of objects
> > returned by the query. A keys-only query also uses much less API CPU.
>
> > On Feb 12, 9:22 pm, Benjamin <bsaut...@gmail.com> wrote:
>
> > > I'm getting errors when a task kicks off to delete a lot of data based
> > > on a timestamp. I enabled billing and have already chewed through $0.50 in
> > > CPU time, but I'm still getting the error message. Is there anything
> > > else I should do? I was trying to avoid splitting the task up with a
> > > result limit or something; I really just need to blow away persisted
> > > objects that have a timestamp older than a specified date. This
> > > snippet of code causes the error:
>
> > > PersistenceManager pm = PMF.get().getPersistenceManager();
> > > Calendar d = Calendar.getInstance();
> > > long retVal = 0;
> > > if (expDays > 0)
> > > {
> > >         d.add(Calendar.DATE, (expDays * -1));
> > >         Query q = pm.newQuery(RecordedValue.class, "pointFK == k && timestamp < d");
> > >         q.declareImports("import java.util.Date");
> > >         Map<String, Object> args = new HashMap<String, Object>();
> > >         args.put("k", pointId);
> > >         args.put("d", d.getTime());
> > >         retVal = q.deletePersistentAll(args);
> > > }
> > > pm.close();
> > > return retVal;
