Turns out this has nothing to do with MySQL. After removing MySQL from 
the equation, I'm now simply using dbfpy to read DBF records from a 
Blobstore object, and I still can't get through 24K records before the 
10-minute window boots me out of my task.
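
For context, the read loop is basically this pattern (a minimal sketch, not 
my actual code; the blob key and field name are placeholders, and I'm 
assuming dbfpy's Dbf will accept any file-like object such as a BlobReader):

    # Sketch: stream a DBF out of Blobstore and iterate its records with dbfpy.
    # Placeholders: the blob key passed in, and the field accessed per record.
    from google.appengine.ext import blobstore
    from dbfpy import dbf

    def read_dbf_records(blob_key):
        reader = blobstore.BlobReader(blob_key)  # file-like, seekable
        db = dbf.Dbf(reader, readOnly=True)
        count = 0
        for rec in db:
            row = rec.asDict()  # field name -> value for this record
            # ... hand the row off for further processing ...
            count += 1
        db.close()
        return count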

I'll repost my issue.

On Sunday, December 4, 2016 at 11:38:32 AM UTC-5, Mike Lucente wrote:
>
> I'm inserting 24K records into a table and seeing some major latency. I 
> was running the same process about a month ago (I get a new batch of 
> records every month) without a problem. Now the process craps out after 10 
> minutes with "Deadline exceeded". The same process runs without issue on my 
> laptop (local dev app engine). I would expect that the cloud instance would 
> perform better, not worse!
>
> While trying to diagnose I switched to inserting into an empty temp table 
> -- same problem. So inserting 24K rows into a clean table appears to be a 
> "no-go" for Cloud SQL.
>
> How do I get this resolved or is Cloud SQL just dog slow?
>
> BTW, my app engine and cloud sql instances are in the same region and I'm 
> connecting via unix socket.
>
>
>
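
For completeness, the insert path described in the quoted post was 
essentially this pattern (a hedged sketch only; the instance connection 
name, credentials, table, and columns are placeholders, and it assumes the 
MySQLdb driver with executemany batching rather than row-at-a-time inserts):

    # Sketch: batched inserts into Cloud SQL over the unix socket.
    # All connection details and the table schema below are placeholders.
    import MySQLdb

    def insert_rows(rows, batch_size=500):
        conn = MySQLdb.connect(
            unix_socket='/cloudsql/my-project:my-instance',  # placeholder
            user='appuser', passwd='secret', db='mydb')       # placeholders
        cursor = conn.cursor()
        sql = ('INSERT INTO temp_table (col_a, col_b, col_c) '
               'VALUES (%s, %s, %s)')
        # Send the rows in batches so each round trip carries many values.
        for start in range(0, len(rows), batch_size):
            cursor.executemany(sql, rows[start:start + batch_size])
        conn.commit()
        cursor.close()
        conn.close()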
