  File "C:\Python27\lib\urllib2.py", line 1207, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "C:\Python27\lib\urllib2.py", line 1177, in do_open
    raise URLError(err)
URLError: <urlopen error [Errno 10104] getaddrinfo failed>
On Wednesday, August 3, 2011 10:09:16 PM UTC-7, Richard Arrano wrote:
Hello,
I've been using webtest to unit test my application and I've encountered a
strange issue. I wrap many of my get/post handlers in decorators and in
some of those decorators, I put ndb.Model instances in os.environ for later
use in the subsequent handler. This works on my local dev server
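As an aside, a thread-safer variant of that decorator pattern stashes the entity on the request object rather than in os.environ, which is process-wide. This is a hypothetical sketch: load_account stands in for the real ndb lookup, and the attribute name is made up.

```python
import functools

def load_account(request):
    # Hypothetical lookup; in the real app this would be an ndb get.
    return {'user': request.user_id}

def with_account(handler_method):
    @functools.wraps(handler_method)
    def wrapper(self, *args, **kwargs):
        # Stash the entity on the request, not os.environ: with
        # threadsafe: true, two concurrent requests sharing the
        # process could otherwise clobber each other's values.
        self.request.account = load_account(self.request)
        return handler_method(self, *args, **kwargs)
    return wrapper
```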
:59:25 PM UTC-7, Takashi Matsuo (Google) wrote:
Can you show me the whole stacktrace?
What is your app-id?
On Wed, Aug 22, 2012 at 7:14 AM, Richard Arrano ricka...@gmail.com wrote:
Hello,
I have a function that takes a user image upload and saves both the blob
key and a serving URL. After upgrading to 1.7.1, I now get an error trying
to save the updated entity to the datastore. The property pictures is an
ndb.StringProperty(repeated=True, indexed=False) and the property
Hello,
I have been designing my app with the notion in mind that even named
tasks may execute more than once, but I only recently came to realize
that a task may not execute at all. I have a task that operates on a
subset of my entities and it's absolutely imperative that all members
of this
I also have not seen tasks fail to run, but I found this thread:
http://stackoverflow.com/questions/5583813/google-app-engine-added-task-goes-missing
Specifically, the part that says: Tasks are not guaranteed to be executed
in the order they arrive, and they are not guaranteed to be executed
On Apr 24, 6:07 pm, Richard Arrano rickarr...@gmail.com wrote:
Thank you for the quick and very informative reply. I wasn't even
aware this was possible with NDB. How would those x.yref.get() calls
show up in AppStats? Or would they at all if it's just pulling it from
, April 23, 2012 10:21:26 PM UTC-7, Richard Arrano wrote:
Hello,
I'm switching from db to ndb and I have a question regarding caching:
In the old db, I would have a class X that contains a reference to a
class Y. The Y type would be accessed most frequently and rarely
change. So when I would query an X and retrieve the Y type it points
to, I would store
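A minimal sketch of that hand-rolled cache, keyed by the referenced entity's key. Note that ndb's built-in in-context cache and memcache layer already do this for key.get(), so an explicit cache like this is often unnecessary after the migration; fetch here is a stand-in for y_key.get().

```python
# Module-level cache for rarely-changing Y entities, keyed by the
# referenced entity's key.
_y_cache = {}

def get_y(y_key, fetch):
    """Return the cached Y for y_key, fetching it once on a miss.
    `fetch` stands in for y_key.get()."""
    if y_key not in _y_cache:
        _y_cache[y_key] = fetch(y_key)
    return _y_cache[y_key]
```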
, 2012 at 00:44, Richard Arrano rickarr...@gmail.com wrote:
Hello,
I'm having some trouble understanding the billing figures for when I
perform data writes. I had 1300 entities with 1 property indexed and I
kicked off a job via the Datastore Admin to delete them all. Given
that:
Entity Delete (per entity): 2 Writes + 2 Writes per indexed property
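Plugging those figures into the quoted line item gives the write count one would expect to be billed (composite indexes, if any, are ignored in this back-of-the-envelope check):

```python
# Worked cost check for the delete job described above.
entities = 1300
indexed_properties = 1
# Entity Delete = 2 Writes + 2 Writes per indexed property
writes_per_delete = 2 + 2 * indexed_properties
total_writes = entities * writes_per_delete
print(total_writes)  # 5200
```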
Hello,
I just was wondering in the pricing model when it says:
Query: 1 Read + 1 Read per entity returned
Suppose it's a get_by_key_name type query and I supply keys that do
not exist, i.e. that will return None in Python. Do those count toward
the 1 Read per entity returned, or not?
Thanks,
Hello,
I've been uploading some data from the development to the production
server. This first upload was ~450 entities with 3 indexed properties
each. In one upload (batching at 10/post), I burned 60% of the free
quota of writes. Is this normal? If not, any idea on what I'm doing
wrong? My concern
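For what it's worth, a back-of-the-envelope estimate under the put-cost formula from the pricing table (2 writes per entity plus 2 per indexed property value; composite indexes ignored, so the real figure may be higher):

```python
# Rough write-op estimate for the upload described above.
entities = 450
indexed_properties = 3
writes_per_put = 2 + 2 * indexed_properties  # 8 writes per entity
total_writes = entities * writes_per_put
print(total_writes)  # 3600
```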
Hello,
I've been attempting to work with the Python Imaging Library in my
app, and for the most part I can do so. In my app.yaml I have:
libraries:
- name: PIL
  version: latest
However, I wanted to use ImageMath.eval and I found this error:
  line 11, in <module>
    import ImageMath
  File
Hello,
I've been writing a game manager backend whose purpose is to check
every couple of seconds for games that are ready to be launched, call
a task to launch them, and rinse and repeat. I have a handler for a
front-end instance to URLFetch from the backend and retrieve its state
to display to
An addendum: I made a small mistake and it actually does output to the
logs, however, in the main loop of the form:
def check_current_games(self):
    while True:
        # do work
        time.sleep(10)
it always hangs and the loop never performs a second iteration. I also
noticed in the
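One hardening of that loop (a sketch, not a diagnosis of the hang): guard each pass and log progress, since an unhandled exception in the work step can make a backend loop look frozen. The iterations/delay parameters are only so the sketch is bounded and testable; the real backend would loop forever with a ~10s sleep.

```python
import logging
import time

def check_current_games(iterations, delay=10):
    """Run the game-check loop for `iterations` passes, guarding each
    pass so one failure cannot kill the loop."""
    completed = 0
    for _ in range(iterations):
        try:
            pass  # do work: find ready games, enqueue launch tasks
        except Exception:
            logging.exception('check_current_games pass failed')
        completed += 1
        logging.info('pass %d done', completed)
        time.sleep(delay)
    return completed
```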
Hello,
Quick question regarding multithreading in Python 2.7:
I have some requests that call 2-3 functions that call the memcache in
each function. It would be possible but quite complicated to just use
get_multi, and I was wondering if I could simply put each function
into a thread and run the
to resort to 3
gets of varying size?
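The fan-out idea in the question can be sketched with plain threads, which Python 2.7 on App Engine does allow within a request. This is a generic sketch (the callables stand in for the per-function memcache work); for memcache specifically, a single get_multi call is usually simpler and cheaper than three threaded gets.

```python
import threading

def run_parallel(*funcs):
    """Run each no-argument callable in its own thread and return
    their results in the order the callables were given."""
    results = [None] * len(funcs)

    def call(index, func):
        results[index] = func()

    threads = [threading.Thread(target=call, args=(i, f))
               for i, f in enumerate(funcs)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```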
On Nov 22, 12:37 pm, Brian Quinlan bquin...@google.com wrote:
Hi Richard,
On Wed, Nov 23, 2011 at 7:18 AM, Richard Arrano rickarr...@gmail.com wrote:
Hello,
Quick question regarding multithreading in Python 2.7:
I have some requests that call 2-3 functions
that this is impossible?
Thanks,
Richard
On Nov 22, 1:56 pm, Brian Quinlan bquin...@google.com wrote:
On Wed, Nov 23, 2011 at 8:48 AM, Richard Arrano rickarr...@gmail.com wrote:
@Brandon:
This is true but it just would take a lot of rewriting that may or may
not be worth it.
@Brian
Thanks
I'll be leaving if some of the prices aren't tweaked, particularly the
channels. I was banking on being able to use a large amount of
channels, likely in the thousands per day. I did a double take when I
realized the new price was per hundred rather than per thousand,
particularly when channels
messaging (which the Channel API does not offer out of the box) as well as
per-user messaging. Also extremely affordable, 3 million
messages for $3.29.
-Martin
On Aug 31, 2011, at 10:33 PM, Richard Arrano wrote:
I'll be leaving if some of the prices aren't tweaked, particularly the
channels
As far as I can tell on the new billing page, it says 100 under Free
Quota for Channels Created and then a rate of $0.01 for every 100
more channels created. I could be misinterpreting it, but it seems
clear cut.
PubNub also looks like a great alternative to Channels, I'll have to
look at the two
~40,000 instances of is only indexed on
two properties. I'm using 30% of the 1.00 GB storage now. So I'm
wondering a) does metadata not include the storage for indexes and
b) does the discrepancy make sense given my situation?
Thanks,
Richard Arrano
And for the record, I did the upload before the datastore statistics
had been refreshed, so those stats are indeed current.
-Richard
On Aug 25, 8:30 pm, Richard Arrano rickarr...@gmail.com wrote:
Hello,
I recently uploaded quite a bit of data using the bulkloader. I
noticed that my stored
Hello,
I'm attempting to upload some data in CSV format to the App Engine
servers. I filled out the first row specifying the property names,
including key. However, no matter what I do, including
import_transform: transform.create_foreign_key('UserAccount',
key_is_id=False), but it always has the
I haven't changed anything in ages, so if it used to be default then
yes. How can I change this?
-Richard
On Jul 30, 2:21 am, Tim Hoffman zutes...@gmail.com wrote:
Are you using the sqlite backend? Maybe you aren't, and your datastore
size is growing; the default datastore performs terribly
Thanks Tim, you nailed it. I attributed it to 1.5.2 but I realized I
created a moderate amount of data around the same time, hence the
slowdown. Using SQLite fixed it completely. Thanks!
-Richard
On Jul 31, 4:52 pm, Tim Hoffman zutes...@gmail.com wrote:
--usesqlite
If your datastore gradually
...@google.com wrote:
Is this the Python or Java dev server? Has anyone else experienced similar
issues?
--
Ikai Lan
Developer Programs Engineer, Google App Engine
plus.ikailan.com | twitter.com/ikai
On Fri, Jul 29, 2011 at 6:13 AM, Richard Arrano rickarr...@gmail.com wrote:
Hello,
I've noticed
Hello,
I've noticed that ever since I installed 1.5.2, the performance of my
development server has degraded terribly. It used to use ~250 MB of
memory and now, without any major changes to my application, it
consistently uses ~600-800 MB. Writing to the local datastore has now
become incredibly
Hello,
I'm working on a problem that at the moment seems to me to require an
expensive IN query which I'd like to avoid. Basically, each group of
users (and there may be thousands of such groups) draws product data
from a subset of currently 8 providers (though it could reach ~16). The
subset must
What I think he was referring to, and a problem I've encountered, is
working with Backends in the development server. The dev server will
not acknowledge the presence of Backends - the link is listed in the
admin console, but it won't list any backends you've configured in the
yaml file. Are these
D'alesandre gr...@google.com wrote:
On Thu, May 12, 2011 at 3:32 AM, Richard Arrano rickarr...@gmail.com wrote:
Hello,
I have a few questions about the new Backends feature:
It seems like a reserved backend allows us to have access to a sort of
unevictable memcache, is this correct? If I were doing something like
a game, would it be reasonable to have a single instance keep track of
a few pieces of vital
Hello,
I've been using the Channel API and for each chat room, I have a fixed
number of users. So for each room, prior to the launch, I call
create_channel for each user and store it in a dictionary that I save
in a TextProperty. When I want to broadcast, I read the TextProperty
and convert it
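The store-and-broadcast pattern described there might look like this sketch. json round-trips the {user_id: channel_token} map reliably for a TextProperty (unlike repr()/eval() of a dict); send is a stand-in for channel.send_message(token, message).

```python
import json

def save_tokens(tokens):
    """Serialize a {user_id: channel_token} map for storage in a
    TextProperty."""
    return json.dumps(tokens)

def broadcast(stored, message, send):
    """Send `message` to every channel token in the stored map.
    `send` stands in for channel.send_message."""
    for user_id, token in json.loads(stored).items():
        send(token, message)
```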
connection,
afterwards it's discarded.
BR // Fredrik
Hello,
I was just hoping for some clarification: when the documentation says
that writes to entity groups are limited to 1 per second, does this
mean any write regardless of how much data is being written, or each
individual row being written? Basically, what if I have a list of
15-20 model
Hello,
I had a few items I've been taking for granted and I was wondering if
anyone could clear up for me. First of all, I did some searching and I
realize Google employees can't say exactly where the servers are.
However, one issue that affects me is I've been counting on
performance being equal
). If
possible, you'll want to keep some type of revision count so you don't
overwrite new with old.
If you provide more info someone can probably offer additional pointers.
Robert
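Robert's revision-count suggestion can be sketched as follows. The helper and field names are hypothetical, and in the real app the check-and-set would run inside a datastore transaction; entity here is a plain dict standing in for an entity.

```python
def apply_update(entity, new_value, new_revision):
    """Apply an update only if its revision is newer than what is
    stored, so a task that runs twice, or out of order, can never
    overwrite newer data with older data."""
    if new_revision <= entity.get('revision', 0):
        return False  # stale or duplicate delivery: skip
    entity['value'] = new_value
    entity['revision'] = new_revision
    return True
```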
On Wed, Mar 2, 2011 at 07:28, Richard Arrano rickarr...@gmail.com wrote:
Hello,
I was reading the thread
Hi Steve,
I would certainly agree about some sort of ability to guarantee low-
volume high-importance storage. I had an idea that it might be able to
be implemented in the form of something along the lines of 1MB
unevictable memcache storage; we wouldn't be able to store much there,
but we would
). Any help is much appreciated.
Thanks,
Richard Arrano
--
You received this message because you are subscribed to the Google Groups
Google App Engine group.
To post to this group, send email to google-appengine@googlegroups.com.
To unsubscribe from this group, send email to
google-appengine
this problem, but it is still very, very laggy: I set it to echo a
user's message back to the channel, and it takes quite a few iterations
of the loop before it prints. Is there a better way to structure this?
Thanks,
Richard Arrano
On Jan 16, 7:14 pm, Nick Johnson (Google) nick.john...@google.com wrote:
I'm looking to make a silent-auction type of application where you
have 20-30 users bidding on an item at a time, with potentially
hundreds or thousands of auctions happening simultaneously. As soon as
a high bid is made, it updates this information and sends it via the
Channel API to the other