[google-appengine] Re: Task Queues

2009-08-16 Thread Philippe

http://code.google.com/p/googleappengine/issues/list
You can star the Task Queue issues there; the Task Queue API is still in
the App Engine team's labs.


On Aug 17, 12:21 am, Sebastian Aviña  wrote:
> Hi, I'm running some task queues on my app hola-dgo. Right now I have
> exceeded the Total Daily Quota, and I still have around 2000+ tasks in
> the queue... I don't know which tasks are queued, I can't see them, and
> I can't delete or flush the queue... What can I do?
>
> It's very hard to debug task queues in the production environment,
> because there is no way to know which tasks are queued...



[google-appengine] Re: All Read Please: Geographical Request Latency

2009-08-16 Thread gjs

Home Ajax request latency of 425ms
Page One Ajax request latency of 439ms
Page Two Ajax request latency of 420ms
More Ajax request latency of 836ms

After trying multiple times, most responses are around 380-420 ms.

From Australia (with Chrome on Windows XP)

Regards


On Aug 17, 9:02 am, GregF  wrote:
> Queenstown New Zealand: 440-490ms.



[google-appengine] Python GAE version of Rails' validates_uniqueness_of

2009-08-16 Thread Daniel Rhoden
I discovered that my 'Account' model happily created multiple  
instances of user 't...@example.com' without complaint during  
testing.  (I know, fix my code).

Short question: what's the lightest-weight way of ensuring a value is
unique across entities?

I've read up on the Property class constructor argument "validator"
and got to wondering if there is an RoR-like library of prefab
validators.  It would be great if I could do something like this:

from google.appengine.ext import validation  # wished-for, hypothetical module

class Account(db.Model):
    user = db.UserProperty(required=True,
                           validator=validation.unique(db.UserProperty))
    ...

related resources:
http://code.google.com/appengine/docs/python/datastore/propertyclass.html
http://googleappengine.blogspot.com/2009/07/writing-custom-property-classes.html

--Daniel Rhoden
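
A common workaround, sketched below (not from the original post; the
create_account helper is hypothetical): datastore key names are unique
per kind, so encoding the value that must stay unique in the key_name
and creating the entity inside a transaction makes the datastore itself
enforce uniqueness.

from google.appengine.ext import db

class Account(db.Model):
    user = db.UserProperty(required=True)

def create_account(user):
    # Key names are unique per kind, so deriving the key_name from the
    # e-mail address makes the datastore reject a second Account.
    key_name = 'account:' + user.email()
    def txn():
        if Account.get_by_key_name(key_name) is not None:
            raise ValueError('Account already exists for ' + user.email())
        account = Account(key_name=key_name, user=user)
        account.put()
        return account
    return db.run_in_transaction(txn)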







[google-appengine] Re: urlfetch doesn't work with non-ASCII characters in PARAMETERS

2009-08-16 Thread Tim Hoffman

Hi,

Have a look at RFC 2141 and RFC 2396; I am not sure you can legally
include such characters directly in a URN.
I am quite possibly wrong, but I couldn't find any RFCs covering
extended character sets in a URL/URN.

The problem is that there is no scheme for telling a URL which
character set it uses, for instance UTF-8 vs ASCII vs 16-bit wide
characters.

So if you want to use characters in URLs that aren't in the standards
per the RFCs above, you need to percent-encode ('%' plus hex) the
characters to represent the octets.

Hope I haven't given you a bum steer, but I couldn't find any active
RFCs contradicting these two.

Hope this helps,

Regards

Tim
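
To illustrate (a sketch, not from this thread; Python 2 and the GAE SDK
are assumed): percent-encoding the UTF-8 octets of the query value
before building the URL keeps the URL pure ASCII.

# -*- coding: utf-8 -*-
import urllib
from google.appengine.api import urlfetch

query = u'ação'                                # non-ASCII search term
encoded = urllib.quote(query.encode('utf-8'))  # -> 'a%C3%A7%C3%A3o'
url = ('http://ajax.googleapis.com/ajax/services/search/web'
       '?v=1.0&rsz=large&q=' + encoded)
result = urlfetch.fetch(url)                   # the URL is now ASCII-safe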



On Aug 16, 11:35 pm, Edmar  wrote:
> Every time I try to fetch a result using GAE urlfetch and one of the
> parameters has a non-ASCII character, I get an error.
>
> I want to fetch this url :
>
> http://ajax.googleapis.com/ajax/services/search/web?v=1.0&rsz=large&q...ão
>
> My problem is with the q parameter: other words work as the query, but
> when I use non-English words I see this error :
>
>  File "/base/python_lib/versions/1/google/appengine/api/urlfetch_service_pb.py", line 1596, in OutputUnchecked
>    out.putPrefixedString(self.url_)
>  File "/base/python_lib/versions/1/google/net/proto/ProtocolBuffer.py", line 365, in putPrefixedString
>    v = str(v)
> UnicodeEncodeError: 'ascii' codec can't encode characters in position
> 96-99: ordinal not in range(128)
>
> Some Times if



[google-appengine] Re: All Read Please: Geographical Request Latency

2009-08-16 Thread GregF

Queenstown New Zealand: 440-490ms.




[google-appengine] Task Queues

2009-08-16 Thread Sebastian Aviña

Hi, I'm running some task queues on my app hola-dgo. Right now I have
exceeded the Total Daily Quota, and I still have around 2000+ tasks in
the queue... I don't know which tasks are queued, I can't see them, and
I can't delete or flush the queue... What can I do?

It's very hard to debug task queues in the production environment,
because there is no way to know which tasks are queued...
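
One partial workaround, sketched here (not from this thread; the
TaskMarker kind and the /worker URL are made up, and the labs taskqueue
API of SDK 1.2.3 is assumed): give each task an explicit name and
record a small marker entity when enqueueing, so pending work is at
least visible and queryable in the datastore.

from google.appengine.api.labs import taskqueue
from google.appengine.ext import db

class TaskMarker(db.Model):
    # Hypothetical bookkeeping kind; the worker handler should delete
    # its marker once the task has completed successfully.
    created = db.DateTimeProperty(auto_now_add=True)

def enqueue_tracked(task_name, params):
    # Task names must be unique per queue, so the marker and the task
    # share one identifier that can be looked up later.
    TaskMarker(key_name=task_name).put()
    taskqueue.add(url='/worker', name=task_name, params=params)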



[google-appengine] Re: Massive datastore batch put, how to?

2009-08-16 Thread Stakka

I implemented a rough version of my solution, and it seems to work up
to ~15k entities. Above that I hit the undocumented transaction write
limit you mention when trying to commit 36408 entities serialized into
24 blobs of 60 bytes:

java.lang.IllegalArgumentException: datastore transaction or write too
big.

Well, the datastore seems fast enough for large dataset writes, but
all the limitations really make it troublesome to implement. Also, the
potential loss of data integrity while processing in multiple requests/
tasks without transactions is risky. Costly too: a 15k-entity
"upload" consumes about 30 minutes of CPU quota.


On 15 Aug, 19:57, Juraj Vitko  wrote:
> I agree with everything you said. Just one thing to consider: first
> storing the uploaded data, then retrieving that data for reprocessing,
> and then storing the processed data again will consume additional
> resources / quotas of your app.
>
> GAE really appears to be designed for apps with a very high read-to-
> write ratio. I would say, if you don't need to handle more than a
> thousand concurrent users, then you'd be better off renting a
> server. Into this I've factored additional hassles you may not know
> about yet, like index size and count limits, single entity group write
> limits, and transaction limitations. All of these are possible to work
> around, but I have yet to see if those workarounds are feasible in
> terms of the final price I will be paying to run the app.
>
> On Aug 14, 9:24 pm, Stakka  wrote:
>
> > Thanks for the tip, but why write a web app if Java applets are
> > required? That wouldn't be a good solution. Also, the uploaded file
> > needs to be parsed in its entirety (CRC check, value references,
> > etc.), and it's not XML.
>
> > I think I have to parse the file server-side, populate (Java) Entity
> > objects and serialize as many as I can into 1 MB blobs. When that is
> > done, start a task that puts the de-serialized entities in batches of
> > 500 into the datastore. The response for the file upload request will
> > have to contain some unique task URL that the browser can poll (AJAX)
> > to display the progress.
>
> > Before I commit to such an elaborate solution, I'll have to test the
> > batch-put performance to see if GAE is even suitable for this kind of
> > app.
> > http://groups.google.com/group/google-appengine/browse_thread/thread/...
>
> > Users of an online app shouldn't have to wait hours for a simple data
> > import just because it's hosted at GAE. If the app were using an SQL
> > database this would only take a minute.
>
> > On Aug 14, 4:48 pm, Juraj Vitko  wrote:
>
> > > I think you need to write your own Flash or Java Applet based chunked
> > > uploader. Or use an existing one and let us know, so that we can use
> > > it too.
>
> > > On Aug 12, 11:36 pm, Stakka  wrote:
>
> > > > I'm working on a browser-based accounting app which has a feature to
> > > > import ledger transactions through file uploads. I'm currently only
> > > > running on the local dev server, but from what I've read, datastore
> > > > puts -- even batch puts -- are very slow and CPU (quota) intensive
> > > > when deployed live.
>
> > > > How do I overcome this problem if the user uploads a large file with
> > > > thousands of transactions?
>
> > > > I've seen solutions where you batch-put entities in chunks of 500.
> > > > That only works if you run a custom upload tool on your computer, not
> > > > from a browser, since the request is limited to 30 seconds. Am I forced
> > > > to use the Task Queue? But where do I store the raw uploaded file or,
> > > > preferably, the parsed interim transaction entities when the task isn't
> > > > executing?
>
> > > > Funny that App Engine has a 10-megabyte request (file upload) size limit
> > > > when storing 10 megabytes' worth of entities seems to be so hard.
>
>



[google-appengine] Re: Transactionally updating multiple entities over 1MB

2009-08-16 Thread Stakka

Weird, I just hit a limit on the size of a transaction when committing:

"java.lang.IllegalArgumentException: datastore transaction or write
too big."

All (23) entities in the transaction were in the same entity group,
not using batch put, and ~990k in size.
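
Nick's point further down (the 1MB limit applies per API call, not per
transaction) suggests splitting the writes into several put() calls
inside one transaction. A Python sketch of that pattern (though, as the
report above shows, an overall "transaction or write too big" cap can
still apply):

from google.appengine.ext import db

def put_in_one_transaction(entities):
    # All entities must share one entity group; each put() stays under
    # the ~1MB per-call limit while the group commits atomically.
    def txn():
        for entity in entities:
            db.put(entity)
    db.run_in_transaction(txn)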


On 23 July, 12:30, "Nick Johnson (Google)"  wrote:
> Hi Juraj,
>
> No, there's no limit to the size of an entity group - only on the maximum
> rate at which you can update entities in a single entity group.
>
> -Nick Johnson
>
>
>
> On Fri, Jul 17, 2009 at 4:03 PM, Juraj Vitko  wrote:
>
> > Nick, just one clarification (I can't find it in the docs) - is there a
> > limit on the total size of an entity group?
>
> > On Jun 29, 12:28 pm, "Nick Johnson (Google)" 
> > wrote:
> > > On Sat, Jun 27, 2009 at 4:14 PM, Andy Freeman wrote:
>
> > > >> > Does that mean that db.put((e1, e2, e3,)) where all of the entities
> > > >> > are 500kb will fail?
>
> > > >> Yes.
>
> > > > Thanks.
>
> > > > I'll take this opportunity to promote a couple of related feature
> > > > requests.
>
> > > > (1) We need a way to estimate entity sizes
> > > >http://code.google.com/p/googleappengine/issues/detail?id=1084
>
> > > The 1MB limit is on the API call, rather than the entity itself, per
> > > se, so index size doesn't count against the 1MB limit. You can always
> > > serialize the entity yourself and check its size, though that requires
> > > touching datastore-internal methods.
>
> > > > (2) We need a way to help predict when datastore operations will fail
> > > >http://code.google.com/p/googleappengine/issues/detail?id=917
>
> > > > I assume that db.get((k1, k2,)) can fail because of size reasons when
> > > > db.get(k1) followed by db.get(k2) will succeed.  Does db.get((k1,
> > > > k2,)) return at least one entity in that case?
>
> > > No, the operation will simply fail. Given that it's an invariant that
> > > the returned list has the same length as the passed list, there's no
> > > sensible way to return partial results without implying that certain
> > > entities didn't exist when they actually do.
>
> > > -Nick Johnson
>
> > > > On Jun 26, 9:36 am, "Nick Johnson (Google)" 
> > > > wrote:
> > > >> On Fri, Jun 26, 2009 at 4:42 PM, Andy Freeman wrote:
>
> > > >> > >  the 1MB limit applies only to single API calls
>
> > > >> > Does that mean that db.put((e1, e2, e3,)) where all of the entities
> > > >> > are 500kb will fail?
>
> > > >> Yes.
>
> > > >> > Where are limits on the total size per call documented?
>
> > > >> > http://code.google.com/appengine/docs/python/datastore/overview.html#...
> > > >> > only mentions a limit on the size of individual entities and the total
> > > >> > number of entities for batch methods.  The batch method documentation
> > > >> > (http://code.google.com/appengine/docs/python/datastore/functions.html
> > > >> > and http://code.google.com/appengine/docs/python/memcache/functions.html)
> > > >> > does not mention any limits.
>
> > > >> You're right - we need to improve our documentation in that area. The
> > > >> 1MB limit applies to _all_ API calls.
>
> > > >> > Is there a documented limit on the number of entities per memcache
> > > >> > call?
>
> > > >> No.
>
> > > >> > BTW - There is a typo in
> > > >> > http://code.google.com/appengine/docs/python/memcache/overview.html#Q...
> > > >> > It says "In addition to quotas, the following limits apply to the use
> > > >> > of the Mail service:" instead of "Memcache service".
>
> > > >> Thanks for the heads-up.
>
> > > >> -Nick Johnson
>
> > > >> > On Jun 26, 7:28 am, "Nick Johnson (Google)" <nick.john...@google.com> wrote:
> > > >> > > Hi tav,
>
> > > >> > > Batch puts aren't transactional unless all the entities are in the
> > > >> > > same entity group. Transactions, however, _are_ transactional, and the
> > > >> > > 1MB limit applies only to single API calls, so you can make multiple
> > > >> > > puts to the same entity group in a transaction.
>
> > > >> > > -Nick Johnson
>
> > > >> > > On Fri, Jun 26, 2009 at 8:53 AM, tav wrote:
>
> > > >> > > > Hey guys and girls,
>
> > > >> > > > I've got a situation where I'd have to "transactionally" update
> > > >> > > > multiple entities which would cumulatively be greater than the 1MB
> > > >> > > > datastore API limit... is there a decent solution for this?
>
> > > >> > > > For example, let's say that I start off with entities E1, E2, E3 which
> > > >> > > > are all about 400kb each. All the entities are specific to a given
> > > >> > > > User. I grab them all on a "remote node" and do some calculations on
> > > >> > > > them to yield new "computed" entities E1', E2', and E3'.
>
> > > >> > > > Any failure of the remote node or the datastore is recoverable except
> > > >> > > > when the remote node tries to *update* the datastore... in that
> > > >> > > > situation, it'd have to batch the update into 2 separate .put() calls
> > > >> > > > to overcome the 1MB limit. And should the remote

[google-appengine] Re: urlfetch doesn't work with non-ASCII characters in PARAMETERS

2009-08-16 Thread Sylvain

Can you test this?

import urllib

q = urllib.quote(q.encode('utf8'))
# or just:
# q = urllib.quote(q)

Regards

On Aug 16, 5:35 pm, Edmar  wrote:
> Every time I try to fetch a result using GAE urlfetch and one of the
> parameters has a non-ASCII character, I get an error.
>
> I want to fetch this url :
>
> http://ajax.googleapis.com/ajax/services/search/web?v=1.0&rsz=large&q...ão
>
> My problem is with the q parameter: other words work as the query, but
> when I use non-English words I see this error :
>
>  File "/base/python_lib/versions/1/google/appengine/api/urlfetch_service_pb.py", line 1596, in OutputUnchecked
>    out.putPrefixedString(self.url_)
>  File "/base/python_lib/versions/1/google/net/proto/ProtocolBuffer.py", line 365, in putPrefixedString
>    v = str(v)
> UnicodeEncodeError: 'ascii' codec can't encode characters in position
> 96-99: ordinal not in range(128)
>
> Some Times if



[google-appengine] urlfetch doesn't work with non-ASCII characters in PARAMETERS

2009-08-16 Thread Edmar

Every time I try to fetch a result using GAE urlfetch and one of the
parameters has a non-ASCII character, I get an error.

I want to fetch this url :

http://ajax.googleapis.com/ajax/services/search/web?v=1.0&rsz=large&q=ação

My problem is with the q parameter: other words work as the query, but
when I use non-English words I see this error :

 File "/base/python_lib/versions/1/google/appengine/api/urlfetch_service_pb.py", line 1596, in OutputUnchecked
   out.putPrefixedString(self.url_)
 File "/base/python_lib/versions/1/google/net/proto/ProtocolBuffer.py", line 365, in putPrefixedString
   v = str(v)
UnicodeEncodeError: 'ascii' codec can't encode characters in position
96-99: ordinal not in range(128)

Some Times if



[google-appengine] Re: Group search problem

2009-08-16 Thread Juraj Vitko

This is a real and serious issue.

I've just experienced it myself when searching for "getObjectsById" in
the appengine-java group, and I only got one result.
Then I searched through the main google.com search, and got many
results for the same group.

When I first saw this thread, I tried the query you were referring
to, and I think it returned more results than you said. So the
problem appears to be intermittent.

The search Jeff is suggesting below returns zero results for the
above query.

So I'd say: use google.com with the site:groups.google.com/... operator.

So, what's up, Google?



On Aug 10, 3:39 pm, jd  wrote:
> Hi,
>
> When I "search this group" for "transcript" there are no results.  But
> clearly there are results e.g.
>
> http://groups.google.com/group/google-appengine/browse_thread/thread/...



[google-appengine] Re: google-app-engine-django

2009-08-16 Thread Holger


App Engine has two modes of operation.

High-traffic applications are served directly out of the cache, within
a second for example.

For low-traffic applications, the uploaded code first has to be
prepared (unzipped, for example) and put into the cache before it can
be served. Thus low-traffic applications necessarily have a longer
request time - three seconds, for example.

In my experience, low traffic is everything below about 500 requests
per hour.

> New machine instantiations, which we all know happens quite frequently
New pages usually don't have a lot of visitors yet and are thus low-
traffic apps. But a lot of longer-existing pages have low traffic too.

Low traffic additionally means that each request consumes more
resources. Usually that doesn't matter, as for almost all low-traffic
pages these additional resources stay within the free quota.



When comparing the performance of different solutions it's important
that the solutions themselves are comparable.

If you don't need session management, user names, user accounts and
administration, you probably don't need Django and should use a
simpler and faster solution.

As for the performance of a Django solution, my tests with the
aep-sample copied to my own appID three months ago varied
for high traffic between 0.5 - 1.0 seconds request time and
for low traffic between 2.5 - 4.5 seconds request time.

I would be very interested to hear of current test results.

