[google-appengine] Re: Any Site in Production Use ?

2008-10-01 Thread Bill

Feris, Wordle.net is Jonathan's site, so I don't have any info other
than looking at the web pages.  It looks like the image thumbnails are
image blobs stored in the datastore and the Java applet displays graphics
on individual gallery pages.  So he'd have at least one intensive
request when doing the thumbnail processing and image put.  There's no
tagging or join models from what I can see, so his gallery page GET
should be reasonably fast.

I wonder what the most datastore-intensive and successful app out
there might be?  "Compare the Candidates"
(http://comparecandidates.appspot.com) looks reasonably complex with the
ability to select issues, source, etc., but I think that page might be
broken up into many requests via AJAX and the pieces reassembled.
Using a rich client (AJAX, Flex, etc.) that can handle server errors/
timeouts and multiple requests per page might be a good way to use App
Engine.

On Oct 1, 9:26 pm, "Feris Thia" <[EMAIL PROTECTED]> wrote:
> Hi Bill,
> Thanks for sharing. For the puts, I have tested them myself using a single
> thread and it goes over the CPU quota easily. So I think it is not ready for
> production yet.
>
> Indeed, wordle is very popular (page rank 7). I'd like to take a look at how they
> handle the datastore, or do you have any further info on that?
>
> Regards,
>
> Feris
>
> On Thu, Oct 2, 2008 at 11:00 AM, Bill <[EMAIL PROTECTED]> wrote:
>
> > You're talking about wordle.net, right?  Those are pretty successful
> > metrics.
>
> > A lot of the issues people have been having are on datastore timeouts,
> > particularly on puts.  It looks like the main processing for Wordle is
> > handled by a downloaded Java applet, and the datastore really gets
> > used when storing a wordle or returning a gallery page.  Do you run
> > into any timeouts or elevated CPU for wordle puts?  I'm using Google
> > Chrome and didn't have the Java plugin, so I noticed that even your images
> > are created via the Java applet, which suggests you're not storing image blobs(?).



[google-appengine] Re: Any Site in Production Use ?

2008-10-01 Thread Bill

You're talking about wordle.net, right?  Those are pretty successful
metrics.

A lot of the issues people have been having are on datastore timeouts,
particularly on puts.  It looks like the main processing for Wordle is
handled by a downloaded Java applet, and the datastore really gets
used when storing a wordle or returning a gallery page.  Do you run
into any timeouts or elevated CPU for wordle puts?  I'm using Google
Chrome and didn't have the Java plugin, so I noticed that even your images
are created via the Java applet, which suggests you're not storing image blobs(?).


On Oct 1, 5:33 pm, Jonathan Feinberg <[EMAIL PROTECTED]> wrote:
> On Sep 30, 5:21 am, "Feris Thia" <[EMAIL PROTECTED]> wrote:
>
> > I understand that GAE is still in preview only, but I just wonder if
> > anyone has used it in any semi or full production? And with how many page
> > hits / visitors per day?
>
> Yesterday was a good day; my app got 89,537 pageviews, according to
> Google Analytics. I typically hum along at 10-13 requests per second
> during the work day, according to the dashboard.
>
> The great thing about GAE is that I don't have to worry about being
> "slashdotted". If you do what you're supposed to do, App Engine just
> shrugs off a few hours of 50 requests per second. It has been a joy.



[google-appengine] Re: Datastore Key IDs

2008-10-01 Thread theo

They are supposed to be unique.  I have managed to make duplicate IDs,
though.  Keys are always unique, as far as I can tell.

On Oct 1, 9:31 am, ae <[EMAIL PROTECTED]> wrote:
> Hi,
>
> just wondering, do the datastore key IDs get recycled for an entity?
> or are they guaranteed to be unique for an entity?  e.g. if key IDs 7
> to 10 get deleted, do those key IDs ever get used again?
>
> Thanks



[google-appengine] Re: : Custom HTTP 404/500 messages

2008-10-01 Thread Alexander Kojevnikov

> I could not find any instructions on how to use settings.py with the
> plain GAE. However, I came across one of your older posts here:
>
> http://groups.google.com/group/google-appengine/browse_thread/thread/...
>
> In fact, that is exactly what I need! The idea of declaring a
> BaseHandler is just brilliant for dealing with Error 500, while the
> catch-all approach deals well with 404.

Built-in Django 0.96 and the webapp framework (which you call plain
GAE) are completely different beasts. The templates trick works only
for Django and there's no settings.py in webapp. If you are using the
latter, you can indeed use the base handler to catch 500 errors and a
catch-all handler to deal with 404.

Cheers,
Alex
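
For reference, a minimal webapp sketch of the approach described above; the
handler names, messages, and URL mapping are placeholders, not taken from the
original thread:

from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class BaseHandler(webapp.RequestHandler):
    """Base handler that turns unhandled exceptions into a friendly 500 page."""
    def handle_exception(self, exception, debug_mode):
        if debug_mode:
            # Fall back to webapp's default traceback page during development.
            webapp.RequestHandler.handle_exception(self, exception, debug_mode)
        else:
            self.error(500)
            self.response.out.write('Sorry, something went wrong.')

class MainPage(BaseHandler):
    def get(self):
        self.response.out.write('Hello, world')

class NotFound(BaseHandler):
    """Catch-all handler mapped last, so any unmatched URL gets a custom 404."""
    def get(self, path):
        self.error(404)
        self.response.out.write('No such page: /%s' % path)

application = webapp.WSGIApplication([
    ('/', MainPage),
    ('/(.*)', NotFound),  # must be the last mapping
], debug=False)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()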



[google-appengine] Re: Any Site in Production Use ?

2008-10-01 Thread Jonathan Feinberg

On Sep 30, 5:21 am, "Feris Thia" <[EMAIL PROTECTED]> wrote:

> I understand that GAE is still in preview only, but I just wonder if
> anyone has used it in any semi or full production? And with how many page
> hits / visitors per day?

Yesterday was a good day; my app got 89,537 pageviews, according to
Google Analytics. I typically hum along at 10-13 requests per second
during the work day, according to the dashboard.

The great thing about GAE is that I don't have to worry about being
"slashdotted". If you do what you're supposed to do, App Engine just
shrugs off a few hours of 50 requests per second. It has been a joy.



[google-appengine] undocumented size cap on memcache entries

2008-10-01 Thread Mahmoud

Just a heads up, in case you get bit by this like we did:

It seems that there is an undocumented size cap of 1 MB on memcache
entries. Our application caches a list of entities with a thumbnail
BlobProperty, and we weren't putting memcache.set() in a try/except
block. It turns out that it stopped working after a while, with the
following error in the logs:

"""
  File "/base/python_lib/versions/1/google/appengine/api/memcache/
__init__.py", line 539, in set
return self._set_with_policy(MemcacheSetRequest.SET, key, value,
time=time)
  File "/base/python_lib/versions/1/google/appengine/api/memcache/
__init__.py", line 602, in _set_with_policy
stored_value, flags = _validate_encode_value(value,
self._do_pickle)
  File "/base/python_lib/versions/1/google/appengine/api/memcache/
__init__.py", line 176, in _validate_encode_value
'received %d bytes' % (MAX_VALUE_SIZE, len(stored_value)))
ValueError: Values may not be more than 1000000 bytes in length;
received 1000685 bytes
"""

I guess we'll have to make those thumbnails into ReferenceProperties,
put them in their own entities and memcache them separately.
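
In case it helps anyone else, here is a minimal sketch of guarding memcache.set()
against oversized values; the size constant and function name are assumptions, not
part of the API:

import logging
import pickle

from google.appengine.api import memcache

MAX_CACHEABLE_BYTES = 1000000  # the cap we ran into is roughly 1 MB per entry

def safe_cache_set(key, value, time=0):
    """Cache a value, skipping (and logging) anything too large to store."""
    try:
        if len(pickle.dumps(value)) > MAX_CACHEABLE_BYTES:
            logging.warning('memcache skip: value for %r is too large', key)
            return False
        return memcache.set(key, value, time=time)
    except ValueError:
        # Raised by the memcache API when the encoded value exceeds its cap.
        logging.warning('memcache rejected oversized value for %r', key)
        return False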




[google-appengine] Re: A few feature suggestions

2008-10-01 Thread djidjadji

The <version> in the URL refers to the major version number in the app.yaml file.
You only see one subversion number for every major version; the
subversion is incremented on each upload.
The link does not contain a subversion but the word "latest", which
makes it possible to bookmark it:
http://2.latest.myapp.appspot.com
This makes it possible to try out a new version of the code without
troubling the current users. Make it the default when you think it's
stable.
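
A minimal app.yaml sketch of that setup (the application name is a placeholder);
the version field below is what shows up as the major version in the URL:

application: myapp
version: 2          # reachable at http://2.latest.myapp.appspot.com until made default
runtime: python
api_version: 1

handlers:
- url: /.*
  script: main.py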


2008/9/30 Robert Schultz <[EMAIL PROTECTED]>:
>
> - I am not sure I understand the logic behind the
> http://<version>.<app_id>.appspot.com
> approach.  I would expect that if I put in
> http://<previous version>.<app_id>.appspot.com I could view that version at that point,
> but the only version I can ever see is the latest.
> - Show each previous version in the admin console Versions list with




[google-appengine] Re: ProtocolBufferDecodeError

2008-10-01 Thread Oliver Zeigermann

I get the exception at exactly the same line while rendering a
template:

Traceback (most recent call last):
  File "/base/python_lib/versions/1/google/appengine/ext/webapp/
__init__.py", line 499, in __call__
handler.get(*groups)
  File "/base/data/home/apps/dailydogpicture/1.30/main.py", line 96,
in get
MainHandler.display(self)
  File "/base/data/home/apps/dailydogpicture/1.30/main.py", line 93,
in display
self.response.out.write(template.render('main.html', values))
  File "/base/python_lib/versions/1/google/appengine/ext/webapp/
template.py", line 81, in render
return t.render(Context(template_dict))
  File "/base/python_lib/versions/1/google/appengine/ext/webapp/
template.py", line 121, in wrap_render
return orig_render(context)
  File "/base/python_lib/versions/1/django/template/__init__.py", line
168, in render
return self.nodelist.render(context)
  File "/base/python_lib/versions/1/django/template/__init__.py", line
705, in render
bits.append(self.render_node(node, context))
  File "/base/python_lib/versions/1/django/template/__init__.py", line
718, in render_node
return(node.render(context))
  File "/base/python_lib/versions/1/django/template/defaulttags.py",
line 99, in render
values = list(values)
  File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 1257, in __iter__
return self.run()
  File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 1589, in run
query_run = self._proto_query.Run(*self._args, **self._kwds)
  File "/base/python_lib/versions/1/google/appengine/ext/gql/
__init__.py", line 581, in Run
res = bind_results.Get(self.__limit, offset)
  File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 938, in Get
return self._Run(limit, offset)._Next(limit)
  File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 1225, in _Next
apiproxy_stub_map.MakeSyncCall('datastore_v3', 'Next', req,
result)
  File "/base/python_lib/versions/1/google/appengine/api/
apiproxy_stub_map.py", line 46, in MakeSyncCall
stub.MakeSyncCall(service, call, request, response)
  File "/base/python_lib/versions/1/google/appengine/runtime/
apiproxy.py", line 246, in MakeSyncCall
rpc.CheckSuccess()
  File "/base/python_lib/versions/1/google/appengine/runtime/
apiproxy.py", line 189, in CheckSuccess
raise self.exception
ProtocolBufferDecodeError: Problem parsing scalar value into field:
stringValue

Sometimes at the same line I get a memory error:

File "/base/python_lib/versions/1/google/appengine/runtime/
apiproxy.py", line 189, in CheckSuccess
raise self.exception
MemoryError

I have just introduced memcache. Are there any memory restrictions
*I* have to obey when using it?

Thanks in advance

Oliver

On 1 Okt., 19:43, "Marzia Niccolai" <[EMAIL PROTECTED]> wrote:
> Hi,
>
> It seems like there is an issue reading the results from the datastore,
> probably from some corrupt data.
>
> Can you provide more details on the code that raises this error, the data
> model you are using, and what data is stored in your datastore?
>
> It's likely that the only way to fix this is to remove the data that is
> causing the error.
>
> -Marzia
>
> On Tue, Sep 30, 2008 at 7:09 PM, Jeff <[EMAIL PROTECTED]> wrote:
>
> > Has anyone seen this error before?  It is coming up ~ every 5th
> > attempt to load the site and I can't figure out what this is.  Anyone
> > have ideas about what could be causing this?  It seems to be happening
> > during a fetch of items from the Datastore.  Any suggestions on how to
> > fix this?
>
> > ProtocolBufferDecodeError at /
> > Problem parsing scalar value into field: stringValue
> > Request Method:         GET
> > Request URL:    http://3.latest.truefoodies.appspot.com/
> > Exception Type:         ProtocolBufferDecodeError
> > Exception Value:        Problem parsing scalar value into field:
> > stringValue
> > Exception Location:     /base/python_lib/versions/1/google/appengine/
> > runtime/apiproxy.py in CheckSuccess, line 189




[google-appengine] Re: ProtocolBufferDecodeError

2008-10-01 Thread Oliver Zeigermann

I have the same problem, but the data must be correct as most of the
time it works fine.

Oliver

On 1 Okt., 19:43, "Marzia Niccolai" <[EMAIL PROTECTED]> wrote:
> Hi,
>
> It seems like there is an issue reading the results from the datastore,
> probably from some corrupt data.
>
> Can you provide more details on the code that raises this error, the data
> model you are using, and what data is stored in your datastore?
>
> It's likely that the only way to fix this is to remove the data that is
> causing the error.
>
> -Marzia
>
> On Tue, Sep 30, 2008 at 7:09 PM, Jeff <[EMAIL PROTECTED]> wrote:
>
> > Has anyone seen this error before?  It is coming up ~ every 5th
> > attempt to load the site and I can't figure out what this is.  Anyone
> > have ideas about what could be causing this?  It seems to be happening
> > during a fetch of items from the Datastore.  Any suggestions on how to
> > fix this?
>
> > ProtocolBufferDecodeError at /
> > Problem parsing scalar value into field: stringValue
> > Request Method:         GET
> > Request URL:    http://3.latest.truefoodies.appspot.com/
> > Exception Type:         ProtocolBufferDecodeError
> > Exception Value:        Problem parsing scalar value into field:
> > stringValue
> > Exception Location:     /base/python_lib/versions/1/google/appengine/
> > runtime/apiproxy.py in CheckSuccess, line 189




[google-appengine] Re: Any Site in Production Use ?

2008-10-01 Thread Sylvain

Hi,

Currently my app gets 15k requests a day (20 req/min at peak).

Everything works very well:
- it's fast (particularly with memcache)
- very good frameworks: webapp, Django, ...
- nice dashboard
- easy to update

Except:
- the CPU/req warning: very hard to handle, and now I don't want to
add new functions because I'm at the CPU limits.

Regards


On 1 oct, 07:32, Bill <[EMAIL PROTECTED]> wrote:
> I know buddypoke.com is using AppEngine.  It's a fairly successful app
> under heavy load.  (I want to say hundreds of requests per second but
> can't say my memory is correct there.)
>
> On Sep 30, 2:21 am, "Feris Thia" <[EMAIL PROTECTED]> wrote:
>
> > Hi All,
>
> > I understand that GAE is still in preview only, but I just wonder if
> > anyone has used it in any semi or full production? And with how many page
> > hits / visitors per day?
>
> > --
> > Thanks & Best Regards,
>
> > Feris Thia



[google-appengine] Authenticated RPC from gadget to appengine service?

2008-10-01 Thread Duane

I have created a gadget and can make a simple RPC request to a service
I have implemented using the app engine.  I would like to store and
retrieve user specific information using the RPC service, but I want
to do it in a secure way.

How can I identify the logged in user of the gadget container
(OpenSocial container) and provide this information to the service in
a secure way?  I have looked at OAuth, but this appears to be a way to
allow users to authorize access to data from third-party gadgets.
Since I am the gadget and service author, I don't think this is
necessary.  My gadget is "phoning home."  I think I just need to get
the identity of the logged in user in a secure way.

The container makeRequest function does support OAUTH and SIGNED
authorization types, but these don't seem like what I need.

I would appreciate help on this.



[google-appengine] Re: multiple or single trips to the data store (speed and cpu cycles)

2008-10-01 Thread Sylvain

Hi,

Could you explain to me why it uses a dict to cache the query:
### GQL query cache ###
_query_cache = {}

and not memcache?

I'd like to have a "caching" best practice: global variable vs.
memcache.

Thank you

Regards



On 1 oct, 07:25, Bill <[EMAIL PROTECTED]> wrote:
> Try using fetch instead of treating the GqlQuery object as an
> iterable.  If you had included a LIMIT or OFFSET clause, it would
> automatically be retrieved by fetch.
> So it would be something like:
>
> users = db.GqlQuery(toquery).fetch(limit=1000)
>
> See http://code.google.com/appengine/docs/datastore/queryclass.html
>
> You can also cache the query like rietveld.
> See http://code.google.com/p/rietveld/source/browse/trunk/codereview/mode...
> and look at the gql method.
>
> On Sep 30, 7:17 pm, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
>
> > my app is producing significantly high mcycles used results. i have
> > debugged it and determined that the problem is in my for loop. So I
> > tried two versions and both are too slow.
>
> > Version 1:
>
> > somestuff = ['cat', 'dog', 'cow'];
>
> > toquery = "SELECT * FROM Animals Where id IN (" + somestuff + ")"
> > users = db.GqlQuery(toquery)
>
> > now i get a list of users that match that key and I do a for loop like
> > this:
>
> > for user in users:
> >     self.response.out.write(user.id)
>
> > Gives me a lot of mcycles with a red hazard warning. If I remove the
> > for loop the hazard goes away. The datastore query takes less than a
> > second.
>
> > VERSION 2
>
> > somestuff = ['cat', 'dog', 'cow'];
>
> > for animal in somestuff:
> >     toquery = "SELECT * FROM Moodster Where id = '" + animal + '\''
> >     for user in users:
> >         self.response.out.write(user.id)
>
> > in the latter version I am doing many queries (in this case 3) but
> > because there is only one user in users for each query (i make sure of
> > this), it takes less mcycles and gets a faster response
>
> > AM I DOING SOMETHING WRONG?
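
For what it's worth, a minimal sketch of the difference between the two kinds of
caching being discussed, using a made-up model (all names below are placeholders):
the module-level dict caches the *compiled* GqlQuery object, which only survives
within one runtime instance and saves re-parsing the GQL string, while memcache
caches the query *results* across instances.

from google.appengine.api import memcache
from google.appengine.ext import db

_query_cache = {}  # lives only as long as this runtime instance does

def gql(model_class, clause, *args, **kwds):
    """Return a cached, compiled GqlQuery and bind fresh arguments to it."""
    query_string = 'SELECT * FROM %s %s' % (model_class.kind(), clause)
    query = _query_cache.get(query_string)
    if query is None:
        query = db.GqlQuery(query_string)
        _query_cache[query_string] = query
    query.bind(*args, **kwds)
    return query

def cached_results(cache_key, model_class, clause, *args):
    """Cache the fetched results in memcache so other instances can reuse them."""
    results = memcache.get(cache_key)
    if results is None:
        results = gql(model_class, clause, *args).fetch(1000)
        memcache.set(cache_key, results, time=60)  # 60s is an arbitrary choice
    return results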



[google-appengine] Re: Should each developer really have to pack his libs with his app?

2008-10-01 Thread Peter Recore

Is there a particular pain point you are trying to fix with this
feature? I.e., you wish your libraries weren't using up most of your disk
quota, or it bothers your DRY principles to have the same library
copied into every app version you upload?  or are you hoping that the
libraries uploaded via this shell would be available to all apps, so
that once one person uploads a library, all apps can use it?


On Oct 1, 12:44 pm, Terrence Brannon <[EMAIL PROTECTED]> wrote:
> It seems monolithic and redundant to not have a way to install certain
> python libraries, hell why not all of pypi!
>
> But really, the place for libraries is not my application directory.
>
> If we can get a sandbox shell to install things in a library place,
> that is much better than expecting us to have it with our app code.



[google-appengine] Re: TypeError: 'NoneType' object is unsubscriptable

2008-10-01 Thread Marzia Niccolai
These types of queries don't usually need indexes - unless they are too
difficult to do without the index.  In this case you would need an index
that is along the lines of:
- kind: Article
  properties:
  - name: __searchable_text_index
  - name: __searchable_text_index
  - name: __searchable_text_index
etc, depending on the number of keywords you are using in your query.

Unfortunately, queries needing this type of index have a high correspondence
rate with exploding indexes (
http://code.google.com/appengine/docs/datastore/queriesandindexes.html#Big_Entities_and_Exploding_Indexes),
so it's probable that this index can't be built.

-Marzia

On Tue, Sep 30, 2008 at 3:35 PM, Venkatesh Rangarajan <
[EMAIL PROTECTED]> wrote:

> It's a searchable model? The documentation says they don't need indexes.
> Can you please advise me on what index I should add?
>
> Here is my query below.
>
> def db_visas(keyword, offset):
>  visas=[]
>  query = search.SearchableQuery('Visa')
>  query.Search(keyword)
>  for result in query.Get(101, offset):
>   visas.append(result)
>  return visas
>
>
>
> >
>




[google-appengine] Re: TypeError: 'NoneType' object is unsubscriptable

2008-10-01 Thread Marzia Niccolai
Hi,

This error seems like it is different.  From your description, without
seeing any code, it seems as though your entity doesn't have any value
stored in the spd field, so when you are trying to multiply it with an
integer value it's throwing an error.  What happens when you just write out
the field segment.spd?

For further help, code snippets would be greatly appreciated.

-Marzia
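
In case it helps, a minimal sketch of the kind of check being suggested, reusing
the Segment/spd names from the report (the default handling is an assumption):

from google.appengine.ext import db

class Segment(db.Model):
    spd = db.IntegerProperty()  # will be None on entities where it was never set

segment = Segment.all().get()
if segment is not None and segment.spd is not None:
    scaled = segment.spd * 10
else:
    scaled = 0  # or handle the missing value however makes sense for the app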

On Wed, Oct 1, 2008 at 12:27 AM, lf.hl <[EMAIL PROTECTED]> wrote:

>
> Hi Marzia,
>
> It seems I have a similar problem here and thought the missing index
> file was the reason for it. I have a model "Segment" with an integer
> member "spd". When I multiply "segment.spd" on an instance I got from a
> query on "Segment" with some int value, Python tells me the TypeError
> with int and NoneType. I don't have that problem on 32-bit Ubuntu, but
> I do on 64-bit Ubuntu and on GAE. I defined an index file manually, but
> appcfg tells me I don't need it for that type of query.
>
> I got rid of the problem on 64-bit Ubuntu by realigning the code. But
> that code did not work on GAE.
>
> Best, lf.hl
>
> On 1 Okt., 00:32, "Marzia Niccolai" <[EMAIL PROTECTED]> wrote:
> > Hi,
> >
> > Actually, upon further inspection, this is an issue of the query you are
> > running needing an index, but since the query doesn't need one in most
> > cases, we can't print the definition.
> >
> > Please add indexes for the queries experiencing these issues.
> >
> > -Marzia
> >
> > On Tue, Sep 30, 2008 at 2:47 PM, Venkatesh Rangarajan <
> >
> > [EMAIL PROTECTED]> wrote:
> > > And yes, my model is defined as a searchable entity. Using the default
> > > bulk-loader code with no tweaking.
> >
> > > def HandleEntity(self, entity):
> > > ent = search.SearchableEntity(entity)
> > > return ent
> >
> >
>
> >
>




[google-appengine] Re: Timeout - operation took too long

2008-10-01 Thread Alex Epshteyn

I've been seeing Timeouts on data put operations about 30-80 times a
day (which accounts for as much as 1-3% of all write requests) ever
since my app went into production on August 18.  This is happening
every day and it's very annoying.  It's worse during some periods
(e.g. when the GAE team reports outages), but it's been a constant
factor regardless of these issues.

It would be awesome if the App Engine team could investigate and put
together a set of best practices for coding and data modeling to
reduce the chance of timeouts, if this is at all possible.

Certainly it's a good idea to code defensively and expect these sorts
of exceptions, but there has to be a way to lower the timeout rate.
Everyone's logs are filling up with garbage because of this, making
it hard to find any other important failures.
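
For reference, a minimal sketch of the defensive pattern mentioned above, retrying
a put when the datastore raises db.Timeout; the retry count and backoff are
arbitrary assumptions:

import time
from google.appengine.ext import db

def put_with_retries(entity, retries=3):
    """Retry a datastore put a few times before giving up."""
    for attempt in range(retries):
        try:
            return entity.put()
        except db.Timeout:
            if attempt == retries - 1:
                raise
            time.sleep(0.1 * (attempt + 1))  # brief backoff before the next try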

Here are some example stack traces I've been getting:

  1) On Model.get_or_insert()

  a).
File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 862, in get_or_insert
  return run_in_transaction(txn)
File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 1407, in RunInTransaction
  result = function(*args, **kwargs)
File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 857, in txn
  entity = cls.get_by_key_name(key_name,
parent=kwds.get('parent'))
File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 779, in get_by_key_name
  return get(*keys)
File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 974, in get
  entities = datastore.Get(keys)
File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 211, in Get
  _MaybeSetupTransaction(req, keys)
File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 1502, in _MaybeSetupTransaction
  tx.handle)
File "/base/python_lib/versions/1/google/appengine/api/
apiproxy_stub_map.py", line 46, in MakeSyncCall
  stub.MakeSyncCall(service, call, request, response)
File "/base/python_lib/versions/1/google/appengine/runtime/
apiproxy.py", line 245, in MakeSyncCall
  rpc.Wait()
File "/base/python_lib/versions/1/google/appengine/runtime/
apiproxy.py", line 161, in Wait
  rpc_completed =
_apphosting_runtime___python__apiproxy.Wait(self)
DeadlineExceededError

  b).

  File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 862, in get_or_insert
return run_in_transaction(txn)
  File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 1431, in RunInTransaction
tx.handle, resp)
  File "/base/python_lib/versions/1/google/appengine/api/
apiproxy_stub_map.py", line 46, in MakeSyncCall
stub.MakeSyncCall(service, call, request, response)
  File "/base/python_lib/versions/1/google/appengine/runtime/
apiproxy.py", line 245, in MakeSyncCall
rpc.Wait()
  File "/base/python_lib/versions/1/google/appengine/runtime/
apiproxy.py", line 161, in Wait
rpc_completed = _apphosting_runtime___python__apiproxy.Wait(self)
  File "/base/python_lib/versions/1/google/appengine/runtime/
apiproxy.py", line 216, in __MakeCallDone
exception_entry[1] % (self.package, self.call))
  DeadlineExceededError

  c).

File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 862, in get_or_insert
  return run_in_transaction(txn)
File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 1407, in RunInTransaction
  result = function(*args, **kwargs)
File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 860, in txn
  entity.put()
File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 618, in put
  return datastore.Put(self._entity)
File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 162, in Put
  raise _ToDatastoreError(err)
File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 1627, in _ToDatastoreError
  raise errors[err.application_error](err.error_detail)
  Timeout

  d). (this is a very frequent trace)

File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 862, in get_or_insert
  return run_in_transaction(txn)
File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 1441, in RunInTransaction
  raise _ToDatastoreError(err)
File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 1627, in _ToDatastoreError
  raise errors[err.application_error](err.error_detail)
  Timeout

  2). On Model.save()

  a).
  File "/base/python_lib/versions/1/google/appengine/ext/db/
__init__.py", line 618, in put
return datastore.Put(self._entity)
  File "/base/python_lib/versions/1/google/appengine/api/
datastore.py", line 160, in Put
apiproxy_stub_map.MakeSyncCall('datastore_v3', 'Put', req, resp)
  File "/base/python_lib/versions/1/google/appengine

[google-appengine] Re: Release of AppEngine 1.1.4 SDK

2008-10-01 Thread Alex Epshteyn

What is the proper upgrade procedure for Windows?  I've just been
installing over the previous files (same directory).  Is this correct
or not?

Good to see releases coming out so frequently.

Alex

On Sep 27, 1:12 am, Rafe <[EMAIL PROTECTED]> wrote:
>   Hello,
>
>   This evening we have released the AppEngine 1.1.4 SDK.  It addresses
> some of the problems that Windows users have had with static files on
> dev_appserver.  If you are working on Windows and an existing
> application started telling you something like the following message,
> this fix is for you:
>
>   configuration file invalid:
>   regex does not compile: bogus escape: '\\xa'
>
>   You can download it from the project site here:
>
>  http://code.google.com/p/googleappengine/downloads/list
>
>   - Rafe Kaplan



[google-appengine] Re: using javascript libraries

2008-10-01 Thread Wooble

JavaScript doesn't run on the GAE servers, so the sandbox is completely
irrelevant to it.  You can serve, as static files, any JavaScript you
want; it's up to your user's browser whether to execute it.
On Oct 1, 12:33 pm, Terrence Brannon <[EMAIL PROTECTED]> wrote:
> The relevant 
> FAQ: http://code.google.com/appengine/kb/commontasks.html#thirdparty
>
> states that only Python 3rd party apps can be used.
>
> But what if there is a particular javascript library that I want to
> make use of?



[google-appengine] Re: using javascript libraries

2008-10-01 Thread Barry Hunter

App Engine can only execute Python code, so that is all you can upload
(and expect to run).

But JavaScript is (normally) a client-side language, so you can upload
a JavaScript lib to App Engine as a static file, which the browser just
downloads and runs *itself*.
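
For example, a minimal app.yaml sketch of serving a JavaScript library as a static
file (the js/ directory and application name are placeholders):

application: myapp
version: 1
runtime: python
api_version: 1

handlers:
- url: /js
  static_dir: js      # the browser downloads and runs these scripts itself
- url: /.*
  script: main.py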



On Wed, Oct 1, 2008 at 5:33 PM, Terrence Brannon <[EMAIL PROTECTED]> wrote:
>
> The relevant FAQ: 
> http://code.google.com/appengine/kb/commontasks.html#thirdparty
>
> states that only Python 3rd party apps can be used.
>
> But what if there is a particular javascript library that I want to
> make use of?
>
> >
>



-- 
Barry

- www.nearby.org.uk - www.geograph.org.uk -




[google-appengine] Timeout : how to get rid of timeouts ?

2008-10-01 Thread Venkatesh Rangarajan
Hello there,


Question 1: I have an entity kind with 700K records. I am running search queries
on those and keep getting timeout errors consistently.
Please advise on how I could avoid these errors. I have added a
__searchable_text_index and it has been "building" for more than 24 hours.

Question 2: Did anybody figure out how to retrieve more than 1000 records?
(See the sketch after these questions.)

Question 3: Is Google working on providing "wrappers" for the datastore? Like a
wrapper to get count(*) or sum(column), retrieve more than 1000 records,
resume a bulk upload, etc. I know each one of us is writing our own code to
achieve this, but it would be desirable for Google to provide optimized
wrappers to make developers' lives easier. Please add another level of
abstraction to the datastore.
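
On question 2 above, one commonly suggested workaround is to page through entities
ordered by __key__, fetching a batch at a time; a minimal sketch, assuming the SDK
in use supports __key__ queries (the model class and batch size are placeholders):

from google.appengine.ext import db

def iterate_all(model_class, batch_size=500):
    """Yield every entity of a kind, paging by key to get past the 1000-result limit."""
    query = model_class.all().order('__key__')
    batch = query.fetch(batch_size)
    while batch:
        for entity in batch:
            yield entity
        last_key = batch[-1].key()
        query = model_class.all().order('__key__').filter('__key__ >', last_key)
        batch = query.fetch(batch_size)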

I am starting to get frustrated with Google App Engine. I run into
performance/scalability issues for simple operations; simple things take
way too long and they never scale.

The marketing statements below are false at this point... things are way too
convoluted.
 *No assembly required.*
Google App Engine exposes a fully-integrated development
environment.
   *It's easy to scale.*
Google App Engine makes it easy to design scalable applications that grow
from one to millions of users without infrastructure headaches.

The error trace:

Traceback (most recent call last):
  File "/base/python_lib/versions/1/google/appengine/ext/webapp/__init__.py",
line 499, in __call__
handler.get(*groups)
  File "/base/data/home/apps/payrate/9.10/Main.py", line 176, in get
for result in get_visas(keyword, int(page),s):
  File "/base/data/home/apps/payrate/9.10/Main.py", line 138, in get_visas
visas = db_visas(keyword,offset)
  File "/base/data/home/apps/payrate/9.10/Main.py", line 148, in db_visas
for result in query.Get(101, offset):
  File "/base/python_lib/versions/1/google/appengine/api/datastore.py",
line 938, in Get
return self._Run(limit, offset)._Next(limit)
  File "/base/python_lib/versions/1/google/appengine/api/datastore.py",
line 1227, in _Next
raise _ToDatastoreError(err)
  File "/base/python_lib/versions/1/google/appengine/api/datastore.py",
line 1627, in _ToDatastoreError
raise errors[err.application_error](err.error_detail)
Timeout: datastore timeout: operation took too long.




[google-appengine] Re: multiple or single trips to the data store (speed and cpu cycles)

2008-10-01 Thread Marzia Niccolai
Hi,

In addition to Bill's advice, it's worth noting that the IN query behind the
scenes just ends up performing and aggregating multiple 'equals' queries.

From http://code.google.com/appengine/docs/datastore/gqlreference.html:

"*Note:* The IN and != operators use multiple queries behind the scenes. For
example, the IN operator executes a separate underlying datastore query for
every item in the list. The entities returned are a result of the
cross-product of all the underlying datastore queries and are de-duplicated.
A maximum of 30 datastore queries are allowed for any single GQL query."

-Marzia
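
A minimal sketch of what that means in practice, reusing the Animals example from
the quoted thread below (the property name comes from that example):

from google.appengine.ext import db

somestuff = ['cat', 'dog', 'cow']

# A single IN query...
users = db.GqlQuery(
    "SELECT * FROM Animals WHERE id IN :1", somestuff).fetch(1000)

# ...is roughly equivalent to one equality query per list item, with the
# results combined (the real IN operator also de-duplicates them):
merged = []
for animal in somestuff:
    merged.extend(
        db.GqlQuery("SELECT * FROM Animals WHERE id = :1", animal).fetch(1000))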

On Tue, Sep 30, 2008 at 10:25 PM, Bill <[EMAIL PROTECTED]> wrote:

>
> Try using fetch instead of treating the GqlQuery object as an
> iterable.  If you had included a LIMIT or OFFSET clause, it would
> automatically be retrieved by fetch.
> So it would be something like:
>
> users = db.GqlQuery(toquery).fetch(limit=1000)
>
> See http://code.google.com/appengine/docs/datastore/queryclass.html
>
> You can also cache the query like rietveld.
> See
> http://code.google.com/p/rietveld/source/browse/trunk/codereview/models.py
> and look at the gql method.
>
>
> On Sep 30, 7:17 pm, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
> > my app is producing significantly high mcycles used results. i have
> > debugged it and determined that the problem is in my for loop. So I
> > tried two versions and both are too slow.
> >
> > Version 1:
> >
> > somestuff = ['cat', 'dog', 'cow'];
> >
> > toquery = "SELECT * FROM Animals Where id IN (" + somestuff + ")"
> > users = db.GqlQuery(toquery)
> >
> > now i get a list of users that match that key and I do a for loop like
> > this:
> >
> > for user in users:
> >     self.response.out.write(user.id)
> >
> > Gives me a lot of mcycles with a red hazard warning. If I remove the
> > for loop the hazard goes away. The datastore query takes less than a
> > second.
> >
> > VERSION 2
> >
> > somestuff = ['cat', 'dog', 'cow'];
> >
> > for animal in somestuff:
> >     toquery = "SELECT * FROM Moodster Where id = '" + animal + '\''
> >     for user in users:
> >         self.response.out.write(user.id)
> >
> > in the latter version I am doing many queries (in this case 3) but
> > because there is only one user in users for each query (i make sure of
> > this), it takes less mcycles and gets a faster response
> >
> > AM I DOING SOMETHING WRONG?
> >
>




[google-appengine] Re: What to expect if an account is created after a User instance?

2008-10-01 Thread Marzia Niccolai
Hi,

Taking the last question first, App Engine allows authentication against
either Google Accounts or Google Apps. A full explanation of the options can
be found at: http://code.google.com/appengine/articles/auth.html

As for the first part, which is the same for both Google Accounts and Google
Apps accounts, this is covered in our documentation (
http://code.google.com/appengine/docs/users/userobjects.html):

"If the User constructor is called with an email address that does not
correspond with a valid Google account, the object will be created but it
will not correspond with a real Google account. This will be the case even
if someone creates a Google account with the given email address after the
object is stored. A User value with an email address that does not represent
a Google account at the time it is created will never match a User value
that represents a real user."

-Marzia

On Tue, Sep 30, 2008 at 10:08 PM, pr3d4t0r <[EMAIL PROTECTED]> wrote:

>
> Greetings.
>
> I'm trying to determine what the behaviour for a User instance (and
> the application) is if the Google Account is created *after* a User is
> instantiated and added to the Datastore.
>
> Scenario 1:
>
> a) User instance is created with email = [EMAIL PROTECTED]
> b) [EMAIL PROTECTED] is created in Gmail
> c) Application checks if it has a valid user; if not, user signs in
> d) User continues to use the application
>
>
> Scenario 2:
>
> Same as scenario 1 except that the email address is
> [EMAIL PROTECTED]; somedomain.com is a Google hosted domain
> with Google Apps.
>
>
> I'm almost 100% sure that the User object in the Datastore has no
> relevance until the matching account is created.  Prior to that it's
> just taking space in the Datastore but it's otherwise unreachable, at
> least from the application's point of view, until an account is
> created and it can be associated with it through the hasValidUser() or
> the get_current_user() methods.
>
> Has Google published any plans to allow sign-in through accounts on
> third-party domains that are Google hosted?
>
> Thanks in advance and cheers,
>
> pr3d4t0r
> http://www.istheserverup.com
>
> >
>




[google-appengine] Re: ProtocolBufferDecodeError

2008-10-01 Thread Marzia Niccolai
Hi,

It seems like there is an issue reading the results from the datastore,
probably from some corrupt data.

Can you provide more details on the code that raises this error, the data
model you are using, and what data is stored in your datastore?

It's likely that the only way to fix this is to remove the data that is
causing the error.

-Marzia

On Tue, Sep 30, 2008 at 7:09 PM, Jeff <[EMAIL PROTECTED]> wrote:

>
> Has anyone seen this error before?  It is coming up ~ every 5th
> attempt to load the site and I can't figure out what this is.  Anyone
> have ideas about what could be causing this?  It seems to be happening
> during a fetch of items from the Datastore.  Any suggestions on how to
> fix this?
>
> ProtocolBufferDecodeError at /
> Problem parsing scalar value into field: stringValue
> Request Method: GET
> Request URL:http://3.latest.truefoodies.appspot.com/
> Exception Type: ProtocolBufferDecodeError
> Exception Value:Problem parsing scalar value into field:
> stringValue
> Exception Location: /base/python_lib/versions/1/google/appengine/
> runtime/apiproxy.py in CheckSuccess, line 189
>
> >
>




[google-appengine] Should each developer really have to pack his libs with his app?

2008-10-01 Thread Terrence Brannon

It seems monolithic and redundant to not have a way to install certain
Python libraries; hell, why not all of PyPI!

But really, the place for libraries is not my application directory.

If we can get a sandbox shell to install things in a library place,
that is much better than expecting us to have it with our app code.





[google-appengine] using javascript libraries

2008-10-01 Thread Terrence Brannon

The relevant FAQ: 
http://code.google.com/appengine/kb/commontasks.html#thirdparty

states that only Python 3rd party apps can be used.

But what if there is a particular javascript library that I want to
make use of?




[google-appengine] Datastore Key IDs

2008-10-01 Thread ae

Hi,

just wondering, do the datastore key IDs get recycled for an entity?
or are they guaranteed to be unique for an entity?  e.g. if key IDs 7
to 10 get deleted, do those key IDs ever get used again?

Thanks



[google-appengine] Contract development for an application?

2008-10-01 Thread Mark From Dallas

Can anyone recommend a great location where I can post notice of a
development job to convert a simple online database app (MySQL, PHP,
HTML) to run on App Engine?

I have a job open on RentACoder for the work, but I would like to
spread the word among App Engine developers as well.

Thanks.




[google-appengine] Re: TypeError: 'NoneType' object is unsubscriptable

2008-10-01 Thread lf.hl

Hi Marzia,

It seems I have a similar problem here and thought the missing index
file was the reason for it. I have a model "Segment" with an integer
member "spd". When I multiply "segment.spd" on an instance I got from a
query on "Segment" with some int value, Python tells me the TypeError
with int and NoneType. I don't have that problem on 32-bit Ubuntu, but
I do on 64-bit Ubuntu and on GAE. I defined an index file manually, but
appcfg tells me I don't need it for that type of query.

I got rid of the problem on 64-bit Ubuntu by realigning the code. But
that code did not work on GAE.

Best, lf.hl

On 1 Okt., 00:32, "Marzia Niccolai" <[EMAIL PROTECTED]> wrote:
> Hi,
>
> Actually, upon further inspection, this is an issue of the query you are
> running needing an index, but since the query doesn't need one in most
> cases, we can't print the definition.
>
> Please add indexes for the queries experiencing these issues.
>
> -Marzia
>
> On Tue, Sep 30, 2008 at 2:47 PM, Venkatesh Rangarajan <
>
> [EMAIL PROTECTED]> wrote:
> > And yes, my model is defined as a searchable entity. Using the default
> > bulk-loader code with no tweaking.
>
> > def HandleEntity(self, entity):
> >     ent = search.SearchableEntity(entity)
> >     return ent
>
>




[google-appengine] Re: Any Site in Production Use ?

2008-10-01 Thread Feris Thia
Hi Theo,

On Tue, Sep 30, 2008 at 9:53 PM, theo <[EMAIL PROTECTED]> wrote:

> I'd like to second this, and perhaps offer a bit of clarification.
>
> 1. Poor documentation.
>
>
Wow, I just found this out... I'll be more aware from now on.


>
>
> 2. Poor uptime compared to commercial services
>
> Everyone and their mother was atwitter over the S3 downtime some time
> ago.  App Engine regularly has such downtimes.  App Engine is a much
> more complex system to manage, so that is to be expected somewhat,
> still - take that into account.  To partially mitigate this, you can
> follow a few websites:
>
> cloudstatus.com
> http://groups.google.com/group/google-appengine-downtime-notify
>

I have bookmarked the site and joined the group. Thanks for the references.

>
>
> Besides that, the platform is great.  The quotas are amazingly high
> (if you don't hit the high CPU quota).  The really great thing is that
> App Engine (especially with the django helper) is essentially Django,
> so there is little risk in developing for App Engine.  If you find
> that you can't continue to host your project on App Engine, you can
> easily port it over to Django on something like slicehost or AWS.
>

That's what I'm doing now. I developed it in Django and will see if there is a more
reliable environment to move to.

Again, thanks for the great resources. Really appreciate it.

Regards,

Feris




[google-appengine] Re: Any Site in Production Use ?

2008-10-01 Thread Feris Thia
Hi Mitnickcbc,

Thank you for your thorough review. Actually I'm going to use it as a kind
of social networking site. It is developed in Django, and I am considering porting
the model to the Google datastore.

But I myself have faced some CPU issues, and suspected the same for others,
which you have confirmed.

Anyway, meanwhile I'll have to wait until GAE is released as a full
edition.

Thanks again for the reviews and suggestions, really appreciate it.

Regards,

Feris

On Tue, Sep 30, 2008 at 5:05 PM, mitnickcbc <[EMAIL PROTECTED]> wrote:

> If you are building something that is not going to have a heavy load, you can
> use GAE in production without much problem. But surely not a business
> application; otherwise your customers will shout at you and you really
> can do nothing to help them.
>




[google-appengine] Re: Question about SDK Redistribution

2008-10-01 Thread ericsk

Thanks for your explanation :-)

On Sep 30, 9:29 pm, Wooble <[EMAIL PROTECTED]> wrote:
> I'm not a lawyer, but the SDK is under the Apache license, so yes, you
> should be able to redistribute it as long as you follow the terms of
> the license.  I believe your package can be under just about any
> license you want except GPLv2; the Apache license has language about
> patents that is considered incompatible with that license.
>
> On Sep 30, 3:44 am, ericsk <[EMAIL PROTECTED]> wrote:
>
> > Dear all,
>
> > If I develop an application, am I permitted to include App Engine's
> > SDK in my distributed package?
> > And, should my package declare the same license?
>
> > Thanks



[google-appengine] For those looking for .NET in the cloud

2008-10-01 Thread Andrew Badera
... and who can't wait for Microsoft Oslo to come out:

http://aws.typepad.com/aws/2008/10/coming-soon-ama.html

-- 
Thanks-
- Andy Badera
- [EMAIL PROTECTED]
- (518) 641-1420

- http://higherefficiency.net
- http://changeroundup.com/

- http://flipbitsnotburgers.blogspot.com/
- http://andrew.badera.us/

- Google me: http://www.google.com/search?q=andrew+badera




[google-appengine] Re: Google App Engine Development Tools (gaedt)

2008-10-01 Thread Sylvain

Hi,

New feature:

You could add an option to add a counter, like this:
ask for the counter's name, then add this counter.py:
http://paste.blixt.org/1581 (from Google I/O)

Many people wonder how to add a counter.

Regards
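
For anyone who can't reach the paste, a minimal sketch of the sharded-counter idea
from that Google I/O sample (the shard count and model name are assumptions):

import random
from google.appengine.ext import db

NUM_SHARDS = 20  # more shards means more write throughput but slower reads

class CounterShard(db.Model):
    name = db.StringProperty(required=True)
    count = db.IntegerProperty(default=0)

def increment(name):
    """Add one to the named counter by updating a random shard in a transaction."""
    index = random.randint(0, NUM_SHARDS - 1)
    shard_key_name = '%s-%d' % (name, index)
    def txn():
        shard = CounterShard.get_by_key_name(shard_key_name)
        if shard is None:
            shard = CounterShard(key_name=shard_key_name, name=name)
        shard.count += 1
        shard.put()
    db.run_in_transaction(txn)

def get_count(name):
    """Sum every shard that belongs to the named counter."""
    total = 0
    for shard in CounterShard.all().filter('name =', name):
        total += shard.count
    return total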


On 1 oct, 14:28, EricWittmann <[EMAIL PROTECTED]> wrote:
> Just a note - we have released version 0.8.5 of gaedt.  Go here for
> the project page:
>
> http://code.google.com/p/gaedt/
>
> Or go here to view the development blog:
>
> http://gaedt-dev.blogspot.com/



[google-appengine] Re: Google App Engine Development Tools (gaedt)

2008-10-01 Thread EricWittmann

Just a note - we have released version 0.8.5 of gaedt.  Go here for
the project page:

http://code.google.com/p/gaedt/

Or go here to view the development blog:

http://gaedt-dev.blogspot.com/





[google-appengine] Re: production server behavior is different than dev_appserver

2008-10-01 Thread warreninaustintexas

Follow-up: Deleted the autogenerated indexes in the index.yaml file,
reran through dev_appserver enough to rebuild the indexes, re-uploaded
and it works now.




[google-appengine] Re: bulk export

2008-10-01 Thread I.K.

Excellent.  Thanks for that.

I'm afraid I can't promise I will get an opportunity to try it in the
near future.  My app is still far from being usable and I only get a
couple of hours a week to code.

I will feed back any findings to you.  I have worked on an open source
project in the past, so I'm still in the habit of sharing.

Thanks


On Oct 1, 1:34 am, Garrett Davis <[EMAIL PROTECTED]> wrote:
> Yes.  I built a bulk download module. Please read about it here:
> http://code.google.com/p/gawsh/wiki/BulkDownload
> and download it from http://code.google.com/p/gawsh/downloads/list
>
> If it works for you, please let me know.
> If it doesn't work for you, please let me know that as well.
>
> On Sep 26, 4:24 am, "I.K." <[EMAIL PROTECTED]> wrote:
>
>
>
> > Oops! This was supposed to be a reply to the bulk upload post, but
> > the question is still valid.
>
> > On Sep 26, 12:18 pm, "I.K." <[EMAIL PROTECTED]> wrote:
>
> > > Hi,
>
> > > Sorry to piggy-back off a slightly different topic, but it may have
> > > the right audience.
>
> > > I am toying with a mashup of Google Spreadsheets and GAE, which
> > > will partly involve pumping data back and forth.  You have made me
> > > aware of the bulkloader, which should be a useful way to get a lot of
> > > data into GAE; however, does anybody have a suitably efficient version
> > > of the reverse, i.e. getting a lot of data out of the datastore as CSV,
> > > without just coding it yourself?
>
> > > thanks
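
For the CSV question quoted above, a minimal handler sketch for readers
who do end up coding it themselves (MyModel, its properties and the
/export.csv route are hypothetical stand-ins, not part of gawsh):

import csv

from google.appengine.ext import db, webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MyModel(db.Model):  # hypothetical kind; substitute your own
    name = db.StringProperty()
    score = db.IntegerProperty()

class ExportCsv(webapp.RequestHandler):
    """Write the first batch of MyModel entities out as CSV."""
    def get(self):
        self.response.headers['Content-Type'] = 'text/csv'
        writer = csv.writer(self.response.out)
        writer.writerow(['key', 'name', 'score'])
        # fetch() was capped at 1000 results per call at the time; a real
        # exporter would page through the data in smaller batches.
        for entity in MyModel.all().fetch(1000):
            writer.writerow([str(entity.key()), entity.name, entity.score])

application = webapp.WSGIApplication([('/export.csv', ExportCsv)])

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()

A real exporter would also want paging and an admin-only URL; this only
shows the shape of the handler.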



[google-appengine] Re: Still having trouble with high-cpu warnings for thumbnails.

2008-10-01 Thread iceanfire

I will try removing the debug stuff; maybe that is what's causing this
problem. I'll let you know if this helps.

Thanks.
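
For reference, turning debug off for the webapp framework is a one-line
change; a minimal sketch (the handler and route are illustrative):

from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainPage(webapp.RequestHandler):
    def get(self):
        self.response.out.write('hello')

# debug=True makes webapp render tracebacks into the response; leaving it
# at the default (False) is what the advice quoted below is about.
application = webapp.WSGIApplication([('/', MainPage)], debug=False)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()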

On Sep 29, 6:44 am, Sylvain <[EMAIL PROTECTED]> wrote:
> Hi,
>
> memcache can decrease the average, that's all.
> memcache is really good, but it can't be the answer for everything.
>
> Otherwise, I think there is no "good" solution to this CPU warning.
> For now, I'm just waiting for higher quotas (free or not).
>
> Did you remove all the "DEBUG" flags (webapp, the django render
> option, ...)?  They use a lot of CPU.
> The profiler code uses a lot of CPU too.
>
> Regards
>
> On 29 sep, 12:57, Arun Shanker Prasad <[EMAIL PROTECTED]>
> wrote:
>
> > Hi Barry,
>
> > I know that memcache is free, but it will push out stuff if memory
> > usage is high.  I already have much bigger query results cached in
> > memcache, and I want to keep them there as long as possible. :)
>
> > Thanks,
> > Arun Shanker Prasad.
>
> > On Sep 29, 3:44 pm, "Barry Hunter" <[EMAIL PROTECTED]>
> > wrote:
>
> > > On Mon, Sep 29, 2008 at 11:18 AM, Arun Shanker Prasad
>
> > > <[EMAIL PROTECTED]> wrote:
>
> > > > My app is also causing the same problems.  I have used etags to set the
> > > > response to 302 if cached, and I've tried everything short of memcache.
> > > > I have many images, so I don't think memcache is a viable solution for me.
>
> > > Why not? memcache is 'free'; there doesn't seem to be any reason NOT to use it.
>
> > > memcache is already designed to keep 'hot' items in the cache - so it
> > > will automatically discard little-used images (or whatever you
> > > store).
>
> > > I would have thought you should be looking to cache everything possible.
>
> > > > Any suggestions are welcome.
>
> > > > On Sep 29, 2:10 am, iceanfire <[EMAIL PROTECTED]> wrote:
> > > >> Thanks for the suggestions. I see the need for caching to reduce the
> > > >> load, but I don't understand why the current request is causing high-
> > > >> CPU warnings (2 times the average CPU per request). At this rate, if a
> > > >> few first-time users use my application and try to open images that
> > > >> haven't been memcached yet, then my application will crash.
>
> > > >> My main question is: why is this request causing a high-CPU warning?
> > > >> No one has been able to answer that. For a request that only takes
> > > >> "0.020 CPU seconds" (according to the profiler), App Engine sends
> > > >> out a warning stating that 2463 megacycles have been used.
>
> > > >> So my question is: how do I completely stop high-CPU warnings for a
> > > >> request that shouldn't be causing them in the first place?
>
> > > >> I understand that there are ways I can mitigate the problem, but I'd
> > > >> like to get to the root cause if possible.
>
> > > >> thanks!
>
> > > >> On Sep 25, 9:26 am, "Bryan A. Pendleton" <[EMAIL PROTECTED]> wrote:
>
> > > >> >     -So are textProperties more efficient than StringProperties
> > > >> >     because they're not indexed?
>
> > > >> >     You'd have to find the talk from Google IO to be sure. I believe
> > > >> >     it was the one about scalability, in the QA section. But yes,
> > > >> >     that is my understanding.
>
> > > >> > As I understand it, every field that's not a TextProperty or a
> > > >> > BlobProperty is implicitly indexed (this is how all = conditions are
> > > >> > dealt with in queries). So, whenever you write such an object, it
> > > >> > will take longer (because of the index updates).
>
> > > >> > Another way of thinking about it, is that if you never need to query
> > > >> > on a single value, make it a TextProperty or BlobProperty, if
> > > >> > possible.
>
> > > >> >     -Wouldn't adding etag--while increasing efficiency if I have the
> > > >> >     same users loading the same image again and again--actually decrease
> > > >> >     efficiency for users who are opening up a thumbnail for the first
> > > >> >     time? In that situation, I'd have another column for etags in my
> > > >> >     datastore being requested w/ every query.
>
> > > >> >     Yes, you absolutely should generate the etag when you save the
> > > >> >     thumbnail, and save it in the model itself. Caching it separately
> > > >> >     is however still desirable, as you can then avoid pulling the rest
> > > >> >     of the data into memory if it's not needed, or you can opt to not
> > > >> >     cache the rest of the data at all, instead only caching the etag,
> > > >> >     to be more cache friendly.
>
> > > >> > A quick and easy hack for this is to generate the etag before
> > > >> > creating the Thumbnail model instance - and use that etag as the
> > > >> > named key. Then, you can do lookup and caching based on the etag
> > > >> > alone, where that makes sense. Unless you have some specific meaning
> > > >> > in your ID already, this should simplify the "how to deal with etags"
> > > >> > question quite a bit.
>
> > > --
> > > Barry
>
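
Pulling the quoted suggestions together, a minimal sketch of the
etag-as-key-name idea with memcache in front of the datastore. The
Thumbnail model, the /thumb URL and all names are illustrative, not
anyone's actual code, and the conditional-GET handling is implied by
the etag discussion rather than spelled out in it:

import hashlib

from google.appengine.api import memcache
from google.appengine.ext import db, webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class Thumbnail(db.Model):
    """Thumbnail bytes keyed by their own content hash (the etag)."""
    # BlobProperty is not indexed, so writes skip the index updates that
    # indexed property types would trigger.
    data = db.BlobProperty(required=True)

def save_thumbnail(image_bytes):
    """Hash the bytes first and use the hash as both etag and key_name."""
    etag = hashlib.md5(image_bytes).hexdigest()
    Thumbnail(key_name=etag, data=image_bytes).put()
    return etag

class ThumbnailHandler(webapp.RequestHandler):
    def get(self, etag):
        etag_header = '"%s"' % etag  # ETag values are quoted on the wire
        # If the client already has this version, answer 304 without
        # touching memcache or the datastore.
        if self.request.headers.get('If-None-Match') == etag_header:
            self.response.set_status(304)
            return
        data = memcache.get(etag)
        if data is None:
            thumb = Thumbnail.get_by_key_name(etag)
            if thumb is None:
                self.response.set_status(404)
                return
            data = thumb.data
            memcache.add(etag, data)
        self.response.headers['Content-Type'] = 'image/png'
        self.response.headers['ETag'] = etag_header
        self.response.out.write(data)

application = webapp.WSGIApplication([('/thumb/([a-f0-9]+)', ThumbnailHandler)])

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()

Because the key_name is the content hash, re-saving the same bytes
overwrites the same entity, and lookups never need a query, only a get
by key.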

[google-appengine] production server behavior is different than dev_appserver

2008-10-01 Thread warreninaustintexas

This is the second time that I have uploaded an application to the
production server and had errors for an app that was working fine on
dev_appserver.  Here's the portion of code supposedly in error:

recents = db.GqlQuery(
    "SELECT * FROM Recent ORDER BY time_date_stamp DESC LIMIT 10")
isnorecents = False
if recents.count() == 0:
    isnorecents = True

The error message lists the recents.count() line (along with errors in
Google's datastore.py code).  This app works fine in the development
environment - even with a completely empty datastore - but crashes on
the production server.
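
As an aside, a variant that avoids the separate count() query; it won't
by itself explain the production error, but if the entities are also
rendered later it saves a datastore round trip (sketch only, same Recent
model as above):

from google.appengine.ext import db

# Fetch the ten entities once and reuse the list, rather than running a
# count() query and then fetching again for display.
recents = db.GqlQuery(
    "SELECT * FROM Recent ORDER BY time_date_stamp DESC LIMIT 10").fetch(10)
isnorecents = not recents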

Is there some way to determine what will error out in the production
environment?  Isn't the point of having a development environment to
simulate the behavior of the production environment?

warreninaustintexas