[google-appengine] Re: Datastore usage ~ 80 times more than expected (Add your vote to a datastore usage accounting feature)

2009-04-30 Thread Kugutsumen

I've sent you my ID.

Thanks for looking into this.

On Apr 29, 4:06 am, Jason (Google) apija...@google.com wrote:
 Can you both provide your application IDs so I can investigate a bit?

 Thanks,
 - Jason

 On Sat, Apr 25, 2009 at 12:33 PM, Kugutsumen kugutsu...@gmail.com wrote:

  On Apr 23, 4:47 am, Panos pa...@acm.org wrote:
   I have also been puzzled at times on where the space is going. I filed
   this request today:

   More granular accounting of how datastore space is used
 http://code.google.com/p/googleappengine/issues/detail?id=1396

   Please browse to the issue and add your vote/star if you want to see
   this feature implemented.

   Panos

  I also think there is something wrong.

  I have 2.3M Domain records and the source CSV is only 63 megabytes,
  with no composite indexes. The dashboard claims I am using 3 GB!?
  (3.03 of 101.00 GBytes)

  This is my base expando model:

  class Domain(db.Expando):
      name = db.StringProperty(required=True, verbose_name='FQDN')
      revname = db.StringProperty(verbose_name='Reverse FQDN')
      since = db.DateTimeProperty(auto_now_add=True)

  I am ready to upload 102M more records, I guess I am going to wait
  until this issue is resolved.
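
For what it's worth, much of the gap between raw CSV size and reported usage is per-entity key and metadata overhead plus the built-in single-property indexes (one ascending and one descending entry per indexed property). A rough back-of-envelope sketch; every byte count below is an assumption, purely illustrative:

```python
# Rough estimate of datastore blow-up for N entities.
# All byte counts here are assumptions, not official numbers.

def estimated_storage(num_entities, raw_bytes_per_row,
                      indexed_props=3, key_and_metadata=100,
                      index_entry_overhead=80):
    """Estimate stored bytes: entity record plus 2 index entries
    (ascending + descending) per indexed property."""
    entity_bytes = raw_bytes_per_row + key_and_metadata
    index_bytes = indexed_props * 2 * index_entry_overhead
    return num_entities * (entity_bytes + index_bytes)

# 2.3M rows at ~28 bytes of raw CSV each (63 MB / 2.3M rows):
total = estimated_storage(2_300_000, 28)
print(total / 1e9, "GB")  # lands in the low gigabytes, not megabytes
```

Even with conservative assumed overheads, the estimate is tens of times the raw CSV size, which is the same effect the dashboard is showing.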


--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google Groups 
Google App Engine group.
To post to this group, send email to google-appengine@googlegroups.com
To unsubscribe from this group, send email to 
google-appengine+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en
-~--~~~~--~~--~--~---



[google-appengine] Bulk uploading to App Engine is faster than you think

2009-04-30 Thread Kugutsumen

http://blog.nkill.com/2009/04/bulk-uploading-to-app-engine-is-faster.html

One of the things that really worried me when I started porting the
nkill project to App Engine was the speed at which I could upload data
to the app engine datastore.

I kept seeing threads indicating that it was a slow and painful
process.
Luckily, bulkloader.py isn't bad at all! I suspect that the bottleneck
is upload bandwidth: home users typically have asymmetric connections
where the download speed is significantly higher than the upload
speed, which is often capped at 256-512 kbit/s.

Here are some stats:

2,332,970 entities in 21,112.7 seconds (that's 2.3M in just under 6
hours, or about 110 entities per second.)

I split my input CSV into 10,000 line files and used the following
bulkloader.py (SDK 1.2.0) options:

--rps_limit=250 --batch_size=50

I am pretty sure there is room for improvement. I tried to use
conservative values to minimize CPU usage and stay under the quota
radar (I still managed to get in the red).

The following parameters will affect the speed at which you can
upload:

--batch_size= (max 500)
--num_threads= (default 10)
--rps_limit= (default 20)
--http_limit= (default 8)

I'll do a follow-up post since I have several million records to
upload. Hopefully I'll find the sweet spot.
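
The quoted throughput checks out arithmetically:

```python
entities = 2332970
seconds = 21112.7

rate = entities / seconds      # entities per second
hours = seconds / 3600.0       # wall-clock time

print("%.1f entities/s over %.2f hours" % (rate, hours))
# about 110.5 entities/s over about 5.87 hours
```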

K.



[google-appengine] Re: App Engine for Java - Custom Domain Name - Server not found

2009-04-30 Thread danoro

Hi Jason,

I posted the question again and I'm following with Don Schwarz here

http://groups.google.com/group/google-appengine-java/browse_thread/thread/7541a94c0d5895b7

The issue has turned into a discussion about DNS caching and whether
Google App Engine can or cannot provide a fixed IP address, and
whether this would solve the DNS cache issues...




[google-appengine] Some design Issues in appengine datastore

2009-04-30 Thread vijay
Hello All,
I am working on an application and got stuck in the design phase; I
hope you guys can help me out. I have several doubts related to
performance and modelling.

1#
In my application I store some hierarchical data and am not sure how
to model it.

For example, say you have data organized as:

food
 |-- fruit
 |    |-- red
 |    |    |-- Iron -- apples, something
 |    |    |-- Vitamins -- litchies
 |    |-- green
 |         |-- Iron -- apples
 |         |-- Vitamin E -- guava
 |-- vegetable -- category1, category2, category3
 |-- fried -- (similar nesting here)

Is there a clean way to store this kind of data in App Engine? I mean,
what should the classes and their properties be?

In my case the subcategories can go up to a depth of 10, with each
level having hundreds of categories.
I will be executing queries to look up by any node, so I can do a
lookup by 'fruit' or 'apple' or 'food' as a whole.
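
One common way to model this is a materialized path: each entity stores its full ancestor chain, so filtering on any node name returns its whole subtree. A plain-Python sketch of the idea (in App Engine the path would be a db.StringListProperty and the lookup an equality filter on it; the items below are illustrative):

```python
# Materialized-path sketch: each item stores its full ancestor chain.
# In the datastore this would be a list property with an equality filter.

items = [
    {"name": "apples",   "path": ["food", "fruit", "red"]},
    {"name": "litchies", "path": ["food", "fruit", "red"]},
    {"name": "guava",    "path": ["food", "fruit", "green"]},
    {"name": "carrot",   "path": ["food", "vegetable", "category1"]},
]

def lookup(node):
    """All items whose ancestor chain contains `node`, at any depth."""
    return [it["name"] for it in items if node in it["path"]]

print(lookup("fruit"))  # everything under fruit
print(lookup("food"))   # everything
```

This gives lookup by any node ('fruit', 'red', or 'food') with a single filter, at the cost of one index entry per ancestor per item.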

2#
My application is going to have a search box where the user types
search terms. I would like to suggest the correct word if they make a
spelling mistake, so if they type 'frut' I will suggest 'fruit'.
The way I am thinking of implementing it is, for each word entered, to
compare it against each node name with a relaxation of 2: if the word
matches except at two points, I will go ahead and suggest it. Since
this needs a sequential traversal of all the nodes, which may take a
lot of time and resources, I should probably store the word list
somewhere for faster lookup, like memcache. What are your suggestions?
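
The 'relaxation of 2' described above is an edit (Levenshtein) distance threshold; a minimal sketch of the matching logic, with an illustrative vocabulary:

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def suggest(word, vocabulary, max_dist=2):
    """Return known words within max_dist edits of the input."""
    return [v for v in vocabulary if edit_distance(word, v) <= max_dist]

print(suggest("frut", ["fruit", "vegetable", "food"]))  # ['fruit']
```

The scan is O(vocabulary size), which is exactly why caching the word list (e.g. in memcache) as the poster suggests is worthwhile.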


3#
Are there any standard APIs to send SMS from an application? I asked
about this in another thread but haven't got any reply; I hope I will
get some answers this time.

I am quite new to website design, and any other suggestions or
pointers to general design principles would be great.


Regards,
Vijay




[google-appengine] Re: Completely clearing the datastore

2009-04-30 Thread 风笑雪
Just like this:

import threading

threads = []

for i in xrange(10):
    threads.append(threading.Thread(target=add))  # add is a function that adds some data

for thread in threads:
    thread.start()

2009/4/30 Sri sri.pan...@gmail.com


 how did you create the threads??  did you create and destroy them each
 time, or did you maintain a pool/manager?

 On Apr 30, 3:58 pm, 风笑雪 kea...@gmail.com wrote:
  I only tested insert and delete.
  And they work slower (maybe because of the threads' start-up time).
 
  2009/4/30 Sri sri.pan...@gmail.com
 
 
 
   Actually I've tried using multiple threads...

   I found uploads were faster... deletes on (non-overlapping) data were
   somewhat similar.
 
   On Apr 29, 10:37 pm, 风笑雪 kea...@gmail.com wrote:
In my test, using multiple threads is the same speed as 1 thread.
 
2009/4/29 Sri sri.pan...@gmail.com
 
 Actually I've started doing multiple threads a couple of nights ago and
 it was pretty fast...

 Same applies to uploading new data. Of course now I am just out of
 quota... :D
 
 thanks for the tips guys.
 
 cheers
 Sri
 
 On Apr 28, 4:37 pm, Alkis Evlogimenos ('Αλκης Ευλογημένος)
 evlogime...@gmail.com wrote:
  Yes, but you can hit them repeatedly and from multiple threads until
  everything is deleted.
 
  2009/4/28 Sri sri.pan...@gmail.com
 
   Right, you mean have handlers that delete data instead of using
   remote_api?

   But wouldn't that limit my requests to 30 seconds (well, I guess I
   could issue 1000 "delete 100 items" requests), right?
 
   On Apr 27, 5:09 pm, Alkis Evlogimenos ('Αλκης Ευλογημένος)
   evlogime...@gmail.com wrote:
If you don't, it's better to do it on the server side rather than
transferring data over the net.
No, you don't need to keep the keys locally.
 
2009/4/27 Sri sri.pan...@gmail.com
 
  But the issue is that I don't really have the keys on me... does that
  mean that each time I load the datastore I'll have to keep track of
  the keys locally as well, so that when I want to clear them I can use
  them?
 
 cheers
 Sri
 
 On Apr 27, 8:50 am, Alkis Evlogimenos ('Αλκης Ευλογημένος)
 evlogime...@gmail.com wrote:
  The sample code does:
  MyModel.all().fetch(1000)

  This means fetch 1000 entities of MyModel. If each entity is 10KB,
  this means 10MB of data read from the datastore, 10MB of data sent
  through the network to your running instance, and 10MB of data served
  from the running instance to your machine running the remote script.

  If you know the keys then you can do:

  db.delete([db.Key.from_path('MyModel', key_name) for key_name in
  one_thousand_key_names])

  This just sends the keys to the datastore for deletion. It doesn't
  need to transfer data from the datastore to the remote script to read
  the keys in the first place.

  Eventually the GAE API should provide us some way of querying the
  datastore for keys only instead of necessarily fetching entities.
  This would make this use-case quite a bit faster, and a lot of others
  as well.
 
  2009/4/26 Devel63 danstic...@gmail.com
 
   Can you explain this further?  I don't see any reference to key_name
   in the sample code.

   More importantly, to me, what's the cost differential between using
   string representations of keys and key_names?  I've been passing
   key_names around to the browser because they're shorter, under the
   assumption that the cost to get the corresponding key on the server
   side was negligible.
 
   On Apr 25, 9:02 am, Alkis Evlogimenos ('Αλκης Ευλογημένος)
   evlogime...@gmail.com wrote:

    Doing it over the remote api means you are going to transfer all
    your data plus transmission overhead over the wire. You are
    probably better off doing something like this on the server side
    through an admin-protected handler.

    Also, if you happen to know the keys of your data (you used
    key_name), your deletes are going to be a lot more efficient if
    you give db.delete a list of keys instead.

    On Sat, Apr 25, 2009 at 2:41 PM, Sri sri.pan...@gmail.com wrote:
 
  Hi,

  Is there a way to completely erase the production datastore?

  Currently I am using a script like this via the remote api:

  def delete_all_objects(obj_class):
      num_del = 300

[google-appengine] Re: Completely clearing the datastore

2009-04-30 Thread Sri

how did you create the threads??  did you create and destroy them each
time, or did you maintain a pool/manager?

On Apr 30, 3:58 pm, 风笑雪 kea...@gmail.com wrote:
 I only tested insert and delete.
 And they work slower (maybe because of the threads' start-up time).

 2009/4/30 Sri sri.pan...@gmail.com



  Actually I've tried using multiple threads...

  I found uploads were faster... deletes on (non-overlapping) data were
  somewhat similar.

   On Sat, Apr 25, 2009 at 2:41 PM, Sri sri.pan...@gmail.com wrote:

   Hi,

   Is there a way to completely erase the production datastore?

   Currently I am using a script like this via the remote api:

   def delete_all_objects(obj_class):
       num_del = 300
       while True:
           try:
               objs = obj_class.all().fetch(1000)
               num_objs = len(objs)
               if num_objs == 0:
                   return
               print "Deleting %d/%d objects of class %s" % (num_del,
                   num_objs, str(obj_class))
               db.delete(objs[:num_del])
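
The server-side pattern discussed in this thread, deleting in key batches rather than fetching whole entities, can be sketched with the datastore call mocked out (`delete_batch` stands in for db.delete; the batch cap of 500 is an assumption mirroring the limit quoted elsewhere on this list):

```python
def chunks(seq, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

deleted = []

def delete_batch(keys):
    """Stand-in for db.delete(keys); just records what was deleted."""
    deleted.extend(keys)

all_keys = ["key%d" % n for n in range(2500)]

# Assumed cap of 500 keys per delete call:
for batch in chunks(all_keys, 500):
    delete_batch(batch)

print(len(deleted))  # all 2500 keys deleted, in five batches
```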

[google-appengine] Question about exploding index sizes

2009-04-30 Thread Morten Bek Ditlevsen
Hi there,

I have an application with an entity containing a list of geoboxes for
geographic querying. I currently generate 28 geobox entries in this list.

Since the list property is being queried along with other properties, this
causes 28 index entries to be updated whenever I update a value that is part
of the index.

My problem is that now I would like to query additional lists at the same
time - causing my index to 'explode'.

My question is: are there any recommendations with regards to how many index
entries a single change should cause to be updated?

I would like to have (pseudo) full text search of a field and thought of
doing this by adding a list of words to be queried.
If this list is 50 items long I will now have to update 28*50 indexes for
each change, right?
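
The 28*50 arithmetic above is exactly how composite index entries multiply: one entry per combination of values from the two list properties. A quick sketch:

```python
from itertools import product

geoboxes = ["g%d" % i for i in range(28)]   # 28 geobox values
words = ["w%d" % i for i in range(50)]      # 50 keyword values

# One composite index entry per (geobox, word) pair:
entries = list(product(geoboxes, words))
print(len(entries))  # 1400 index entries per entity update
```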

Is that possible at all, or should any kind of exploding index sizes be
avoided?

The application is a location based dating service. Right now location
lookups are working ok, but I am wondering whether it is feasible to have
location based lookups paired with text search at all - or if some of my
processing should be in python code instead of done through table queries.

Sincerely,
/morten




[google-appengine] Re: Question about exploding index sizes

2009-04-30 Thread Nick Johnson (Google)

Hi Morten,

On Thu, Apr 30, 2009 at 2:05 PM, Morten Bek Ditlevsen
morten@gmail.com wrote:
 I have an application with an entity containing a list of geoboxes for
 geographic querying. I currently generate 28 geobox entries in this list.

 Since the list property is being queried along with other properties, this
 causes 28 index entries to be updated whenever I update a value that is part
 of the index.

 My problem is that now I would like to query additional lists at the same
 time - causing my index to 'explode'.

 My question is: are there any recommendations with regards to how many index
 entries a single change should cause to be updated?

'As few as possible'. :)

Besides the hard limit, every additional index update increases the
latency and CPU costs for updating data for that entity type.


 I would like to have (pseudo) full text search of a field and thought of
 doing this by adding a list of words to be queried.
 If this list is 50 items long I will now have to update 28*50 indexes for
 each change, right?

That's correct.


 Is that possible at all, or should any kind of exploding index sizes be
 avoided?

With the numbers you give, yes, it's possible, but 1400 index entries
is something you really want to avoid if you can.


 The application is a location based dating service. Right now location
 lookups are working ok, but I am wondering whether it is feasible to have
 location based lookups paired with text search at all - or if some of my
 processing should be in python code instead of done through table queries.

I would suggest refactoring. You can make the inverted index for the
text a separate entity, or you can do that for the geoboxes,  or you
can take a Hybrid approach: For example, you could have an entity that
stores a top-level approximation of the user's location (eg, Country,
State, or whatever is the highest level at which someone will query if
they're specifying a geographical bound) in addition to the keywords
in order to narrow the search down.

-Nick Johnson




[google-appengine] Re: django loaddata fails

2009-04-30 Thread notcourage

Happens when I omit the app name too.
DeserializationError: Invalid model identifier: 'period'

On Apr 29, 9:37 pm, Tim Hoffman zutes...@gmail.com wrote:
 Reading the traceback, I think the '-' in the name might be causing
 you a problem. You can't have a Python name (i.e. module name,
 variable, etc.) containing '-': the parser can't distinguish between
 the minus operator / expression and your name.

 That might be it.

 T

 On Apr 30, 11:07 am, notcourage klr...@gmail.com wrote:

  I confess I haven't used fixtures before. My hierarchy is:

  collectrium-splash
      models.py: Contains the model Period.
      fixtures
          period.yaml

  First of all, it seems odd that:

  python manage.py loaddata period.yaml

  doesn't find the fixtures directory so I tried:

  python manage.py loaddata fixtures/period.yaml

  which yields this stack trace:

  WARNING:root:Could not read datastore data from /tmp/
  django_collectrium-splash.datastore
  WARNING:root:Could not read datastore data from /tmp/
  django_collectrium-splash.datastore.history
  INFO:root:zipimporter('/home/notcourage/swe/collectrium-splash/
  django.zip', 'django/core/serializers/')
  Installing yaml fixture 'fixtures/period' from absolute path.
  Problem installing fixture 'fixtures/period.yaml': Traceback (most
  recent call last):
    File /home/notcourage/swe/collectrium-splash/django.zip/django/core/
  management/commands/loaddata.py, line 116, in handle
      for obj in objects:
    File /home/notcourage/swe/collectrium-splash/django.zip/django/core/
  serializers/pyyaml.py, line 49, in Deserializer
      for obj in PythonDeserializer(yaml.load(stream)):
    File /home/notcourage/swe/collectrium-splash/appengine_django/
  serializer/python.py, line 59, in Deserializer
      Model = python._get_model(d['model'])
    File /home/notcourage/swe/collectrium-splash/django.zip/django/core/
  serializers/python.py, line 107, in _get_model
      raise base.DeserializationError(u"Invalid model identifier: '%s'"
        % model_identifier)
  DeserializationError: Invalid model identifier: 'collectrium-
  splash.period'

  Thx for your help.
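
Tim's point upthread (a '-' can never appear in a Python name, so 'collectrium-splash.period' cannot resolve as an app_label.model pair) can be checked directly. str.isidentifier is a modern-Python convenience, but the rule it tests is the same for the Python of that era:

```python
# A module/app label must be a valid Python identifier; '-' parses
# as the minus operator, so 'collectrium-splash' can never be one.
print('collectrium-splash'.isidentifier())   # False
print('collectrium_splash'.isidentifier())   # True
```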



[google-appengine] Re: Question about exploding index sizes

2009-04-30 Thread Morten Bek Ditlevsen
Hi Nick,

Thanks a bunch! I'm really amazed that I can just throw out a question like
this and have a googler reply within minutes! :-)



  The application is a location based dating service. Right now location
  lookups are working ok, but I am wondering whether it is feasible to have
  location based lookups paired with text search at all - or if some of my
  processing should be in python code instead of done through table
 queries.

 I would suggest refactoring. You can make the inverted index for the
 text a separate entity, or you can do that for the geoboxes,  or you
 can take a Hybrid approach: For example, you could have an entity that
 stores a top-level approximation of the user's location (eg, Country,
 State, or whatever is the highest level at which someone will query if
 they're specifying a geographical bound) in addition to the keywords
 in order to narrow the search down.


Just to make sure I understand this right (well, rather - I don't think I
get it 100%):

By the inverted index entity you mean a new entity kind where I store word
lists and then reference this entity from my 'user profile' entity?

This will allow me to create a query that fetches a list of entities for
which I can find the referring 'user profile' entities.

But how can I combine that result with other queries?



The same goes for the hybrid example. I see how this can be used to give me
a subset, but can that subset be queried any further?

Please excuse my ignorance - I feel there's a part I'm failing to
understand.


Sincerely,
/morten




[google-appengine] Re: Question about exploding index sizes

2009-04-30 Thread Nick Johnson (Google)

On Thu, Apr 30, 2009 at 3:06 PM, Morten Bek Ditlevsen
morten@gmail.com wrote:
 Hi Nick,

 Thanks a bunch! I'm really amazed that I can just throw out a question like
 this and have a googler reply within minutes! :-)


 
  The application is a location based dating service. Right now location
  lookups are working ok, but I am wondering whether it is feasible to
  have
  location based lookups paired with text search at all - or if some of my
  processing should be in python code instead of done through table
  queries.

 I would suggest refactoring. You can make the inverted index for the
 text a separate entity, or you can do that for the geoboxes,  or you
 can take a Hybrid approach: For example, you could have an entity that
 stores a top-level approximation of the user's location (eg, Country,
 State, or whatever is the highest level at which someone will query if
 they're specifying a geographical bound) in addition to the keywords
 in order to narrow the search down.

 Just to make sure I understand this right (well, rather - I don't think I
 get it 100%):

 By the inverted index entity you mean a new entity kind where I store word
 lists and then reference this entity from my 'user profile' entity?

Yes. The list of words used for fulltext indexing is called an 'inverted index'.


 This will allow me to create a query that fetches a list of entities for
 which I can find the referring 'user profile' entities.

 But how can I combine that result with other queries?

You need a (fairly simple) query planner, that can decide to either
perform a text or bounding box query, then filter (in user code) the
results for those that match the other filter. You decide which to
pick based on which you think will have the fewest results (for
example, a query for a small area will take preference over a search
for the string 'friendly', whilst a search for the string 'solipsist'
should probably be used instead of a query for all but the tiniest
areas).
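
The planner described above can be sketched in a few lines; the profile records and selectivity estimates below are illustrative stand-ins for real datastore queries:

```python
# Each "query" is (estimated result count, run function, per-item check).
def plan_and_run(q1, q2):
    """Run whichever query is estimated to be more selective, then
    filter its results in user code with the other query's check."""
    cheap, other = sorted([q1, q2], key=lambda q: q[0])
    _, run, _ = cheap
    _, _, check = other
    return [item for item in run() if check(item)]

profiles = [
    {"name": "alice", "box": "nyc", "words": ["friendly"]},
    {"name": "bob",   "box": "nyc", "words": ["solipsist"]},
    {"name": "carol", "box": "sf",  "words": ["solipsist"]},
]

geo = (2, lambda: [p for p in profiles if p["box"] == "nyc"],
          lambda p: p["box"] == "nyc")
text = (2, lambda: [p for p in profiles if "solipsist" in p["words"]],
           lambda p: "solipsist" in p["words"])

print([p["name"] for p in plan_and_run(geo, text)])  # ['bob']
```

In a real app the estimates would come from heuristics like the ones Nick gives (a tiny bounding box beats a common word; a rare word beats a large area).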

 The same goes for the hybrid example. I see how this can be used to give me
 a subset, but can that subset be queried any further?

In that case, you can (hopefully) assume that the number of results
for your keywords in your geographical area is small enough that you
can filter them manually, without the need for explicit query
planning. You can also use 2 or more levels of geographical nesting -
just fewer than your main index, to keep the index entry count under
control.

-Nick Johnson




[google-appengine] Re: Still no full-text search? Mystified by the priorities.

2009-04-30 Thread pran__

+1, it would be great if a Search API could be provided. I have been
using SearchableModel for quite some time, and it fits my basic
needs, but I know a certain search engine has made people's
expectations really high as soon as they see a search box :-)

--
Pranav Prakash

On Apr 30, 8:45 am, Lee Olayvar leeolay...@gmail.com wrote:
 Fully agree. The fact that it's not out yet is surprising; the fact
 that it's not even on the roadmap is simply jaw-droppingly bizarre.

 On Wed, Apr 8, 2009 at 9:15 AM, Jonathan Feinberg e.e.c...@gmail.com wrote:





  Long ago I attracted a flame-fest when I expressed my opinion that
  adding support for other programming languages should be given less
  priority than fixing bugs and adding infrastructural features. Here we
  are, months later, and the big announcements are

  1) Java (my God, why?)

     and

  2) Cron jobs (...but I could already write cron jobs to hit a URL)

  In the meantime, full-text search is not even on the roadmap.

  I'm torn. As the creator of Wordle, I'm truly grateful to Google and
  the GAE team for the use of an automatically-scaling app
  infrastructure. It has been a pleasure to use. On the other hand, the
  lack of search has been a huge problem for Wordle users, and I've got
  no good options.

  I acknowledge that search is my pet issue; I don't claim to represent
  a community or interest group with these comments. Then again, I can't
  think of a CRUD-style app that doesn't require or benefit from text
  search. So, while I'd consider using GAE in the future for some
  stateless utility micro-site, or maybe a static site, I won't use it
  again for anything with user-created data. While I've begun to regret
  having used it for Wordle, I admit that it's my own fault for not
  having thought through the implications of having no full-text search
  available.

 --
 Lee Olayvar



[google-appengine] I need help with passing variable values from datastore to html files.

2009-04-30 Thread Aaron

The subject says it all. I am working on an ad rotation system. I have
to grab the ad's URL from the datastore and then send that data to an
HTML file to put the URL in an HTML image tag.

Any ideas how I should go about doing this? Here is what I have so
far:


here is the views.py code:


def get_ad(request):
  total_entries = Ad.all().count()
  ad_all = Ad.all()[:total_entries]
  ad = random.choice(ad_all)
  return HttpResponse(ad.adurl)


here is the models.py  code


class Ad(db.Model):
   name = db.StringProperty()
   adurl = db.StringProperty()
   number= db.IntegerProperty()


Here is the HTML code:

<div class="ad"><img src="{{ad_url}}"/></div>

any ideas?

Currently, with the code shown, I still see no ads showing up. I even
tried printing some variables and I still see nothing getting
fetched.

Any ideas what I should do?
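
For what it's worth, the intended flow can be sketched with the datastore and template engine mocked out (the ad records and template string are illustrative, not the poster's actual data): the key point is that the chosen URL has to reach the template under the name the template uses ({{ad_url}}), rather than being returned as a bare string.

```python
import random

# Stand-in for Ad.all(): plain records with the poster's property names.
ads = [
    {"name": "ad1", "adurl": "http://example.com/a.png"},
    {"name": "ad2", "adurl": "http://example.com/b.png"},
]

def get_ad():
    """Pick one ad at random and render its URL into the img tag,
    mimicking a template context of {'ad_url': ad['adurl']}."""
    ad = random.choice(ads)
    template = '<div class="ad"><img src="{ad_url}"/></div>'
    return template.format(ad_url=ad["adurl"])

print(get_ad())
```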




[google-appengine] Strange behaviour

2009-04-30 Thread Nora

Hello,
Googlebot indexed one page on my website, but something is happening
that I can't understand.
What I know about indexing is that if you search for any word
contained in a page that has been indexed, that page should be
displayed in the search results. For example, if my page contains the
word 'Hello' and I search for the word 'Hello', shouldn't my page
(which has been indexed and contains this word) be included in the
search results?

Can anyone advise me on what is going wrong there!

Thank you very much,
Nora



[google-appengine] Re: Using ReportLab with ZipImport

2009-04-30 Thread Ruud Helderman

Hi Arun,
Sorry for my earlier post arriving 'a day after the fair', it got
delayed by moderator policy (check its timestamp).

This is a generic problem - not something specific to either ReportLab
or Pisa. Zipping a library is a matter of choice during deployment -
why should that affect the source code of each module depending on the
library?

In Python documentation, I noticed 2 ways to globally expand sys.path,
without having to adjust individual .py files:
1. PYTHONPATH environment variable
2. .pth files

However, I doubt if either one is available to GAE application
developers. If not, then adjusting the Pisa sources is your only
option, sorry.

IMHO, zipimporter is nice but immature - I would rather have a more
transparent approach: an API layer on top of the filesystem that hides
the difference between ZIP files and 'real' folders (much like
'compressed folders' in Windows), not only for import statements but
for regular file access as well.
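Absent PYTHONPATH and .pth support, the fallback described above is to extend sys.path in each entry point before importing from the archive. A minimal sketch, assuming a hypothetical 'reportlab.zip' shipped alongside the app (zipimport resolves imports from ZIP entries on sys.path):

```python
import os
import sys

def add_zip_to_path(zip_name, base_dir="."):
    """Prepend a ZIP archive to sys.path so zipimport can serve it."""
    path = os.path.join(base_dir, zip_name)
    if path not in sys.path:       # avoid duplicate entries on warm reuse
        sys.path.insert(0, path)
    return path
```

Each handler script would call this once at the top, before any `import reportlab` line.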


On Apr 28, 6:31 am, Arun Shanker Prasad arunshankerpra...@gmail.com
wrote:
 Hi Ruud,

 Thanks for the patch, I mentioned your patch comment #5. It is great
 and works fine when used with the ReportLab package alone.

 My problem is that I use ReportLab as a requirement for the Pisa
 package,
 (http://www.xhtml2pdf.com/doc/pisa-en.html,http://pypi.python.org/pypi/pisa/)

 Since reportlab is called from inside these packages, they are not
 able to find the reportlab package. Do you know of any work around for
 this?

 Thanks,
 Arun Shanker Prasad.

 On Apr 25, 7:07 pm, Ruud Helderman liping...@hotmail.com wrote:

  I patched the ReportLab library, works fine for me.

  For a simple demo and full source code, look 
  here:http://ruudhelderman.appspot.com/testpdf

  I added a comment with detailed explanation to issue 
  1085:http://code.google.com/p/googleappengine/issues/detail?id=1085



[google-appengine] Re: Cron schedule : one day per month ?

2009-04-30 Thread 风笑雪
You can let your cron job run every day.
And in your job, you should check the date: if today is not the 18th
day of the month, exit without doing anything.
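The guard described above boils down to a one-line date check. A sketch with the date injectable for testing (the function name is illustrative); the cron handler would call it with no argument and return early when it is False:

```python
import datetime

def should_run(today=None):
    """True only on the 18th of the month; cron fires daily regardless."""
    today = today or datetime.date.today()
    return today.day == 18
```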

2009/4/30 Sylvain sylvain.viv...@gmail.com


 Hi,

 I need to create a job that runs each 18th of month (i.e : 18th
 january, 18th february, 18th march,...)
 Is there a simple schedule syntax to do that ?

 Currently, i can do something like that :
 schedule: every
 monday,tuesday,wednesday,thursday,friday,saturday,sunday of month
 09:00
 and test each day if we are the 18th

 But is there a schedule syntax to do that directly ?

 Thank you

 Regards.
 





[google-appengine] Re: Using ReportLab with ZipImport

2009-04-30 Thread Arun Shanker Prasad

Hi Ruud,

Thanks for the reply. I did notice the timestamp now :)

I think using PYTHONPATH and setting .pth files is a no-go in GAE, as
far as I know.
I was afraid of that; I might have to get started on the Pisa code.

Thanks again for the solution; I've seen several people using ReportLab
on GAE, so it will help them a lot.

Thanks,
Arun Shanker Prasad.

On Apr 30, 11:16 pm, Ruud Helderman liping...@hotmail.com wrote:
 Hi Arun,
 Sorry for my earlier post arriving 'a day after the fair', it got
 delayed by moderator policy (check its timestamp).

 This is a generic problem - not something specific to either ReportLab
 or Pisa. Zipping a library is a matter of choice during deployment -
 why should that affect the source code of each module depending on the
 library?

 In Python documentation, I noticed 2 ways to globally expand sys.path,
 without having to adjust individual .py files:
 1. PYTHONPATH environment variable
 2. .pth files

 However, I doubt if either one is available to GAE application
 developers. If not, then adjusting the Pisa sources is your only
 option, sorry.

 IMHO, zipimporter is nice, but immature - I would rather have a more
 transparent approach. An API layer on top of the filesystem that hides
 the difference between ZIP files and 'real' folders (much like
 'compressed folders' in Windows), not only for import statements but
 for regular file access as well.

 On Apr 28, 6:31 am, Arun Shanker Prasad arunshankerpra...@gmail.com
 wrote:



  Hi Ruud,

  Thanks for the patch, I mentioned your patch comment #5. It is great
  and works fine when used with the ReportLab package alone.

  My problem is that I use ReportLab as a requirement for the Pisa
  package,
  (http://www.xhtml2pdf.com/doc/pisa-en.html,http://pypi.python.org/pypi...)

  Since reportlab is called from inside these packages, they are not
  able to find the reportlab package. Do you know of any work around for
  this?

  Thanks,
  Arun Shanker Prasad.

  On Apr 25, 7:07 pm, Ruud Helderman liping...@hotmail.com wrote:

   I patched the ReportLab library, works fine for me.

   For a simple demo and full source code, look 
   here:http://ruudhelderman.appspot.com/testpdf

   I added a comment with detailed explanation to issue 
   1085:http://code.google.com/p/googleappengine/issues/detail?id=1085



[google-appengine] Re: Cron schedule : one day per month ?

2009-04-30 Thread 风笑雪
And if you want it to run every day, you do this:
schedule: every 24 hours

2009/5/1 风笑雪 kea...@gmail.com

 You can let your cron job run every day.
 And in your job, you should check the date. If today is not 18th day of
 month, then exit without doing anything.

 2009/4/30 Sylvain sylvain.viv...@gmail.com


 Hi,

 I need to create a job that runs each 18th of month (i.e : 18th
 january, 18th february, 18th march,...)
 Is there a simple schedule syntax to do that ?

 Currently, i can do something like that :
 schedule: every
 monday,tuesday,wednesday,thursday,friday,saturday,sunday of month
 09:00
 and test each day if we are the 18th

 But is there a schedule syntax to do that directly ?

 Thank you

 Regards.
 






[google-appengine] Re: Cron schedule : one day per month ?

2009-04-30 Thread Sylvain

Thank you for your answer. I will use every 24 hours.

I think I will file a feature request for that.

Regards.

On 30 avr, 20:32, 风笑雪 kea...@gmail.com wrote:
 And if you want it runs every day, you do this:
 schedule: every 24 hours

 2009/5/1 风笑雪 kea...@gmail.com

  You can let your cron job run every day.
  And in your job, you should check the date. If today is not 18th day of
  month, then exit without doing anything.

  2009/4/30 Sylvain sylvain.viv...@gmail.com

  Hi,

  I need to create a job that runs each 18th of month (i.e : 18th
  january, 18th february, 18th march,...)
  Is there a simple schedule syntax to do that ?

  Currently, i can do something like that :
  schedule: every
  monday,tuesday,wednesday,thursday,friday,saturday,sunday of month
  09:00
  and test each day if we are the 18th

  But is there a schedule syntax to do that directly ?

  Thank you

  Regards.



[google-appengine] Re: Activating GAE (Java) account issues

2009-04-30 Thread Chander Pechetty

+1, I am waiting for the email as well for my account to be activated.





[google-appengine] Re: Timeout: datastore timeout: operation took too long.

2009-04-30 Thread barabaka

Well, I've read a lot of posts about google datastore and the problems
with batch operations, relational approach to arrange data in bigtable
etc. but I always thought the problem wasn't in datastore itself but
in the way people use it. Now I can see from my own experience that it
acts in an unpredictable way. I deployed a test Java app that
tries to clear 500 (guaranteed amount!) entries per request. All
entries are in the same entity group and delete is executed in batch
in single transaction. All operations are executed with low level API
so no possible overhead is involved. Here is a sample code and logs:

Code (cut):
=
Query q = new Query(World.class.getSimpleName()); // create query
Iterator<Entity> i = datastoreService.prepare(q).asIterator();
idx = 0;
while (i.hasNext() && idx < 500) {
   keys.add(i.next().getKey());
   idx++;
}

// delete keys in batch
Transaction t = datastoreService.beginTransaction();
datastoreService.delete(keys);
t.commit();
==

1st request (all goes well, 500 entries removed)
-
   1.
  I 04-30 07:52AM 02.091 org.itvn.controller.TvnController
clearDbBySize: Reading 500 entity keys...
  See details
   2.
  I 04-30 07:52AM 03.832 org.itvn.controller.TvnController
clearDbBySize: Removing keys by groups, total groups: 1
   3.
  I 04-30 07:52AM 03.832 org.itvn.controller.TvnController
clearDbBySize: Trying to remove 500 entities...
   4.
  I 04-30 07:52AM 07.873 org.itvn.controller.TvnController
clearDbBySize: Removed 500 entities.

2nd request - timeout exception, on READ operation (i.hasNext())
-
   1.
  I 04-30 07:52AM 22.719 org.itvn.controller.TvnController
clearDbBySize: Reading 500 entity keys...
  See details
   2.
  W 04-30 07:52AM 26.551 Nested in
org.springframework.web.util.NestedServletException: Request
processing failed; nested exception is
com.google.appengine.api.datastore.Datas
   3.
  W 04-30 07:52AM 26.552 /clear_db/500
com.google.appengine.api.datastore.DatastoreTimeoutException:
datastore timeout: operation took too long. at
com.google.appengine.api.d
   4.
  C 04-30 07:52AM 26.555 Uncaught exception from servlet
com.google.appengine.api.datastore.DatastoreTimeoutException:
datastore timeout: operation took too long. at com.goog

Here we go, first request executes well, and the next (only a few
seconds later) fails! Note that this is only a test application, with
no load at all. Am I doing something wrong? What's the RELIABLE way to
read/remove 500 entities? Is it a problem with quantity (500)? If so
how many entities can be read without a timeout? Can someone give a
definitive answer to this? If you need more details about the app, I can

Oleg
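One common mitigation, not from the post, is to delete in smaller batches (and to use a keys-only query so entities are never materialized). The slicing helper below is a hypothetical sketch; on App Engine each yielded chunk would be one datastoreService.delete(chunk) call, keeping every RPC well under the timeout:

```python
def chunked(keys, size):
    """Yield successive size-sized slices of keys."""
    for i in range(0, len(keys), size):
        yield keys[i:i + size]

# e.g. each chunk of 100 keys becomes one small delete call,
# instead of one delete of 500 keys that risks timing out.
```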









[google-appengine] How do I set the time between requests for login to Google account?

2009-04-30 Thread Paul Moore

I have a small application which uses Google accounts for login. At
the moment, it prompts fairly regularly (daily, I suspect, although I
haven't done any detailed tests) for me to log in using my Google
account. I'm not setting anything specific, I'm basically using
defaults for pretty much everything. I'd like to extend the login
expiry, so that (like gmail) I'm only asked to resupply my password on
something like a monthly basis.

Is this possible? If so, how do I do it? I've checked the
documentation and done some searches, so assuming this is in the docs
somewhere, a pointer to the bits that I've missed would be great.

Thanks,
Paul




[google-appengine] Index serving not true

2009-04-30 Thread Santiago

Hi,

Yesterday I received this error in a function that queried the
datastore

Traceback (most recent call last):
  File /base/python_lib/versions/1/google/appengine/ext/webapp/
__init__.py, line 501, in __call__
handler.get(*groups)
  File /base/data/home/apps/taksee-2/1.333100565109818741/rest/
appengine.py, line 46, in get
self.dispatch_request()
  File /base/data/home/apps/taksee-2/1.333100565109818741/rest/
appengine.py, line 74, in dispatch_request
obj = self.dispatch(self.request.path, self.request.method,
params)
  File /base/data/home/apps/taksee-2/1.333100565109818741/rest/
__init__.py, line 95, in dispatch
out = action(**params)
  File /base/data/home/apps/taksee-2/1.333100565109818741/cities.py,
line 98, in cities_bulk
return self.get_bulk(City, last_updated, request_id,length)
  File /base/data/home/apps/taksee-2/1.333100565109818741/cities.py,
line 85, in get_bulk
if len(list(all_to_download)) == 0:
  File /base/python_lib/versions/1/google/appengine/ext/db/
__init__.py, line 1468, in next
return self.__model_class.from_entity(self.__iterator.next())
  File /base/python_lib/versions/1/google/appengine/api/
datastore.py, line 1549, in next
self.__buffer = self._Next(self._BUFFER_SIZE)
  File /base/python_lib/versions/1/google/appengine/api/
datastore.py, line 1538, in _Next
raise _ToDatastoreError(err)
  File /base/python_lib/versions/1/google/appengine/api/
datastore.py, line 1965, in _ToDatastoreError
raise errors[err.application_error](err.error_detail)
Timeout: datastore timeout: operation took too long.

I assumed the problem was that there was no index built for the query
so I manually updated the index.yaml file and uploaded it.

- kind: Taxi
  properties:
  - name: status
  - name: last_updated


- kind: City
  properties:
  - name: status
  - name: last_updated

The new index is now in serving status; however, I am still getting
the "operation took too long" message. I am fairly sure that the
problem is the index because the function is the same for both
entities: Taxi and City. However, it works for the Taxi entity and not
for the City entity.

It has been 20 hours since I updated the indexes. Is there any way that
I can know whether the new index has been built?

Thanks,




[google-appengine] url mapping for url with hash

2009-04-30 Thread puff

I have a URL of the form 
http://localhost:8080/gw/map#lat=42.371227435069805&lon=-71.35208129882812&zoom=11.
Note that the URL contains a hash.

How do I match this in the URL mapping?  I've tried various variations
of '/gw/map(.*)' without success.

Thanks for any help.

Regards,
puff




[google-appengine] Wildcards in JDO

2009-04-30 Thread Basil Dsouza

Hello,

I am a newbie in JDO (been trying it for about 2 days) and I am having
trouble writing queries to retrieve data. I don't know if this is a
problem specific to the Google-supported implementation of JDO or
something I am doing wrong.

I am trying to write a WHERE clause containing the equivalent of the
LIKE keyword. Specifically I am doing:

query.setFilter("firstName.matches(firstNameParam) && lastName.matches(lastNameParam)");

This is what I found on the net, though it doesn't work (http://
www.theserverside.com/tt/articles/article.tss?l=JDOQueryPart1)

The error I get is:
Apr 30, 2009 9:51:27 PM com.google.apphosting.utils.jetty.JettyLogger
warn
WARNING: Nested in org.apache.jasper.JasperException:
org.datanucleus.store.appengine.query.DatastoreQuery
$UnsupportedDatastoreFeatureException: Problem with query SELECT FROM
com.basildsouza.odometer.datalayer.dataobjects.User WHERE
firstName.matches(firstNameParam) && lastName.matches(lastNameParam)
PARAMETERS String firstNameParam, String lastNameParam ORDER BY
lastName asc, firstName asc: Unexpected expression type while parsing
query: org.datanucleus.query.expression.InvokeExpression:
org.datanucleus.store.appengine.query.DatastoreQuery
$UnsupportedDatastoreFeatureException: Problem with query SELECT FROM
com.basildsouza.odometer.datalayer.dataobjects.User WHERE
firstName.matches(firstNameParam) && lastName.matches(lastNameParam)
PARAMETERS String firstNameParam, String lastNameParam ORDER BY
lastName asc, firstName asc: Unexpected expression type while parsing
query: org.datanucleus.query.expression.InvokeExpression


I have tried the LIKE clause directly and, predictably, that didn't
work.

Any clues as to how I should get this working? Or should I be asking
this on a JDO forum?

Thanks and Regards,
Basil Dsouza
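For what it's worth, the datastore has no LIKE/matches support in any runtime; the standard workaround for "starts with" queries is a range filter on the property: name >= prefix AND name < prefix + u'\ufffd'. The pure-Python predicate below just mirrors what that pair of filters selects (an assumption about the use case, not JDO code):

```python
def matches_prefix(value, prefix):
    """True when value starts with prefix, per the range-filter trick."""
    return prefix <= value < prefix + u'\ufffd'
```

Note this only covers prefix matching; arbitrary substring search needs a different data layout (e.g. an indexed list of tokens).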




[google-appengine] Re: datastore timeout every time - on reads, not writes

2009-04-30 Thread Liang Zhao

Each request can only run for 30 seconds on the server side,

but in the development server there is no such limit...

On Wed, Apr 29, 2009 at 11:56 PM, tayknight taykni...@gmail.com wrote:

 And, I should add, this works perfecty (and quickly) from the
 development server's datastore.

 On Apr 29, 8:31 am, tayknight taykni...@gmail.com wrote:
 I have a problem. I'm getting datastore timeouts when doing reads. The
 code finished about 5% of the time. The code looks like:

 alerts = Alert.all().filter('expires >=', datetime.datetime.now())
 # get active alerts
 for alert in alerts:
   # get the db.Keys from the ListProperty
   zones = ZoneMaster.get(alert.zones)
   for zone in zones:
     if zone:
       # get the users for this zone
       if zone.siteusers:
         us = SiteUser.get(zone.siteusers)
         for u in us:
           if u:
             self.response.out.write(u.name + '<br />')

 The Model looks like:
 class Alert(db.Model):
   effective = db.DateTimeProperty()
   expires = db.DateTimeProperty()
   zones = db.ListProperty(db.Key)

 class ZoneMaster(db.Model):
   siteusers = db.ListProperty(db.Key)

 class SiteUser(db.Model):
   name = db.StringProperty()
   zone = db.ReferenceProperty(ZoneMaster)

 This code is repeatedly timing out with a "Timeout: datastore timeout:
 operation took too long." error.
 I'm not doing any writes. All the reads are by key (that come from a
 ListProperty). Why would this be timing out?

 Thanks.
 




-- 

Cheers!

Liang Zhao




[google-appengine] Invalid runtime specified

2009-04-30 Thread Luca

Hi,

I am following the Using the Google Plugin for Eclipse tutorial. I
have built the very simple hello world application. When I try to
deploy using the built in eclipse deploy feature I get the following:

Creating staging directory
Scanning for jsp files.
Scanning files on local disk.
Initiating update.
Unable to upload:
java.io.IOException: Error posting to URL:
http://appengine.google.com/api/appversion/create?app_id=phenomtek&version=1
400 Bad Request
Invalid runtime specified.

at com.google.appengine.tools.admin.ServerConnection.send
(ServerConnection.java:114)
at com.google.appengine.tools.admin.ServerConnection.post
(ServerConnection.java:66)
at com.google.appengine.tools.admin.AppVersionUpload.send
(AppVersionUpload.java:345)
at com.google.appengine.tools.admin.AppVersionUpload.beginTransaction
(AppVersionUpload.java:159)
at com.google.appengine.tools.admin.AppVersionUpload.doUpload
(AppVersionUpload.java:68)
at com.google.appengine.tools.admin.AppAdminImpl.update
(AppAdminImpl.java:41)
at com.google.appengine.eclipse.core.proxy.AppEngineBridgeImpl.deploy
(AppEngineBridgeImpl.java:203)
at
com.google.appengine.eclipse.core.deploy.DeployProjectJob.runInWorkspace
(DeployProjectJob.java:97)
at org.eclipse.core.internal.resources.InternalWorkspaceJob.run
(InternalWorkspaceJob.java:38)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
java.io.IOException: Error posting to URL:
http://appengine.google.com/api/appversion/create?app_id=phenomtek&version=1
400 Bad Request
Invalid runtime specified.

I have registered and received a confirmation SMS... Am I waiting for an
email to say the Java runtime has been deployed for this account?

Or is there some setting I need to make in the project?




[google-appengine] Configuring SDK in Eclipse (Mac)

2009-04-30 Thread dannyr


I'm trying to follow the following guide:
http://code.google.com/eclipse/docs/creating_new_webapp.html.

However, I'm having trouble configuring the location of the SDK for
Google App Engine in Eclipse. I downloaded the Google App Launcher and
tried using that but Eclipse won't recognize it.

Is there anything I'm missing? Is this guide outdated?




[google-appengine] Re: subdomain - www is not able to add for google appengine

2009-04-30 Thread toh

Hi

I had the same experience about a year ago.  When I tried it again a
couple of months ago, it just magically worked.

I know I am not helping you in any way, but now you know it's not just
you...

On Apr 26, 4:12 am, VIJI vijiconn...@gmail.com wrote:
 Hello,

      I tried to point www.mydoman.com to my Google App Engine
 application. But the subdomain www is not getting added in the Google
 application service settings page.

               If I try to add 'www' as a subdomain, there is no error
 message or any response. Could anyone please help regarding this?




[google-appengine] Able to login appengine in firefox but not in internetexplorer 7,8 and crome

2009-04-30 Thread Chinmay

I am able to log in to App Engine in Firefox but not in Internet
Explorer 7, 8 and Chrome. What could be the problem?

Due to this, when I try to deploy through the Eclipse plugin I get
an unauthorized 401 error: You must authenticate first.

Thanks in advance




[google-appengine] Google App Engine use in Sub Saharan Africa

2009-04-30 Thread Martin

Hello fellows,

I want to kick of a discussion about Google App Engine use in Sub
Saharan Africa. Following questions come to my mind:

1. How stable is the availability of apps in Sub-Saharan Africa? Does
Google have server farms there, or is the traffic just routed to a
data center somewhere in the Middle East?

2. How can African entrepreneurs handle checkout? Google Checkout does
not support many African countries, so do they have to find an agent,
e.g. in the US, to do the billing for them?

Thanks for listening,

Martin




[google-appengine] Erlang OTP?

2009-04-30 Thread Kristofer


Is there any chance we can get an Erlang-based version of GAE? I'm sure
y'all (mother Google) have plenty of Erlang expertise within the
virtual walls, and it'd be damn handy if we could host Erlang apps in
GAE, especially since you've figured out how to do the same with
Java.

cheers!




[google-appengine] Re: Timeout: datastore timeout: operation took too long.

2009-04-30 Thread Sylvain

For my app, I never fetch more than 250 entities, because I've seen
that if this value is bigger you get too many datastore timeouts.
But even with 250 entities (of a very basic kind) I sometimes get a
timeout.

One funny thing is that you can nominally fetch up to 1000 entities
(whatever the kind, number of attributes, ...), but in fact it often
fails with a timeout.

On 30 avr, 17:45, barabaka oleg.g...@gmail.com wrote:
 Well, I've read a lot of posts about google datastore and the problems
 with batch operations, relational approach to arrange data in bigtable
 etc. but I always thought the problem wasn't in datastore itself but
 in the way people use it. Now I can see with my experience that it
 acts just in an unpredictable way. I deployed a test java app that
 tries to clear 500 (guaranteed amount!) entries per request. All
 entries are in the same entity group and delete is executed in batch
 in single transaction. All operations are executed with low level API
 so no possible overhead is involved. Here is a sample code and logs:

 Code (cut):
 =
 Query q = new Query(World.class.getSimpleName()); // create query
 Iterator<Entity> i = datastoreService.prepare(q).asIterator();
 idx = 0;
 while (i.hasNext() && idx < 500) {
    keys.add(i.next().getKey());
    idx++;
 }

 // delete keys in batch
 Transaction t = datastoreService.beginTransaction();
 datastoreService.delete(keys);
 t.commit();
 ==

 1st request (all goes well, 500 entries removed)
 -
    1.
       I 04-30 07:52AM 02.091 org.itvn.controller.TvnController
 clearDbBySize: Reading 500 entity keys...
       See details
    2.
       I 04-30 07:52AM 03.832 org.itvn.controller.TvnController
 clearDbBySize: Removing keys by groups, total groups: 1
    3.
       I 04-30 07:52AM 03.832 org.itvn.controller.TvnController
 clearDbBySize: Trying to remove 500 entities...
    4.
       I 04-30 07:52AM 07.873 org.itvn.controller.TvnController
 clearDbBySize: Removed 500 entities.

 2nd request - timeout exception, on READ operation (i.hasNext())
 -
    1.
       I 04-30 07:52AM 22.719 org.itvn.controller.TvnController
 clearDbBySize: Reading 500 entity keys...
       See details
    2.
       W 04-30 07:52AM 26.551 Nested in
 org.springframework.web.util.NestedServletException: Request
 processing failed; nested exception is
 com.google.appengine.api.datastore.Datas
    3.
       W 04-30 07:52AM 26.552 /clear_db/500
 com.google.appengine.api.datastore.DatastoreTimeoutException:
 datastore timeout: operation took too long. at
 com.google.appengine.api.d
    4.
       C 04-30 07:52AM 26.555 Uncaught exception from servlet
 com.google.appengine.api.datastore.DatastoreTimeoutException:
 datastore timeout: operation took too long. at com.goog

 Here we go, first request executes well, and the next (only a few
 seconds later) fails! Note that this is only a test application, with
 no load at all. Am I doing something wrong? What's the RELIABLE way to
 read/remove 500 entities? Is it a problem with quantity (500)? If so
 how much entities could be read without timeout? Can someone give the
 reasonable answer to this? If you need more details about app, I can
 share this test case in public.

 Oleg



[google-appengine] Re: Google App Engine use in Sub Saharan Africa

2009-04-30 Thread Wooble



On Apr 30, 8:08 am, Martin martin.konz...@gmail.com wrote:
 1. How stable is the availability of apps down in Sub Saharan Africa;
 does Google have server-farms there, or is the traffic just routed to
 data-center somewhere in the Middle East?

I don't believe Google's going to reveal any information on server
location, but my impression is that most if not all of the servers
doing App Engine serving now are in the US.  I could be completely
wrong.



[google-appengine] Re: url mapping for url with hash

2009-04-30 Thread Barry Hunter

The hash part of the URL is not sent to the server at all.


You need to read it with Javascript.
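
To see the split concretely, a URL parser treats the fragment as a separate component that is never part of the request path. Shown here with Python 3's `urllib.parse` (in 2009's Python 2.5 runtime the equivalent module was `urlparse`):

```python
from urllib.parse import urlparse

url = ('http://localhost:8080/gw/map'
       '#lat=42.371227435069805&lon=-71.35208129882812&zoom=11')
parts = urlparse(url)

# The fragment is a separate component; a browser never sends it in
# the HTTP request, so the server-side URL mapping only sees the path.
print(parts.path)      # prints: /gw/map
print(parts.fragment)  # prints: lat=42.371227435069805&lon=-71.35208129882812&zoom=11
```

This is why no variation of '/gw/map(.*)' can ever match the hash part on the server.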


On 30/04/2009, puff rbell01...@gmail.com wrote:

  I have a URL of the form
 http://localhost:8080/gw/map#lat=42.371227435069805&lon=-71.35208129882812&zoom=11.
  Note that the URL contains a hash.

  How do I match this in the URL mapping?  I've tried various variations
  of '/gw/map(.*)' without success.

  Thanks for any help.

  Regards,
  puff

  



-- 
Barry

- www.nearby.org.uk - www.geograph.org.uk -




[google-appengine] Re: Still no full-text search? Mystified by the priorities.

2009-04-30 Thread Waldemar Kornewald

While SearchableModel itself is rather limited, the principle behind it
can be improved a lot. We'd be willing to sell our search app
(currently only for Django, but webapp support is planned). It comes with:
* ability to only index certain properties (instead of all string
properties as with SearchableModel)
* Porter Stemmers for English and German (you can search for 'cheap
cars' and find results with 'cheap car')
* word prefix search (match anything starting with ...)
* values index (allows for searching all values of a certain property;
e.g.: automatically generate a list of all tags of your blog posts and
make the tags themselves searchable for auto-completion)
* auto-completion via jQuery/AJAX for prefix search and values index
* easy to use views and templates for showing search results
* key-based pagination (only browsing entities without search
capability, though)
* some kind of coarse-grained sorting of results

In case you wondered, it does have all of App Engine's limitations:
* no result sorting
* no ranking
* no more than ca. 5000 unique words per entity
But with the integrated Porter stemmer you get much better search
results than with SearchableModel, and you can make your website easier
to use by integrating auto-completion with just a few lines of code.
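
For illustration, the core of a word-prefix index can be sketched in a few lines of plain Python. This is my own sketch of the general technique (index every prefix at write time so lookups become exact matches), not the actual app's code:

```python
from collections import defaultdict

def build_prefix_index(words, min_len=2):
    """Map every prefix (length >= min_len) to the words starting with it.

    Indexing prefixes at write time turns prefix search into a simple
    equality lookup, which fits the datastore's query model.
    """
    index = defaultdict(set)
    for word in words:
        word = word.lower()
        for end in range(min_len, len(word) + 1):
            index[word[:end]].add(word)
    return index

tags = ['cheap', 'cars', 'car', 'carpet']
index = build_prefix_index(tags)
print(sorted(index['car']))  # prints: ['car', 'carpet', 'cars']
```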

We want to set up a demo site so you can see it in action. We plan to
publish everything in May.

Regarding the price: This package will be available for a one-time fee
and you can use it for an unlimited number of developers (i.e.: *no*
yearly per-developer license fee). Minor release upgrades will be
available for free, of course. Possibly, we might also provide a free
upgrade to a search app which is adapted to Google's full-text search
API when that gets released.

If you're interested and want to learn more please contact me at
wkornewald[at]gmail[dot]com

Bye,
Waldemar Kornewald



[google-appengine] Re: Still no full-text search? Mystified by the priorities.

2009-04-30 Thread dalenewman

Looks like the Java community already has this search business all
figured out :-)

http://www.kimchy.org/searchable-google-appengine-with-compass/

I guess this Compass thing must use Lucene and store the Lucene
indexes in the GAE datastore.  Reminder: this is a guess based on a
quick skim of the blog post (link above).

Only time will tell if this works out.

Dale



[google-appengine] Re: I need help with passing variable values from datastore to html files.

2009-04-30 Thread Aaron

This is what I have so far:

import os
import random

from google.appengine.api import memcache
from google.appengine.ext.db import Query
from django.template import Context
from django.template.loader import get_template

def get_ad():
  total_count = Ad.all().count()
  all_ads = Query(Ad).fetch(1000)
  memcache.add('total_count_of_ads', total_count)

  # assume the total number of ads to be less than 1000 for now
  random_index = random.randint(0, total_count - 1)
  my_random_ad = all_ads[random_index]

  # if you are using Django templates, it can be passed to the HTML form as
  template_values = {'ad_url': my_random_ad.adurl}

  # and then pass these template values and render the template
  template_path = os.path.join(
      os.path.dirname(os.path.dirname(__file__)), 'templates', 'root.html')
  t = get_template(template_path)
  html = t.render(Context(template_values))
  return html



[google-appengine] Re: Still no full-text search? Mystified by the priorities.

2009-04-30 Thread Waldemar Kornewald

On Apr 30, 10:27 pm, dalenewman dalenew...@gmail.com wrote:
 Looks like the java community already has this search business all
 figured out :-)

 http://www.kimchy.org/searchable-google-appengine-with-compass/

 I guess this Compass thing must use Lucene and store the Lucene
 indexes in the GAE data store.  Reminder; this is a guess based on a
 quick skim of the blog post (link above).

 Only time will tell if this works out.

I highly doubt that the index can be updated efficiently in a single
request. This might work for a handful of entities on the local
development server, but I'm sure it'll quickly break down once you have
a few hundred or thousand items. Otherwise, if it were that trivial,
Google could've provided that feature a long time ago.

I could imagine that if you used a script to remotely update the
search index you could actually get acceptable search performance
(i.e., on the already-built index), but I don't have any hard numbers
here and Shay Banon didn't know if the port would perform well on App
Engine, either.

Bye,
Waldemar Kornewald
--
Use Django on App Engine with app-engine-patch:
http://code.google.com/p/app-engine-patch/



[google-appengine] Re: Index Building for a long time

2009-04-30 Thread Jason (Google)
Hi Tim. What is your application ID? It's possible that your indexes are in
a stuck state, which happens from time to time.

- Jason

On Sat, Apr 25, 2009 at 3:38 PM, timwee tim.sh...@gmail.com wrote:


 I was wondering if anyone is facing the same issue as me...
 I have indexes on an app engine app that has been building for a long
 time.
 I don't really need a couple of those indexes anymore as part of model/
 entity changes, but I think it could be blocking the other ones that I
 need from building.
 Has this happened before to anyone else?
 how did you guys resolve it?

 Thanks,
 Tim

 





[google-appengine] 'Too many repeated redirects' from The New York Times (and other sites)

2009-04-30 Thread James

I've discovered that when you urlfetch() a page from http://www.nytimes.com,
the returned document has status code 301, but when follow_redirects
is true, it returns a 'Too many repeated redirects' error.

example call:

from google.appengine.api import urlfetch
fetch_page = urlfetch.fetch(
    'http://www.nytimes.com/2009/04/19/jobs/19pre.html?_r=1',
    follow_redirects=True)


I'm guessing this might be because of the request user-agent or
something. Has anyone run into this sort of problem?
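
A plausible cause is that the site 301-redirects every request that lacks an expected cookie or User-Agent, producing a genuine loop. The loop-detection idea can be sketched in plain Python; this is illustrative only, not urlfetch's actual implementation:

```python
def resolve_redirects(redirects, url, limit=10):
    """Follow a url -> next-url mapping, failing on loops.

    A site that always 301s back into the same cycle (e.g. because it
    dislikes the client's User-Agent or expects a cookie it never gets)
    trips the loop check, which is what urlfetch's error describes.
    """
    seen = {url}
    for _ in range(limit):
        nxt = redirects.get(url)
        if nxt is None:
            return url  # no further redirect: this is the final URL
        if nxt in seen:
            raise RuntimeError('Too many repeated redirects')
        seen.add(nxt)
        url = nxt
    raise RuntimeError('Too many repeated redirects')

# A cookie-gated site can behave like an endless A -> B -> A cycle:
loop = {'A': 'B', 'B': 'A'}
```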




[google-appengine] Re: google eclipse plugin crush

2009-04-30 Thread Jason (Google)
These are odd issues. I haven't run into this myself or heard from others
with this problem, but you may want to cross-post in the GWT discussion
forum to see if anyone else has experienced something similar. If this is
only happening for the one project, then I would create a new project and
abandon the first -- hopefully it was just a fluke. If it's happening for
all new projects, you may want to try re-installing Eclipse and/or the
Google Plugin for Eclipse.

- Jason

On Tue, Apr 28, 2009 at 1:03 AM, ytbryan ytbr...@gmail.com wrote:


 hi all,

 something strange happened to my gwt application build using google
 eclipse plugin

 it was working fine yesterday but now, i am getting this error when i
 run.

 [DEBUG] Loading an instance of module 'visualise'

 [ERROR] Unable to load module entry point class
 com.visual.client.Visualise (see associated exception for details)
 com.google.gwt.core.client.JavaScriptException: (TypeError): Undefined
 value
  line: 28
  sourceURL: jar:file:/Users/ytbryan/Desktop/jars/gwt-
 visualization.jar!/com/google/gwt/visualization/client/AjaxLoader.java
at com.google.gwt.visualization.client.AjaxLoader.loadApi(Native
 Method)
at
 com.google.gwt.visualization.client.AjaxLoader.loadVisualizationApi
 (AjaxLoader.java:38)
at
 com.google.gwt.visualization.client.AjaxLoader.loadVisualizationApi
 (AjaxLoader.java:31)
at com.visual.client.Visualise.onModuleLoad(Visualise.java:106)


 [ERROR] Failure to load module 'visualise'


 and one more serious error: my build path went haywire. All my jars
 were gone and my App Engine SDK went missing.

 it took me a while to discover that, and adding them back solved the
 problem. Does anybody else have this problem?

 I am using the latest Google Plugin for Eclipse, Eclipse 3.4, and Mac OS X.
 





[google-appengine] Re: os.mkdir, os.makedirs

2009-04-30 Thread Jason (Google)
Yes, but keep in mind that datastore entities are currently limited to 1 MB
in size.

- Jason

On Tue, Apr 28, 2009 at 5:07 AM, Barry Hunter
barrybhun...@googlemail.comwrote:


 On 28/04/2009, sahid sahid.ferdja...@gmail.com wrote:
 
   Hello,
 
   ok, so...
 
   all users have a possibility to upload many images
   and in a future other medias.
 
   i seek how to organize these data into my app.

 All data should go into the datastore.

 See
 http://code.google.com/appengine/articles/images.html

 
   Thanks
 
 
 
   On Apr 28, 1:00 pm, Tim Hoffman zutes...@gmail.com wrote:
Hi Sahid
   
I am not sure what you mean by 'arborescence'  (dictionary means a
tree)
What sort of user data are you trying to store and what do you want to
do with the data.
Why would it need to be a blob.
   
can't you create user based entities with a selection of properties
(of which some could be blobs)
You could certainly create a tree of entities in the datastore if that
is the most accurate way to map you data requirements.
   
You are probably better off trying to describe what you are trying to
achieve functionally
   
Rgds
   
T
   
On Apr 28, 5:14 pm, sahid sahid.ferdja...@gmail.com wrote:
   
 Ok Tim,
 thank for your response.
   
 so what is a best practice in app engine
 for create an arborescence with all data of my users.
   
 i must to use Datastore with blob property ?
   
 On Apr 28, 3:27 am, Tim Hoffman zutes...@gmail.com wrote:
   
  Yep not supported.
   
  You would need to have your entire directory created before you
 deploy
  (I doubt deploying will create empty directories )
   
  Remember you can't write to the filesystem at all, so doing a
 mkdir in
  code
  isn't very useful in gae.
   
  Rgds
   
  T
   
  On Apr 28, 2:16 am, sahid sahid.ferdja...@gmail.com wrote:
   
   Hello,
   
   appengine support, os.makedirs ?
   because i have this probleme :
   
   import os
   os.makedirs (PATH)
   
   'module' object has no attribute makedirs
   
 


 --
 Barry

 - www.nearby.org.uk - www.geograph.org.uk -

 





[google-appengine] Re: Making the datastore readonly temporarily

2009-04-30 Thread Jason (Google)
I second djidjadji's suggestion. While it's true that your application will
be unavailable, it's definitely the simplest solution and gives your
users a clear expectation that some of your application's functionality is
not available; otherwise, they might be confused about why some aspects of
your site work but others don't, especially if your messaging is subtle.

The only other suggestion is to wrap all of your write calls to the
datastore so you can easily disable them altogether. That's trickier than
the first solution, however.

- Jason

2009/4/28 Alkis Evlogimenos ('Αλκης Ευλογημένος) evlogime...@gmail.com

 Yes but this means no access to the site for the duration. I want to have
 read only access to the site for the duration.

 2009/4/28 djidjadji djidja...@gmail.com


 If you don't have to do it often you can use the following method.

 Make a version of the application that displays a page saying that the site
 is temporarily under maintenance. Give an estimate of how long it
 will take.
 app.yaml redirects all requests to maintenance.py

 Find a time of day where the site is less busy.
 Make the maintenance version current.
 Update version X to the new schema.
 Do the update using  urls http://X.latest.myapp.appspot.com
 Test the update
 Make X the new version.

 This is the least hassle, I think.

 2009/4/28 Alkis Evlogimenos ('Αλκης Ευλογημένος) evlogime...@gmail.com:
  Sometimes you want to make the datastore readonly for users to perform
 some
  global changes (say schema update).
  How do people achieve this?
  Out of what I can think of:
  - Do you write another version of your application that errors on each
  request that writes to the datastore? This seems error prone and a
  maintenance headache.
  - Do you monkeypatch db.put and db.delete to unconditionally throw an
  exception and make that exception visible to the frontend?
  - Do you use hooks and pre hook datastore operations to throw an
 exception
  and make that exception visible to the frontend?
  Any other ideas?
  --
 
  Alkis





 --

 Alkis

 





[google-appengine] Re: How to reset my application in Google App Engine to default status (empty status)

2009-04-30 Thread Jason (Google)
You can also completely remove the content of your application files and
then re-deploy. This effectively clears the application.

- Jason

On Tue, Apr 28, 2009 at 6:15 AM, djidjadji djidja...@gmail.com wrote:


 2009/4/27 Eric Tsang erictsan...@gmail.com:
  For example ...
  1. To Clear the content of Application to empty 

 Use an empty application version. app.yaml redirects to a non-existent
 .py file

  2. to Clear the content of DataStore to empty 

 Use the dashboard

 http://appengine.google.com/dashboard?app_id=XXX

 For a big datastore you can use remote_api, see the article about it.

 





[google-appengine] Re: Making the datastore readonly temporarily

2009-04-30 Thread 'Αλκης Ευλογημένος
I ended up with this:

from google.appengine.api import apiproxy_stub_map, users
from google.appengine.ext import db

class ReadOnlyError(db.Error):
  pass

def make_datastore_readonly():
  """Throw ReadOnlyError on put and delete operations."""
  def hook(service, call, request, response):
    assert service == 'datastore_v3'
    if users.is_current_user_admin():
      return
    if call in ('Put', 'Delete'):
      raise ReadOnlyError('Datastore is in read-only mode')

  apiproxy_stub_map.apiproxy.GetPreCallHooks().Push(
      'readonly_datastore', hook, 'datastore_v3')

And I special-cased the ReadOnlyError exception in the read handler to return
a 503 error page saying the site is in read-only mode. Then I did the
migration, and everything went fine :-)
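
The hook mechanism itself can be illustrated without App Engine: a tiny stand-in proxy that runs pre-call hooks before dispatching a call. Purely illustrative; the real object is apiproxy_stub_map.apiproxy, and the class here is invented for the sketch:

```python
class ReadOnlyError(Exception):
    pass

class ApiProxy:
    """Tiny stand-in for App Engine's apiproxy to show the hook pattern."""

    def __init__(self):
        self._pre_hooks = []

    def push_pre_hook(self, hook):
        self._pre_hooks.append(hook)

    def make_call(self, service, call, request=None, response=None):
        # Every registered pre-call hook runs first and may veto the call
        # by raising; only then is the real service invoked.
        for hook in self._pre_hooks:
            hook(service, call, request, response)
        return 'ok'  # stand-in for the real RPC result

def readonly_hook(service, call, request, response):
    if service == 'datastore_v3' and call in ('Put', 'Delete'):
        raise ReadOnlyError('Datastore is in read-only mode')

proxy = ApiProxy()
proxy.push_pre_hook(readonly_hook)
print(proxy.make_call('datastore_v3', 'Get'))  # prints: ok
```

Reads pass through untouched; only Put and Delete are blocked, which is exactly the "site stays browsable during migration" behavior described above.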


2009/5/1 Jason (Google) apija...@google.com

 I second djidjadji's suggestion. While it's true that your application
 will be unavailable, it's definitely the simplest solution and gives your
 users a clear expectation that some of your application's functionality is
 not available; otherwise, they might be confused why some aspects of your
 site work but others don't, especially if your messaging is subtle.

 The only other suggestion is to wrap all of your write calls to the
 datastore so you can easily disable them altogether. That's tricker than the
 first solution, however.

 - Jason


 2009/4/28 Alkis Evlogimenos ('Αλκης Ευλογημένος) evlogime...@gmail.com

 Yes but this means no access to the site for the duration. I want to have
 read only access to the site for the duration.

 2009/4/28 djidjadji djidja...@gmail.com


 If you don't have to do it often you can use the following method.

 Make a version of the application that displays a page that the site
 is temporarily under maintenance. Give an estimate for how long it
 will take.
 app.yaml redirects all requests to maintenance.py

 Find a time of day where the site is less busy.
 Make the maintenance version current.
 Update version X to the new schema.
 Do the update using  urls http://X.latest.myapp.appspot.com
 Test the update
 Make X the new version.

 This is the least hassle, I think.

 2009/4/28 Alkis Evlogimenos ('Αλκης Ευλογημένος) evlogime...@gmail.com
 :
  Sometimes you want to make the datastore readonly for users to perform
 some
  global changes (say schema update).
  How do people achieve this?
  Out of what I can think of:
  - Do you write another version of your application that errors on each
  request that writes to the datastore? This seems error prone and a
  maintenance headache.
  - Do you monkeypatch db.put and db.delete to unconditionally throw an
  exception and make that exception visible to the frontend?
  - Do you use hooks and pre hook datastore operations to throw an
 exception
  and make that exception visible to the frontend?
  Any other ideas?
  --
 
  Alkis





 --

 Alkis




 



-- 

Alkis




[google-appengine] Re: Google Developer assistance needed for two issues

2009-04-30 Thread Jason (Google)
Hi Carlos. I'm glad you were able to work through your issues, and I'm sorry
they took longer than expected to resolve. There were other reports of slow
index building around the same period, so this may very well have been
related. That said, you shouldn't have to delete any data in order to work
through this type of issue, and I'm sorry it came to this in your case.
Please let me know if you have any more indexing issues in the future, and
I'll try to resolve this a bit quicker.

- Jason

On Tue, Apr 28, 2009 at 7:03 AM, Carlos Pero carlos.p...@gmail.com wrote:


 Heaven helps those who help themselves...

 On Apr 27, 1:15 pm, Carlos Pero carlos.p...@gmail.com wrote:
  One of these indexes is necessary for my site, so further development
  is currently impaired...but since my app isn't live yet, I'm
  considering deleting all the entities in the datastore and then trying
  to delete the indexes again.

 I wrote a quick handler to delete my 500 entities, and then tried
 vacuum_indexes again.  This time it did change the status from Error
 to Deleting..., and a couple of minutes later the indexes were gone!

 I was then able to recreate one of the five that I needed, so now I'm
 going to work on re-populating the datastore.

 Lessons learned:

 1.  Be very respectful of indexes, because they take a long time to
 rebuild and delete. From now on I'm going to keep an eye on the
 autogenerated indexes, move them above the line when I'm sure I
 want to keep them, and delete any other autogenerated ones before
 deploying.

 2.  Become familiar with the newest datastore backup and
 restore procedures released with the latest SDK, as they may come in
 handy in an emergency.


 





[google-appengine] Re: Strange behaviour

2009-04-30 Thread Liang Zhao

Perhaps you can click the cached link for this page in the search results to
see exactly what has been indexed on your page?

On Fri, May 1, 2009 at 2:55 AM, Nora noorhanab...@yahoo.co.uk wrote:

 Hello,
 Googlebot indexed one page on my website, but something is happening that
 I can't understand.
 What I know about indexing is that if you search for any word
 contained in a page that has been indexed, this page should be
 displayed in the search results.  For example, if my page contains the
 word 'Hello' and I search for the word 'Hello', shouldn't my page (which
 has been indexed and contains this word) be included in the search
 results?

 Can anyone advise me on what is going wrong there?

 Thank you very much,
 Nora
 




-- 

Cheers!

Liang Zhao




[google-appengine] Re: cron TimeOut

2009-04-30 Thread Jason (Google)
As Nick said, 200 URL fetches may be a bit much for a single request, so 2000
will very likely time out, yes. Of course, this partly depends on the
response time of the remote server.

While the try-catch idea is certainly more elegant, you may just want to set
up multiple cron jobs at the beginning, each hitting only a subset of the
URLs, in order to avoid possible timeouts.
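
Splitting the URL list across jobs is straightforward. A sketch, assuming round-robin assignment and hypothetical names:

```python
def cron_buckets(urls, num_jobs):
    """Split a URL list into num_jobs roughly equal buckets.

    Each cron job then handles one bucket, keeping every request
    comfortably inside the 30-second limit.
    """
    buckets = [[] for _ in range(num_jobs)]
    for i, url in enumerate(urls):
        buckets[i % num_jobs].append(url)
    return buckets

urls = ['http://example.com/%d' % i for i in range(2000)]
buckets = cron_buckets(urls, 10)
print(len(buckets), len(buckets[0]))  # prints: 10 200
```

Ten cron entries, each passing its bucket index as a query parameter, would then cover all 2000 URLs at 200 fetches per request.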

- Jason

2009/4/28 Roberto López roberto.lopez.del...@gmail.com

 If I want to request 2000 URLs, will I receive a timeout?

 2009/4/28 Tom Wu service.g2...@gmail.com

 30 seconds


 2009/4/28 Nick Johnson nick.john...@google.com


 You may be interested in the support for asynchronous URL fetching.
 Pubsubhubbub has a module for it here:

 http://code.google.com/p/pubsubhubbub/source/browse/trunk/hub/urlfetch_async.py
 . Examples of how to use it can be found elsewhere in the code.

 200 may still be too many for a single request, I'm not sure, but it's
 certainly more practical than fetching them serially. :)

 -Nick Johnson








 --
 Roberto López del Río


 





[google-appengine] Re: Problems with datastore timing out

2009-04-30 Thread Jason (Google)
Hi Jim. What is your application ID? Are you still seeing this behavior
consistently?

- Jason

On Wed, Apr 29, 2009 at 5:45 AM, Jim jdeib...@gmail.com wrote:


 I was having problems last night before I went to bed but figured it
 was temporary.   They were much worse overnight.   The log is full of
 messages like this:

  04-29 05:29AM 19.717 /url 500 9207ms 8088cpu_ms 8036api_cpu_ms
 0kb curl/7.18.2 (x86_64-pc-linux-gnu) libcurl/7.18.2 OpenSSL/0.9.8g
 zlib/1.2.3.3 libidn/1.10,gzip(gfe),gzip(gfe)
  See details


 As shown below, this just fetches one old phone number so it can be
 deleted.   I've attached it to a display function so that it's
 whittling away at the old data a number at a time.

 This normally takes about 300ms, so things are 25x slower and not
 working.

 I doubt this is just my own account but I don't see any other reports.

 GAE has now burned way through the free quota and into money.   What's
 the procedure for getting the quota reset?


  E 04-29 05:29AM 28.914

  Traceback (most recent call last):
File /base/python_lib/versions/1/google/appengine/ext/webapp/
 __init__.py, line 503, in __call__
  handler.post(*groups)
File /base/data/home/apps/opnwpcom/211.333117986628419056/
 owp.py, line 1059, in post
  x = DeleteOldNumber( 1 )
File /base/data/home/apps/opnwpcom/211.333117986628419056/
 owp.py, line 1493, in DeleteOldNumber
  numb = phone.all().fetch(1,offset=rand_int)
File /base/python_lib/versions/1/google/appengine/ext/db/
 __init__.py, line 1390, in fetch
  raw = self._get_query().Get(limit, offset)
File /base/python_lib/versions/1/google/appengine/api/
 datastore.py, line 942, in Get
  return self._Run(limit, offset)._Next(limit)
File /base/python_lib/versions/1/google/appengine/api/
 datastore.py, line 886, in _Run
  _ToDatastoreError(err)
File /base/python_lib/versions/1/google/appengine/api/
 datastore.py, line 1965, in _ToDatastoreError
  raise errors[err.application_error](err.error_detail)
  Timeout


 





[google-appengine] Re: Building a new index is taking way too long

2009-04-30 Thread Jason (Google)
Hi Alex. We definitely need to (and will) document this better, but index
build time is not directly proportional to the number of entities in the
datastore; rather, it depends on the number of other applications that are
building indexes at the same time. Since index building is managed by a
fixed number of workers, your application's indexes will take longer to
build when many applications are building indexes at once. It appears that
this particular issue affected a number of other developers earlier in the
week. We'll take a closer look at this problem and how it can be better
mitigated going forward, but it should be fairly uncommon for the most part.

- Jason

On Mon, Apr 27, 2009 at 11:30 PM, Alex Popescu 
the.mindstorm.mailingl...@gmail.com wrote:


 Is there any status update on this issue? I have noticed other posts
 on the list speaking about the same problem.

 ./alex

 On Apr 27, 2:59 pm, Alex Popescu the.mindstorm.mailingl...@gmail.com
 wrote:
  Hi,
 
  I have defined a new index for an entity based on a datetime and 1char
  string property. Currently there are only 58 such entities in the
  datastore and after 1h15min it is still not ready. I do not see any
  reasons why building the index for such a small amount of entities
  would take such a long time, except a problem with the backend.
 
  The application id: dailycloud.
 
  ./alex
 





[google-appengine] Re: datastore timeout every time - on reads, not writes

2009-04-30 Thread tayknight

Data is returned in development almost instantly. In production, the
error is returned after about 4 seconds. I get the datastore error
long before 30 seconds.
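
One likely fix for code like the quoted snippet is to collect all the keys first and issue a single batched db.get() instead of one RPC per zone and user. A runnable sketch of the key-gathering step; this is my illustration with generic names, not the poster's code:

```python
def unique_keys(list_of_key_lists):
    """Flatten nested key lists, dropping duplicates but keeping order.

    db.get() accepts a list of keys and does one batched round trip,
    so collecting every key up front and issuing a single get is far
    cheaper than a separate datastore call per entity.
    """
    seen = set()
    out = []
    for keys in list_of_key_lists:
        for key in keys:
            if key not in seen:
                seen.add(key)
                out.append(key)
    return out

print(unique_keys([['a', 'b'], ['b', 'c']]))  # prints: ['a', 'b', 'c']
```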

On Apr 29, 9:08 am, Liang Zhao alpha@gmail.com wrote:
 Each request can only run for 30 seconds on the server side,

 but on the development server there is no such limitation...





 On Wed, Apr 29, 2009 at 11:56 PM, tayknight taykni...@gmail.com wrote:

   And, I should add, this works perfectly (and quickly) from the
   development server's datastore.

  On Apr 29, 8:31 am, tayknight taykni...@gmail.com wrote:
   I have a problem. I'm getting datastore timeouts when doing reads. The
   code finishes about 5% of the time. The code looks like:

  alerts = Alert.all().filter('expires >= ', datetime.datetime.now())
  # get active alerts
  for alert in alerts:
    # get the db.Keys from the ListProperty
    zones = ZoneMaster.get(alert.zones)
    for zone in zones:
      if zone:
        # get the users for this zone
        if zone.siteusers:
          us = SiteUser.get(zone.siteusers)
          for u in us:
            if u:
              self.response.out.write(u.name + '<br />')

  The Model looks like:
  class Alert(db.Model):
    effective = db.DateTimeProperty()
    expires = db.DateTimeProperty()
    zones = db.ListProperty(db.Key)

  class ZoneMaster(db.Model):
    siteusers = db.ListProperty(db.Key)

  class SiteUser(db.Model):
    name = db.StringProperty()
    zone = db.ReferenceProperty(ZoneMaster)

  This code repeatedly times out with a "Timeout: datastore timeout:
  operation took too long." error.
  I'm not doing any writes. All the reads are by key (that come from a
  ListProperty). Why would this be timing out?

  Thanks.

 --

 Cheers!

  Liang Zhao



[google-appengine] 411 length required on POST error - Content-length header specified

2009-04-30 Thread Shedokan

I have my app here:
http://shedokan-os.appspot.com/

At the start of the app it sends a POST request to the server, but
instead of returning the content it is supposed to, it fails with the
error "POST requests require a Content-length header", even though I
did specify that header.

what's wrong?
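For anyone hitting the same error from server-side code: the header value must match the byte length of the encoded request body. A minimal Python sketch of that relationship (the URL is a hypothetical placeholder, and modern urllib is used purely for illustration; it is not the poster's client code):

```python
from urllib.parse import urlencode
from urllib.request import Request

# Hypothetical endpoint, for illustration only.
body = urlencode({"action": "load"}).encode("utf-8")
req = Request("http://your-app.example.com/api", data=body, method="POST")

# urllib would send Content-Length automatically once the request is
# opened; setting it explicitly shows what the server expects to see:
# the byte length of the encoded body.
req.add_header("Content-Length", str(len(body)))

# urllib normalizes stored header names to capitalized form.
assert req.get_header("Content-length") == str(len(body))
```

If a browser-side XMLHttpRequest is involved, the browser normally computes Content-Length itself, so a missing header usually means the request was sent with no body at all.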



[google-appengine] Re: datastore timeout every time - on reads, not writes

2009-04-30 Thread tayknight

I figured out that I needed to do:
alerts = Alert.all().filter('expires >= ',
                            datetime.datetime.now()).fetch(1000)

Apparently it is faster to do a fetch() than to iterate over the query.

On Apr 30, 7:28 pm, tayknight taykni...@gmail.com wrote:
 Data is returned in development almost instantly. In production, the
 error is returned after about 4 seconds. I get the datastore error
 long before 30 seconds.

 On Apr 29, 9:08 am, Liang Zhao alpha@gmail.com wrote:

  Each request can only run 30 seconds in server side,

  but in development server, there is no limitation on it...

  On Wed, Apr 29, 2009 at 11:56 PM, tayknight taykni...@gmail.com wrote:

   And, I should add, this works perfecty (and quickly) from the
   development server's datastore.

   On Apr 29, 8:31 am, tayknight taykni...@gmail.com wrote:
   I have a problem. I'm getting datastore timeouts when doing reads. The
   code finished about 5% of the time. The code looks like:

   alerts = Alert.all().filter('expires >= ', datetime.datetime.now())
   # get active alerts
   for alert in alerts:
     # get the db.Keys from the ListProperty
     zones = ZoneMaster.get(alert.zones)
     for zone in zones:
       if zone:
         # get the users for this zone
         if zone.siteusers:
           us = SiteUser.get(zone.siteusers)
           for u in us:
             if u:
               self.response.out.write(u.name + '<br />')

   The Model looks like:
   class Alert(db.Model):
     effective = db.DateTimeProperty()
     expires = db.DateTimeProperty()
     zones = db.ListProperty(db.Key)

   class ZoneMaster(db.Model):
     siteusers = db.ListProperty(db.Key)

   class SiteUser(db.Model):
     name = db.StringProperty()
     zone = db.ReferenceProperty(ZoneMaster)

   This code is repeatably timing out with a Timeout: datastore timeout:
   operation took too long. error.
   I'm not doing any writes. All the reads are by key (that come from a
   ListProperty). Why would this be timing out?

   Thanks.

  --

  Cheers!

  Liang Zhao



[google-appengine] Re: [appengine-java] Re: Some design Issues in appengine datastore

2009-04-30 Thread Max Ross
Hi Vijay,

Your questions are going to take some time to answer.  Some are easier than
others.  I'd recommend splitting them up into separate posts.

Max
On Thu, Apr 30, 2009 at 9:27 PM, vijay mymail.vi...@gmail.com wrote:

 Ping!!


 On Thu, Apr 30, 2009 at 5:03 PM, vijay mymail.vi...@gmail.com wrote:

 Hello all, I am working on an application and got stuck in the design
 phase; I hope you guys can help me out. I have several doubts related to
 performance and modelling.

 1#
 In my application I store some hierarchical data and I am not sure how
 to model it.

 For example, say you have data organized as:

 food
 |-- fruit
 |   |-- red
 |   |   |-- Iron ------- apples, something
 |   |   |-- Vitamins --- litchies
 |   |-- green
 |       |-- Iron ------- apples
 |       |-- Vitamin E -- guava
 |-- vegetable -- category1 -- category2 -- category3 ...
 |-- fried ------ (similar sub-tree)

 Is there a clean way to store this kind of data in App Engine? I mean,
 what should the classes and their properties be?

 In my case the sub-categories can go up to a depth of 10, with each
 level having hundreds of categories. I will be executing queries that
 look up by any node, so I can look up by fruit, or apple, or food as a
 whole.

 2#
 My application is going to have a search box where the user types
 search terms. I would like to suggest the correct word when they make a
 spelling mistake, so if they type frut I will suggest fruit.
 The way I am thinking of implementing it: for each word entered,
 compare it against each node name with a relaxation of 2, so if the
 word matches except at up to two points I go ahead and suggest it.
 Since this needs a sequential traversal of all the nodes, which may
 take a lot of time and resources, I think I should store the names
 somewhere for faster lookup, such as memcache. What are your
 suggestions?
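The "matches except at two points" idea is classic edit distance. A minimal, self-contained Python sketch of that comparison step (plain Python, independent of App Engine; the vocabulary list is a made-up example):

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                   # deletion
                           cur[j - 1] + 1,                # insertion
                           prev[j - 1] + (ca != cb)))     # substitution
        prev = cur
    return prev[-1]

def suggest(word, vocabulary, max_dist=2):
    """Return vocabulary words within max_dist edits of `word`."""
    return [w for w in vocabulary if edit_distance(word, w) <= max_dist]

print(suggest("frut", ["fruit", "vegetable", "food"]))  # ['fruit']
```

As the poster suspects, this is a linear scan over the whole vocabulary; caching the name list (e.g. in memcache) avoids re-reading the datastore on every keystroke, but the per-word comparison cost remains.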


 3#
 Are there any standard APIs to send SMS from an application? I asked
 about this in another thread but haven't got any reply; I hope I will
 get some answers this time.

 I am quite new to website design, and any other suggestions or pointers
 to general design principles would be great.


 Regards,
 Vijay



 





[google-appengine] Re: [appengine-java] Re: Some design Issues in appengine datastore

2009-04-30 Thread 风笑雪
Why don't you make the category a property of your model?
class Food(db.Model):
  category1 = ...
  category2 = ...
  ...

Food(category1='fruit', category2='red', ...)

2009/5/1 Max Ross maxr+appeng...@google.com

 Hi Vijay,

 Your questions are going to take some time to answer.  Some are easier than
 others.  I'd recommend splitting them up into separate posts.

 Max

 On Thu, Apr 30, 2009 at 9:27 PM, vijay mymail.vi...@gmail.com wrote:

 Ping!!


 On Thu, Apr 30, 2009 at 5:03 PM, vijay mymail.vi...@gmail.com wrote:

 Hello All,I am working on an application and got stuck in design phase I
 hope you guyz can help me out. I have several doubts related to performance
 and modelling.

 1#
 In my application i store some hierarchical data and not sure how to do
 it.

 for e.g. say you have data organized as

  food
  |-- fruit
  |   |-- red
  |   |   |-- Iron ------- apples, something
  |   |   |-- Vitamins --- litchies
  |   |-- green
  |       |-- Iron ------- apples
  |       |-- Vitamin E -- guava
  |-- vegetable -- category1 -- category2 -- category3 ...
  |-- fried ------ (similar sub-tree)

 Is there a clean way to store this kind of data in appengine. I mean what
 should be classes , their properties etc.

 In my case the sub categories can upto a depth of 10, with each level
 having 100s of categories.
 I will be executing queries to lookup by any node, so basically i can do
 lookup by fruit or apple or food as a whole.

 2#
 My applcation is going to have a search box where user will be writing
 his search items, I would like to suggest them the correct word if they do a
 spelling mistake, so if they type frut i wil suggest them to type fruit.
 The way i am thinking of implementing it is to for each word entered do a
 comparison with each node element with a relaxation of 2, so if the word
 matched except at two points i will go ahead and suggest it. I think as i
 need to have sequential traversal of all the nodes, which may take a lot of
 time and resources, I should store it somewhere to do faster lookup like in
 memcache, what are your suggestions.


 3#
 Are there any standard APIs to send sms from application. I asked about
 this in other thread but haven't got any reply, I hope I will get some
 answers this time.

 I am quiet new to web site designing and any other suggestion or some
 pointers to general design principles would be great.


 Regards,
 Vijay






 





[google-appengine] Re: [appengine-java] Re: Some design Issues in appengine datastore

2009-04-30 Thread Pranav Prakash

Hi Vijay,

You can try the following database model.

class FoodAndAll(db.Model):
  name = db.StringProperty()
  # 'parent' clashes with db.Model's built-in parent() method,
  # so use a different property name:
  parent_category = db.SelfReferenceProperty()
  ...
  ...

In my view, this class covers all your needs. It has:
1. a name
2. a reference to the parent category
3. other optional properties

So for the bottom-level entities (those that are a sub-category of some
category), the parent reference points to that category, and so on up
the tree. A parent does not need to know its children, but a child must
know its parent. I think this model is best suited for you.

For searching, do whatever works for you: you now have just one model
and one searchable attribute (name). Use it judiciously.
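To make the parent-pointer idea concrete, here is a plain-Python analogue (no App Engine APIs; the class and function names are invented for illustration), showing how a node that only knows its parent can still reconstruct its full category path:

```python
class Category:
    """In-memory analogue of the self-reference model:
    each node stores only its name and a pointer to its parent."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent

def path(node):
    """Walk parent pointers up to the root; return names root-first."""
    names = []
    while node is not None:
        names.append(node.name)
        node = node.parent
    return list(reversed(names))

food = Category("food")
fruit = Category("fruit", food)
red = Category("red", fruit)
apples = Category("apples", red)
print(path(apples))  # ['food', 'fruit', 'red', 'apples']
```

On App Engine the same walk would be a chain of reference-property dereferences (one datastore get per level), so for depth-10 hierarchies it may be worth caching or denormalizing the path.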

There is no SMS API.

Cheers,
--
Pranav Prakash

http://codecontrol.bogspot.com

On May 1, 10:08 am, 风笑雪 kea...@gmail.com wrote:
 Why don't you let the category to be a property of your model?
 class Food(db.Model):
   category1 = ...
   category2 = ...
   ...

 Food(category1=fruit, category2=red, ...)

  2009/5/1 Max Ross maxr+appeng...@google.com

  Hi Vijay,

  Your questions are going to take some time to answer.  Some are easier than
  others.  I'd recommend splitting them up into separate posts.

  Max

  On Thu, Apr 30, 2009 at 9:27 PM, vijay mymail.vi...@gmail.com wrote:

  Ping!!

  On Thu, Apr 30, 2009 at 5:03 PM, vijay mymail.vi...@gmail.com wrote:

  Hello All,I am working on an application and got stuck in design phase I
  hope you guyz can help me out. I have several doubts related to 
  performance
  and modelling.

  1#
  In my application i store some hierarchical data and not sure how to do
  it.

  for e.g. say you have data organized as

   food
   |-- fruit
   |   |-- red
   |   |   |-- Iron ------- apples, something
   |   |   |-- Vitamins --- litchies
   |   |-- green
   |       |-- Iron ------- apples
   |       |-- Vitamin E -- guava
   |-- vegetable -- category1 -- category2 -- category3 ...
   |-- fried ------ (similar sub-tree)

  Is there a clean way to store this kind of data in appengine. I mean what
  should be classes , their properties etc.

  In my case the sub categories can upto a depth of 10, with each level
  having 100s of categories.
  I will be executing queries to lookup by any node, so basically i can do
  lookup by fruit or apple or food as a whole.

  2#
  My applcation is going to have a search box where user will be writing
  his search items, I would like to suggest them the correct word if they 
  do a
  spelling mistake, so if they type frut i wil suggest them to type fruit.
  The way i am thinking of implementing it is to for each word entered do a
  comparison with each node element with a relaxation of 2, so if the word
  matched except at two points i will go ahead and suggest it. I think as i
  need to have sequential traversal of all the nodes, which may take a lot 
  of
  time and resources, I should store it somewhere to do faster lookup like 
  in
  memcache, what are your suggestions.

  3#
  Are there any standard APIs to send sms from application. I asked about
  this in other thread but haven't got any reply, I hope I will get some
  answers this time.

  I am quiet new to web site designing and any other suggestion or some
  pointers to general design principles would be great.

  Regards,
  Vijay