The "s~" also plays havoc when trying to add a HR app to a Google Apps
domain.
Try leaving out the "s~" and see if that works. It fixes the Google Apps
problem.
Nick
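For anyone hitting the "app s~app-id cannot access app app-id's data" mismatch elsewhere in this thread: a minimal sketch of normalizing the id before comparing (the helper name is my own, not part of the SDK):

```python
# Minimal sketch: High Replication app ids carry an internal "s~" prefix.
# Stripping it before comparing ids (or before passing --application to
# appcfg.py) avoids the mismatch. The helper name is hypothetical.
def normalize_app_id(app_id):
    """Return the app id without any partition prefix like 's~'."""
    return app_id.split("~", 1)[-1]

print(normalize_app_id("s~app-id"))  # -> app-id
```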
On 2 February 2011 18:30, Calvin wrote:
> I think you need --application=app-id in your command. Also you should
> verify that
I think you need --application=app-id in your command. Also you should
verify that /remote_api is responding on your server.
--
You received this message because you are subscribed to the Google Groups
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com.
I tried that but it said my new datastore could not be found, and
somehow used my old one anyhow.
On Feb 2, 11:58 am, Tim Hoffman wrote:
> Wouldn't have a clue where yours ended up, but
> you can always define a specific datastore path --datastore_path=PATH ,
> then you will always know where it
Thank you!
On Feb 2, 1:30 pm, "Nick Johnson (Google)"
wrote:
> Hi Albert,
>
> On Wed, Feb 2, 2011 at 12:44 PM, Albert wrote:
> > Hi!
>
> > I just migrated to the High Replication datastore.
>
> > Some of my entities contain stringified keys of other entities.
>
> > For example:
>
I have a similar problem, but a slightly different error message.
[ERROR 2011-02-02 14:40:50,286 adaptive_thread_pool.py] Error in Thread-1:
[DEBUG 2011-02-02 14:40:50,301 adaptive_thread_pool.py] Traceback (most recent call last):
File "C:\Documents and Settings\Administrator\My Documents
\poi
Hi Albert,
On Wed, Feb 2, 2011 at 12:44 PM, Albert wrote:
> Hi!
>
> I just migrated to the High Replication datastore.
>
> Some of my entities contain stringified keys of other entities.
>
> For example:
>
> class Entity(db.Model):
>     owner = db.StringProperty()  # the str(Key) of the owner.
>
>
> Owner
Hi!
I'm performing this command. (I replaced my real app id with "app-id")
appcfg.py download_data --filename=download.dat --url=http://app-id.appspot.com/remote_api
I'm getting this error.
google.appengine.api.datastore_errors.BadRequestError: app s~app-id
cannot access app app-id's data
Wh
Is it possible to sign a .jar file within GAEJ? If not, such a feature
would be very useful and welcome.
Thanks.
Hi Greg,
When I set a rate of, say, 30/M, I basically always see all of my
tasks run in the first second or two, then the queue sits empty for a
minute, then 30 run, etc. So are you saying that if I adjusted the
bucket size down to, say, 1, they would at least trickle out over
30 seconds?
Rob
Wouldn't have a clue where yours ended up, but
you can always define a specific datastore path --datastore_path=PATH ,
then you will always know where it is.
I believe you can specify it in the launcher somewhere if you use that.
Rgds
T
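If no --datastore_path is given, the SDK's default lands in the system temp directory (as the Windows path reported later in this thread suggests). A hedged sketch of computing that default, assuming the SDK's usual file name:

```python
# Sketch (assumption: the dev server defaults to a file named
# "dev_appserver.datastore" in the platform temp directory, matching the
# c:\users\...\temp\dev_appserver.datastore path reported in this thread).
import os
import tempfile

def default_datastore_path():
    return os.path.join(tempfile.gettempdir(), "dev_appserver.datastore")
```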
Hi!
I just migrated to the High Replication datastore.
Some of my entities contain stringified keys of other entities.
For example:
class Entity(db.Model):
    owner = db.StringProperty()  # the str(Key) of the owner.

class Owner(db.Model):
    name = db.StringProperty()
Now that I've migrated to the high-replic
Hey Guys,
This morning I'm trying to clear my local datastore and receiving the
following error.
Could not read datastore data from c:\users\jay\appdata\local\temp\dev_appserver.datastore
There is indeed no datastore at that location. Can anybody let me know
where the datastore might be so I ca
Hi
You may need it.
I have my own handler for deferreds, as I need sys.path manipulated and a
number of imports performed for deferred. If you need this sort of thing,
then you will.
Rgds
T
Thanks that gets me in.
I have built a set of websites on GAE. ePaymentExpress.net is the
"home" site, with links to eTicketExpress.com , eRegisterExpress.com
and eFundraiserExpress.com . They all use Python/Django and Google
Checkout for online payments. We are currently running two school
plays with assigned seating
Yes, we are looking at different ways of providing even more powerful
migration tools. How much data are you looking to migrate?
--
Ikai Lan
Developer Programs Engineer, Google App Engine
Blogger: http://googleappengine.blogspot.com
Reddit: http://www.reddit.com/r/appengine
Twitter: http://twitter
Thank you, Greg! This is excellent news.
On Feb 1, 4:29 pm, Greg Darke wrote:
> Hi HalcyonDays,
>
> In this case the system will add another token every 1/50th of a second.
>
> Note that the taskqueue system may batch up requests so that even
> though you will receive a new token every 1/50th of
Hi Sandeep,
I would have thought that by using Google Accounts to authenticate users it
would eliminate the need for having a login page altogether. If the user
isn't logged in then just let the User api forward them to Google's login
page and redirect them back to your site once they've logged in
@Doug - thanks for the positive feedback
@Kayode - sounds like an ultimatum - I either make my team available
for you or you are going to imitate it? Sounds like a nice approach to
doing business :) If you are really interested in the project, email
me privately.
I am in favor of google auth for all the reasons you mentioned. It
makes things easier, more reliable and cheaper for me. My only concern
is that most of my new users feel skeptical at first glance when
providing their google credentials during the sign up process. During
my demos 90% of my users h
Hi HalcyonDays,
In this case the system will add another token every 1/50th of a second.
Note that the taskqueue system may batch up requests so that even
though you will receive a new token every 1/50th of a second it may
burst 2-3 (well, actually up to 5) tasks at a time.
On 2 February 2011 07
Hi Prashanth,
Add an additional property you can do an equality filter on, such as
'has_at_least_10_items'. If 'x' is a known amount, that will solve
your issue by itself. If 'x' is a variable amount, you might want to
use several 'bins' so that you can easily eliminate chunks of data in
the query
I have configured:
@Column(unique = true, nullable = false, name = "email")
private String email;
But this error occurred:
java.lang.UnsupportedOperationException: No support for uniqueness
constraints
What do I need to do in App Engine to make this field unique? I need a way that I
can u
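A common workaround, since the datastore enforces uniqueness only on keys, is to store the email as the entity's key name (get_or_insert inside a transaction). A toy sketch with a dict standing in for the datastore (the helper is illustrative, not a JPA feature):

```python
# Sketch: key names are the one place App Engine guarantees uniqueness,
# so use the email itself as the key. A plain dict stands in for the
# datastore here; on GAE this would be Model.get_or_insert(key_name=email)
# inside a transaction.
_store = {}

def create_user(email, data):
    if email in _store:
        raise ValueError("email already taken: %s" % email)
    _store[email] = data
    return data
```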
Oh and you can leave out zero. I assume you don't ship zero items, so
everything has to be greater than zero.
On Tue, Feb 1, 2011 at 1:32 PM, Stephen Johnson wrote:
> This won't entirely get rid of your need to loop, but you can add an
> indexed list property on your entity (let's call it shipped
Thanks for replying...
So in my example of rate=50/s and bucket-size=5:
When no tasks are in the queue, clearly the bucket fills up and has 5
tokens.
If I add 100 tasks, the first 5 will execute immediately and take all
5 tokens... but now the bucket is empty.
I have specified a rate of 50 tokens
This won't entirely get rid of your need to loop, but you can add an indexed
list property on your entity (let's call it shipped_increments) and put
certain markers in it to indicate that the entity is greater than a certain
amount of items shipped in certain increments (the increments would depend
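A sketch of how such shipped_increments markers might be computed (the bin boundaries below are made up for illustration):

```python
# Assumed bin boundaries; pick them to match your data's distribution.
BINS = [1, 10, 100, 1000, 10000]

def shipped_increments(num_shipped):
    # Store every boundary the entity has reached. A query can then use an
    # equality filter (shipped_increments == 100 means "at least 100"),
    # leaving the single allowed inequality free for the date property.
    return [b for b in BINS if num_shipped >= b]

print(shipped_increments(150))  # -> [1, 10, 100]
```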
Actually why would that matter if I have already authenticated them? What
difference does it make if they then log out from their Google Account?
On Tue, Feb 1, 2011 at 2:38 PM, nacho wrote:
> I use the UsersService and GWT (RPC) and it works great for me.
>
> The only thing that you have to tak
I use the UsersService and GWT (RPC) and it works great for me.
The only thing you have to take care of is that if the user logs out from
another Google application in the same browser (for example Gmail), they will
be logged out from your application too; you need to catch this situation.
Tasks are now allowed up to 10 minutes to complete their work. You still
have to deal with the 30 seconds max per datastore api call but that
shouldn't be a problem.
Use tasks to handle the heavy lifting and like Wim suggested use a cursor to
maintain your position in the result set between task i
Hi,
The rate is like an average upper bound; in other words, the
token-adding resolution is the unit of the rate you set. And, it is
'bursty,' so if you set a rate of 50/M you might very well have 50
tasks execute in the first second then the queue will not execute any
tasks for about 59 second
Thanks Jeff and Wim for the responses.
I missed out giving the number of records in the datastore.
The datastore has ~1 million records and ever increasing.
The solution which we have now is, get all the records greater than
the input date [most restrictive] in a query and then loop thru the
recor
If you intend to make multiple calls passing the cursor back and forth
between client and server then that would work but if you intend to do it
all in one request then I think the poster of the original question would
need to define how much data they were talking about which brings me back to
wha
No, no: first the query on item_names and more than x_items,
then a loop over the result of this query (for items in resultquery:);
if some_input_date > date, append to a list;
when ready, show the list.
You can play with fetch(..) to get optimum speed and CPU usage.
Of course it's not a top solution, but it w
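The loop described above can be sketched like this, with a list slice standing in for the datastore query and cursor (an illustration only; on App Engine the shipped-count filter would be done in the query and the paging via a real cursor):

```python
def filter_shipments(rows, x_items, some_input_date, page_size=100):
    # rows: (item_name, number_of_items_shipped, manufacturing_date) tuples.
    # The shipped-count filter plays the role of the query's one allowed
    # inequality; the date check happens in Python, inside the loop.
    result, cursor = [], 0
    while True:
        page = rows[cursor:cursor + page_size]  # stand-in for fetch(...)
        if not page:
            return result
        cursor += len(page)
        for name, shipped, date in page:
            if shipped > x_items and date > some_input_date:
                result.append(name)
```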
If the amount of data is merely trivial then this would work but anything
more than merely trivial and the solution would have to include the use of
tasks.
On Tue, Feb 1, 2011 at 12:18 PM, Wim den Ouden wrote:
> with the help of the cursor (all items, while loop) select on more
> than x_items, w
This would require 2 inequality filters so no luck.
On Tue, Feb 1, 2011 at 7:17 AM, Prashanth wrote:
> Hi,
>
> Am trying to query my Google App Engine datastore [Python], which has
> a item_name, manufacturing_date and number_of_items_shipped.
> The scenario:
> Get all the item_names which has b
With the help of the cursor (all items, while loop), select on more
than x_items; within the loop check if after some_input_date, and append,
for example, to a Python list or JavaScript array; when the loop is done,
show the list.
gr
wim
2011/2/1 Prashanth :
> Hi,
>
> Am trying to query my Google App Engin
Hi,
I am trying to query my Google App Engine datastore [Python], which has
an item_name, manufacturing_date and number_of_items_shipped.
The scenario:
Get all the item_names which have been shipped more than x_items [user
input] and manufactured after some_input_date [user input].
Basically, kind of
Hi,
I have 2-3 properties(Unowned Relationship keys) in my class which i will be
using in where clause to filter records.
class PropertyAd {
    Key locationKey; // This will point to which location this ad has been
created
    Key categoryKey; // This will point to which category this ad has been
c
hi,
do i still need these entries in my app.yaml?
- url: /remote_api
script: $PYTHON_LIB/google/appengine/ext/remote_api/handler.py
login: admin
- url: /_ah/queue/deferred
script: $PYTHON_LIB/google/appengine/ext/deferred/handler.py
login: admin
I'm using remote_api: on in the built
I have billing enabled on my app and I'm trying to get the Cache-
control headers to edge cache a page. I'm setting the header to Cache-
Control: public; max-age=300; and I see it coming back. I'm using
wget to test it out. I've also included a "now" date in the generated
document so I can see i
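One thing worth double-checking in the header itself: Cache-Control directives are comma-separated per the HTTP spec, and semicolons can keep intermediaries from honoring them. A small helper (the function name is my own):

```python
def cache_control_header(max_age=300):
    # Comma-separated, per HTTP/1.1; "public; max-age=300;" with
    # semicolons may be ignored by caches.
    return "public, max-age=%d" % max_age

# e.g. self.response.headers['Cache-Control'] = cache_control_header(300)
```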
python + jquery
2011/2/1 Jeff Schwartz :
> Are you also using GWT by chance?
>
> On Tue, Feb 1, 2011 at 9:15 AM, Wim den Ouden wrote:
>>
>> google accounts and the user api just works for me, always there.
>> wim
>>
>> 2011/2/1 Jeff Schwartz :
>> > Hi all,
>> >
>> > I hope you don't mind me cros
Are you also using GWT by chance?
On Tue, Feb 1, 2011 at 9:15 AM, Wim den Ouden wrote:
> google accounts and the user api just works for me, always there.
> wim
>
> 2011/2/1 Jeff Schwartz :
> > Hi all,
> >
> > I hope you don't mind me cross posting this to both the gwt and app
> engine
> > grou
Thanks Colin, you are right, no one at this time can run these
libraries using Google App Engine.
On 11 ene, 21:39, Colin Hawkett wrote:
> Perhaps this post will help
> -http://drools-java-rules-engine.46999.n3.nabble.com/drools-and-google...,
> I seem to remember something about those libraries
google accounts and the user api just works for me, always there.
wim
2011/2/1 Jeff Schwartz :
> Hi all,
>
> I hope you don't mind me cross posting this to both the gwt and app engine
> groups since I'd really like to get the opinions of users on both platforms.
>
> I'm in the middle of developin
Hi all,
I hope you don't mind me cross posting this to both the gwt and app engine
groups since I'd really like to get the opinions of users on both platforms.
I'm in the middle of developing a gwt application on app engine. The
application's security requirements are that non members, meaning th
Your model, 'Campaigns' has multiple reference properties that can
potentially reference User (either explicitly, or via a non-specific
reference property). Add a collection_name to such reference
properties (only one can remain without a collection_name).
On Feb 1, 5:00 am, Tim Hoffman wrot
Hi Ikai,
Considering your recommendation is there any progress in providing an easier
migration path for existing applications?
On Mon, Jan 31, 2011 at 5:51 PM, Ikai Lan (Google)
> wrote:
> I'm going to let someone else answer this question, but we are recommending
> that all new applications u
1. Would the recommendations stand for a write-intensive
application? An example is a SalesForce.com-type
system, where there is a balance between entering data and reading it.
2. Can we hope that in the future, tools will be available to help
with the migration? If, for example
On Mon, Jan 31, 2011 at 5:10 PM, barryhunter wrote:
> Why another thread?
>
> What was wrong with the last one?
>
> On Jan 31, 9:24 pm, Lee wrote:
> > What is the equivalent of using x-sendfile on GAE? How do I serve files
> from
> > the web server and not my application container?
>
Hi
Re-putting won't help. The problem won't be in your data.
You probably have more than one ReferenceProperty, in different models, with
the property called campaigns, pointing to the same Kind.
You will need to go over your models with a fine-toothed comb.
Rgds
T
Hi, I'm trying to run a mapreduce job and I'm getting the error:
DuplicatePropertyError: Class Users already has campaigns_set.
I've tried changing the collection name to something unique and
re-putting all my entities, but I'm still getting the error. The only
solution I've found is to comment out t
Great site. I think I'll work on an imitation :)
On Mon, Jan 31, 2011 at 6:03 PM, Ethan wrote:
> How long does it take to create the website? Just curious.
>
> --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To post to this group, sen
Hello, Michael, is your team available to take on a GWT project that makes
use of
virtually all Google tools?
Kindly let me know.
Thanks
On Tue, Feb 1, 2011 at 2:32 AM, Doug wrote:
> Nice Site! Smooth and well done.
>
> On Jan 30, 3:09 pm, Michael Weinberg wrote:
> > Wanted to share our proje
Hello Philippe,
I am running under Ubuntu. GAE supports Python 2.5, not 2.7. I don't
know if this is the best or easiest way, but it's the way I do it.
To upload/download data I followed:
http://code.google.com/appengine/docs/python/tools/uploadingdata.html
To configure the bulkloader.yaml I run:
$
Hello,
I'm transitioning an existing PHP + MySql web application over to
python GAE. It's essentially an inventory management application for a
plant nursery. The relevant mysql tables are basically:
Plant:
  plant_id   int(8) PRIMARY KEY autoincrement unsigned
  sci_name
I'm trying to deploy my app but am getting an ssl based error
Creating staging directory
Scanning for jsp files.
Scanning files on local disk.
Initiating update.
javax.net.ssl.SSLKeyException: RSA premaster secret error
Unable to update:
javax.net.ssl.SSLKeyException: RSA premaster secret error
So I've read through a couple of things about the Task Queue's Token-
Bucket system, and I've taken a look at the wikipedia article to no
avail, so forgive me if this question has been answered elsewhere:
What is the resolution of the clock that deposits tokens into the
bucket for each queue? Is