[appengine-java] Using the DataNucleus parent-pk extension?

2009-12-20 Thread David Fuelling
Kind of an involved question, but I'm wondering if it's possible to
use the parent-key pattern with JPA to search for N child entities
of a given parent that have a particular property.  I can get this to
work if I include the parent object in my child entity, but I'd like
it to work when I just hold the parent's Key (as opposed to having to
hold a reference to the entire parent).

Some sample code below illustrates things.  I have two entities,
called Parent and Child.  Notice that my Parent class has a String-
based entity id, while the Child class has a GAE Key entity id.

The first example shows the Child class holding a full Parent
reference.  In the setup below, I can execute a query that says "select
key from Child where parent = :parent and tag = :tag", passing a
full Parent object in as the query parameter, like so:

//
// QUERY THAT WORKS
//
Parent parent = [get Parent object with id "john" from the datastore];
// Assume that a Child object exists in the datastore whose parent is "john"
// and whose tag is "the-tag".
EntityManager em = EMF.get().createEntityManager();
Query jpaQuery = em.createQuery(
    "select key from Child where parent = :parent and tag = :tag");
jpaQuery.setParameter("tag", "the-tag");
// Use the full parent object here...
jpaQuery.setParameter("parent", parent);
Key childKey = (Key) jpaQuery.getSingleResult();

///
// Parent Class
///
class Parent
{
    private String id;
    private List<Child> children;

    @Id
    public String getId()
    { return id; }

    @OneToMany(fetch = FetchType.EAGER, cascade = CascadeType.ALL,
               mappedBy = "parent")
    public List<Child> getChildren()
    { return this.children; }
}

///
// Child Class
///
class Child
{
    private Key key;
    private List<String> tags;
    private Parent parent;

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    public Key getKey()
    { return this.key; }

    @ManyToOne(fetch = FetchType.EAGER, cascade = CascadeType.ALL)
    public Parent getParent()
    { return this.parent; }

    ...snip
}


If I try to use the parent-key pattern, things start to break down a
bit.  First, notice that the Child class no longer has an entire
Parent instance variable.  Instead, there is a Key field with a
DataNucleus extension annotation.  In the Parent class, the
mappedBy parameter has been removed, because leaving it in causes an
exception saying that mappedBy fields must be complete entities.


///
// Tweaked Parent Class
///
class Parent
{
    private String id;
    private List<Child> children;

    @Id
    public String getId()
    { return id; }

    @OneToMany(fetch = FetchType.EAGER, cascade = CascadeType.ALL)
    public List<Child> getChildren()
    { return this.children; }
}

///
// Tweaked Child Class
///
class Child
{
    private Key key;
    private List<String> tags;
    private Key parentKey;

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    public Key getKey()
    { return this.key; }

    @ManyToOne(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
    @Extension(vendorName = "datanucleus", key = "gae.parent-pk", value = "true")
    public Key getParentKey()
    { return this.parentKey; }

    ...snip
}

Using the above pattern, if I create an instance of Parent in the
datastore, the corresponding Child entity is not created in the
datastore, even though I've created it in Java.  Executing the query from above
produces: javax.persistence.NoResultException: No results for query:
SELECT key FROM...
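
For reference, the query I'm attempting against the tweaked classes looks
roughly like this (just a sketch: it assumes the property is exposed as
"parentKey" in the query, and that the parent's Key can be rebuilt from its
String id with KeyFactory; adjust the names to whatever your mapping uses):

// Rebuild the parent's Key from its String id ("Parent" is the kind name here)
Key parentKey = KeyFactory.createKey("Parent", "john");

EntityManager em = EMF.get().createEntityManager();
Query jpaQuery = em.createQuery(
    "select key from Child where parentKey = :parentKey and tag = :tag");
jpaQuery.setParameter("tag", "the-tag");
// Pass the parent's Key instead of the full Parent object
jpaQuery.setParameter("parentKey", parentKey);
Key childKey = (Key) jpaQuery.getSingleResult();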

Am I doing something wrong here?





[appengine-java] StackOverFlowError on JPA Remove() with @OneToMany and abstract base classes.

2009-12-20 Thread David Fuelling
I'm pretty sure my issue is a bug, but upon further reflection I
figured I'd bring it to light here in the discussion groups just to be
sure.

Here's the issue, with test code: 
http://code.google.com/p/googleappengine/issues/detail?id=2541

Basically, I'm trying to model a User entity that has more than 5,000
tags (Strings).  Since the GAE datastore can't store more
than 5,000 strings in a single List property, I'm following the design
pattern outlined in this Google Tech Talk (basically, have the
User entity hold a List of child entities, each of which holds a
List<String>; the 5,001st string goes into the 2nd
TagReferenceEntity under the User):
http://sites.google.com/site/io/building-scalable-web-applications-with-google-app-engine

I'm using JPA to implement this setup, and everything works except for
delete operations.  The google-code issue contains a stack-trace and
an attached .zip file with code to reproduce the problem (main error
is java.lang.StackOverflowError).
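
For concreteness, the entity layout is roughly the following (a sketch only;
the class and field names here are illustrative and the abstract base classes
from the issue are omitted, the exact code is in the attached .zip):

import java.util.List;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.OneToMany;
import com.google.appengine.api.datastore.Key;

@Entity
class TagReferenceEntity
{
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Key key;

    // Each child entity holds at most 5,000 strings.
    private List<String> tags;
}

@Entity
class User
{
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Key key;

    // Owned one-to-many: the user's tags are spread across child entities,
    // so no single entity ever exceeds the 5,000-string limit.
    @OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private List<TagReferenceEntity> tagReferences;
}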

Thanks!

david





[appengine-java] Re: Generating QR Codes with ZXing

2009-12-20 Thread mably
More generally, has anybody heard of an open source barcode generator
Java library that runs on GAE?

We need to generate QR Codes or Data Matrix barcodes from our
application running on GAE.

We are using the Google Charts API for generating QR Codes for the
moment.

Thanx for your help.

On Dec 19, 18:50, mably fm2...@mably.com wrote:
 Has anyone successfully used the Google ZXing library on GAE for
 generating QR Codes?

 Thanx for your help.

 François





[appengine-java] JavaMail: AccessControlException warning

2009-12-20 Thread Pion
I am following the http://code.google.com/appengine/docs/java/mail/overview.html
instructions.

My code (snippets):

import java.io.PrintWriter;
import java.io.UnsupportedEncodingException;
import java.util.Properties;
import java.util.logging.Logger;

import javax.mail.Message;
import javax.mail.MessagingException;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public void execute(HttpServletRequest request, HttpServletResponse response,
        PrintWriter out) {
    String userName = System.getProperty("com.galensystems.calendarEmail");
    String body = "This is a test";

    Properties props = System.getProperties();

    // Get a Session object
    Session session = Session.getInstance(props, null);

    Message message = new MimeMessage(session);
    try {
        message.setFrom(new InternetAddress(userName, "Admin Email"));
        message.addRecipient(Message.RecipientType.TO,
            new InternetAddress("some-receipe...@gmail.com", "Mr. John Smith"));
        message.setSubject("Your Example.com account has been activated");
        message.setText(body);
        Transport.send(message);
    } catch (UnsupportedEncodingException e) {
        e.printStackTrace();
    } catch (MessagingException e) {
        e.printStackTrace();
    }
}

After deploying and running it on GAE, I got the log info/warning
exceptions below.  I did not add Sun's JavaMail JARs to my app.
It did send the email successfully, and it showed up in my inbox as
expected.

My environments: GAE 1.3.0, Eclipse-Galileo.

Do I need to worry about the exceptions below?
Thanks in advance for your help.

12-20 07:05AM 49.119
com.google.appengine.repackaged.com.google.common.base.FinalizableReferenceQueue <init>: Failed to start reference finalizer thread. Reference cleanup will only occur when new references are created.
java.lang.reflect.InvocationTargetException
at com.google.appengine.runtime.Request.process-8ef88bb7edeb9b03(Request.java)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Method.java:42)
at com.google.appengine.repackaged.com.google.common.base.FinalizableReferenceQueue.<init>(FinalizableReferenceQueue.java:124)
at com.google.appengine.repackaged.com.google.common.labs.misc.InterningPools$WeakInterningPool.<clinit>(InterningPools.java:104)
at com.google.appengine.repackaged.com.google.common.labs.misc.InterningPools.newWeakInterningPool(InterningPools.java:48)
at com.google.appengine.repackaged.com.google.io.protocol.ProtocolSupport.<clinit>(ProtocolSupport.java:55)
at com.google.appengine.api.mail.MailServicePb$MailMessage.<init>(MailServicePb.java:643)
at com.google.appengine.api.mail.MailServicePb$MailMessage$1.<init>(MailServicePb.java:1600)
at com.google.appengine.api.mail.MailServicePb$MailMessage.<clinit>(MailServicePb.java:1600)
at com.google.appengine.api.mail.MailServiceImpl.doSend(MailServiceImpl.java:49)
at com.google.appengine.api.mail.MailServiceImpl.send(MailServiceImpl.java:32)
at com.google.appengine.api.mail.stdimpl.GMTransport.sendMessage(GMTransport.java:247)
at javax.mail.Transport.send(Transport.java:95)
at javax.mail.Transport.send(Transport.java:48)
at com.galensystems.pchr.server.DoEmail.execute(DoEmail.java:57)
at com.galensystems.pchr.server.DoServer.main(DoServer.java:72)
at com.galensystems.pchr.server.DoServer.doPost(DoServer.java:43)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:713)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:806)
at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:487)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1093)
at com.google.apphosting.utils.servlet.ParseBlobUploadFilter.doFilter(ParseBlobUploadFilter.java:97)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at com.google.apphosting.runtime.jetty.SaveSessionFilter.doFilter(SaveSessionFilter.java:35)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at com.google.apphosting.utils.servlet.TransactionCleanupFilter.doFilter(TransactionCleanupFilter.java:43)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at ...

[appengine-java] Accessing GMail Inbox

2009-12-20 Thread Pion
The http://java.sun.com/products/javamail/FAQ.html#gmail shows how to
access Gmail Inbox using JavaMail.
//String host = "imap.gmail.com";
String host = "pop.gmail.com";
String username = "user";
String password = "passwd";
// ...
//Store store = session.getStore("imaps");
Store store = session.getStore("pop3s");
store.connect(host, username, password);
// ...

I tried the above code and deployed it on GAE. I received the
following log:

Unable to locate provider for protocol: pop3s

I switched to Store store = session.getStore("imaps"); and it gave me a
similar error.

Unable to locate provider for protocol: imaps

I am using GAE 1.3.0.

Please advise if this is even possible. If yes, what did I do wrong?
Thanks in advance for your help.





[appengine-java] Re: Generating QR Codes with ZXing

2009-12-20 Thread mably
Hi Roberto, thanx for your answer.

I finally succeeded in generating QR Codes with the ZXing QRCode encoder
and a home-made PNGEncoder (derived from the ones found here:
http://catcode.com/pngencoder/).
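
The general idea is roughly the following (just a sketch, not my exact code;
it assumes a ZXing version whose QRCodeWriter.encode() returns a BitMatrix,
and it leaves the actual PNG writing to whatever AWT-free encoder you use):

import com.google.zxing.BarcodeFormat;
import com.google.zxing.WriterException;
import com.google.zxing.common.BitMatrix;
import com.google.zxing.qrcode.QRCodeWriter;

public class QrCodeSketch
{
    // Renders the QR code as a grid of 0 (black) / 255 (white) grayscale
    // pixels, which can then be handed off to an AWT-free PNG encoder.
    public static byte[][] encodeToPixels(String contents, int size)
            throws WriterException
    {
        BitMatrix matrix = new QRCodeWriter().encode(
            contents, BarcodeFormat.QR_CODE, size, size);
        byte[][] pixels = new byte[matrix.getHeight()][matrix.getWidth()];
        for (int y = 0; y < matrix.getHeight(); y++) {
            for (int x = 0; x < matrix.getWidth(); x++) {
                pixels[y][x] = (byte) (matrix.get(x, y) ? 0 : 255);
            }
        }
        return pixels;
    }
}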

But I'm still stuck with Data Matrix barcodes, because ZXing doesn't
generate them.

Any help will be greatly appreciated.

François

On Dec 20, 18:27, Roberto Saccon rsac...@gmail.com wrote:
 There is a sample app with ZXing which runs on GAE, but I don't know
 exactly what it does.  I think encoding QR Codes (generating an
 image) isn't that difficult; I recently did that (just for a test) in
 Javascript, with some open source code I found after 2 seconds of
 Googling.  If you want to decode images, then I think the ZXing code
 has some AWT dependencies, which aren't GAE whitelisted.
 --
 Roberto

 On Dec 20, 12:16 pm, mably fm2...@mably.com wrote:



  More generally, has anybody heard of an open source barcode generator
  java library running on GAE ?

  We need to generate QR Codes or Datamatrix bar codes from our
  application running on GAE.

  We are using the Google Charts API for generating QR Codes for the
  moment.

  Thanx for you help.

   On Dec 19, 18:50, mably fm2...@mably.com wrote:

   Has anyone successfully used Google ZXing library on GAE for
   generating QR Codes ?

   Thanx for your help.

   François





[appengine-java] How to create a real or mock Key object without initializing a whole app engine environment?

2009-12-20 Thread Peter Recore
Is there a way to create a mock Key object, or to create a Key object
without actually initializing an App Engine environment?

I want to unit test some code that depends on a JDO object. In order
to create this JDO object, I need to pass in a Key, which it normally
uses as a reference to another object.  I don't need the Key to
actually be functional for this test, I just need it to return a
constant value for the getId() method.  Key is a final class, so I
can't create a subclass of it that meets my needs.  I also can't seem
to get KeyFactory to create a key without first registering an app
engine API.
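
For context, the kind of setup I'm referring to looks roughly like this,
installing a bare-bones ApiProxy.Environment for the test thread before
calling KeyFactory (a sketch only; the exact set of methods on
ApiProxy.Environment may differ between SDK versions, and "test-app" is just
a made-up app id):

import java.util.HashMap;
import java.util.Map;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.KeyFactory;
import com.google.apphosting.api.ApiProxy;

public class FakeKeys
{
    // Creates a Key for tests by installing a minimal fake environment so
    // that KeyFactory can resolve an application id.
    // NOTE: newer SDKs may declare additional methods on Environment;
    // implement them as needed for your SDK version.
    public static Key fakeKey(String kind, long id)
    {
        ApiProxy.setEnvironmentForCurrentThread(new ApiProxy.Environment() {
            public String getAppId() { return "test-app"; }
            public String getVersionId() { return "1.0"; }
            public String getEmail() { return ""; }
            public boolean isLoggedIn() { return false; }
            public boolean isAdmin() { return false; }
            public String getAuthDomain() { return ""; }
            public String getRequestNamespace() { return ""; }
            public Map<String, Object> getAttributes() { return new HashMap<String, Object>(); }
        });
        return KeyFactory.createKey(kind, id);
    }
}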

Thanks,
-peter





[google-appengine] Task Queue problem: Permanent failure attempting to execute task

2009-12-20 Thread Alan Xing
Hi friends,

The following error keeps repeating every few minutes on our app. It seems
all these errors are from the version 1 code, while our app has been running
on version 3 for hours.

I wonder:

1. Are these errors really from version 1 code?
2. If so, how do I clean the old queued tasks for an old version before we
switch to a new version?

Thanks,
Alan


   1.  12-20 07:48AM 45.090 /_ah/queue/deferred 200 2287ms 680cpu_ms 0kb
   AppEngine-Google; (+http://code.google.com/appengine)
   See details: https://appengine.google.com/logs/log_detail?app_id=snsanalytics&version_id=3.338583298960072863&request_id=00047B2AE88F1B7A.36E91D40&layout=plain

   0.1.0.2 - - [20/Dec/2009:07:48:47 -0800] "POST /_ah/queue/deferred HTTP/1.1" 200 124
   "http://snsanalytics.appspot.com/api/posting/rule/feed/cron_execute"
   "AppEngine-Google; (+http://code.google.com/appengine)" "snsanalytics.appspot.com"

2.  I 12-20 07:48AM 46.791

   X-Appengine-Taskretrycount:0, X-Appengine-Queuename:default,
X-Appengine-Taskname:15186975265483460081

3.  E 12-20 07:48AM 47.370

   Permanent failure attempting to execute task
   Traceback (most recent call last):
  File "/base/python_lib/versions/1/google/appengine/ext/deferred/deferred.py", line 255, in post
    run(self.request.body)
  File "/base/python_lib/versions/1/google/appengine/ext/deferred/deferred.py", line 122, in run
    raise PermanentTaskFailure(e)
   PermanentTaskFailure: Environment variable DJANGO_SETTINGS_MODULE
is undefined.





Re: [google-appengine] GAE Python Chinese feed url fetch problem

2009-12-20 Thread Alan Xing
Ikai, you bet your money correctly; we confirmed your diagnosis.  The pain
point, though, is that quite a few popular feed service providers have this
kind of problem.  Sorry for replying back to you late.

On Sat, Nov 21, 2009 at 3:40 AM, Ikai L (Google) ika...@google.com wrote:

 It looks like there may be an issue with feedsky and responding to the
 Accept-Encoding: gzip header. I changed urlfetch to work like this:

 content = urlfetch.fetch(url, headers={ "Accept-Encoding": "identity" })

 It has not failed on me once. I also tried the following experiments. I'm
 just using straight curl here, no fancy programs. Note how I am cut off
 sometimes:

 compu...@computer:/tmp$ curl -i -H "Accept-Encoding: gzip" http://feed.feedsky.com/qiushi > blah
   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                  Dload  Upload   Total   Spent    Left  Speed
 100  180k  100  180k    0     0   100k      0  0:00:01  0:00:01 --:--:--  127k

 [five more gzip runs complete at 100% the same way, then:]

 compu...@computer:/tmp$ curl -i -H "Accept-Encoding: gzip" http://feed.feedsky.com/qiushi > blah
   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                  Dload  Upload   Total   Spent    Left  Speed
  21  180k   21 39886    0     0   6885      0  0:00:26  0:00:05  0:00:21      0
 curl: (18) transfer closed with 145287 bytes remaining to read

 [one more gzip run completes at 100%, then another is cut off:]

 compu...@computer:/tmp$ curl -i -H "Accept-Encoding: gzip" http://feed.feedsky.com/qiushi > blah
   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                  Dload  Upload   Total   Spent    Left  Speed
  21  180k   21 39886    0     0   6879      0  0:00:26  0:00:05  0:00:21      0
 curl: (18) transfer closed with 145287 bytes remaining to read

 Here's the same test passing a different Accept-Encoding header:

 compu...@computer:/tmp$ curl -i -H "Accept-Encoding: identity" http://feed.feedsky.com/qiushi > blah
   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                  Dload  Upload   Total   Spent    Left  Speed
 100  180k  100  180k    0     0   115k      0  0:00:01  0:00:01 --:--:--  154k

 [two more identity runs complete at 100%; a fourth run is cut off where the
 quoted message ends]

[google-appengine] Re: Is it possible to deploy Wave server on GAE ( will it be wihin free limits)

2009-12-20 Thread mdipierro
Sorry for my late reply. I cannot find it either but I remember seeing
it. You can ask the author of the site.
You can also look at some details instructions here:

http://www.web2pyslices.com/main/slices/take_slice/17

On Dec 15, 12:40 am, yadoo yado...@gmail.com wrote:
 Guys, where can I get the source code for the wave???

 On Dec 8, 11:04 pm, mdipierro mdipie...@cs.depaul.edu wrote:

  I am not sure if this is what you are asking but here is a google wave
  service running on google app engine:

 http://wavedirectory.appspot.com

  Here is some howto:

 http://wavedirectory.appspot.com/init/default/wave?w=googlewave.com%2...

  Massimo

  On Dec 6, 1:00 am, saurabh sagarwal1...@gmail.com wrote:

    Hi,

    I am trying to make an application based on wave server functionality
    (I don't want to use the Wave GUI), just the wave server functionality
    of realtime collaboration.

    But as of now the number of wave users is very small and I can't ask my
    users to join Google Wave, so I am thinking of deploying the wave
    server myself and using it as a backend. I want to know if I can deploy
    the wave server on GAE. Has anybody tried it?

    Thanks
    Saurabh







[google-appengine] Re: Noob: Python or Java???

2009-12-20 Thread Wiiboy
As far as I know, server-side code and client-side code are completely
separate (although I guess in some instances they may be similar).





[google-appengine] why make adding a domain so difficult?

2009-12-20 Thread Locke
If I own the appid "appid" and the domain "domain.com", and I want to
use www.domain.com with appid.appspot.com, then why the heck can't I
simply prove my ownership by setting the CNAME appid-
owner.www.domain.com and then pointing www.domain.com to ghs.google.com?

Why make me jump through all the hoops with Google Apps, a service I
have no intention of using?

KISS!


/me sees "Sorry, you've reached a login page for a domain that isn't
using Google Apps" again and bangs his head against the keyboard:
nmsxzdwccfdxcsfsddxcsfvxfvdxcv





[google-appengine] Re: Index building taking over 12h

2009-12-20 Thread Brade
GOOD LORD MY INDEX IS STILL BUILDING...

On Dec 18, 10:53 am, Brade bradez...@gmail.com wrote:
 This is one of those ugly truths about app engine, as glorious as it
 is otherwise.
 Perhaps there should be more emphasis on the necessity of deploying a
 new index definition before adding new features to one's app?
 My appwww.klection.comluckily is not mission critical but the
 aggregated comments section What You Say is about 12 hours along on
 its building status, as all I added was a __key__ desc index for
 comments (which there are less than 50 of right now). BTW my local dev
 environment failed to add this index automagically for some odd
 reason, so it took me a while even to troubleshoot the fact I needed
 to add it manually (because the functionality was working fine in my
 local environment).

 I will be starting fairly soon on my next GAE app, and these sorts of
 issues make me wary. As fantastic as it is in many respects, the
 indexing problems really affect the reliability of the system. It's
 nice to see the recent emphasis in the Articles section about the
 Datastore, but hopefully in the near future there will be better
 information reporting when deploying your app, because things that
 work great on your local dev environment should not completely break
 once deployed. That seems borderline unacceptable at present.

 --brad g.

 On Dec 3, 4:12 am, Nick Johnson (Google) nick.john...@google.com
 wrote:

  Hi,

  We're working on speeding up index building times. There's nothing you can
  do as a user to speed up index building.

  -Nick Johnson

  On Wed, Dec 2, 2009 at 6:35 PM, jpmorganuk
  market...@farrellheyworth.co.uk wrote:

   Hi Nick,

   What can be done to speed this up?

   Regards,
   jpmorganuk


  --
  Nick Johnson, Developer Programs Engineer, App Engine
  Google Ireland Ltd. :: Registered in Dublin, Ireland, Registration Number:
  368047





[google-appengine] xmpp.send_invite doesn't work for non-google JID's

2009-12-20 Thread mrk
xmpp.send_invite does not send an invite when the from_jid parameter is set
to *...@app-id.appspotchat.com and the jid is a non-Google (non-GTalk) JID.
My test app:
http://xmpp-test.appspot.com/?from_jid=t...@text-xmpp.appspotchat.com
Can somebody confirm this, please? Just enter your non-Google JID into
the jid input and press the invite button.
Thanks
--
mrk





[google-appengine] Re: xmpp.send_invite doesn't work for non-google JID's

2009-12-20 Thread mrk
I made a mistake; the valid test app URL is:
http://xmpp-test.appspot.com/?from_jid=t...@xmpp-test.appspotchat.com

--
mrk





[google-appengine] Re: how to delete a table/entity?

2009-12-20 Thread Andy Freeman
You misunderstand.

If you have an ordering based on one or more indexed properties, you
can page efficiently wrt that ordering, regardless of the number of
data items.  (For the purposes of this discussion, __key__ is an
indexed property, but you don't have to use it or can use it just to
break ties.)

If you're fetching a large number of items and sorting so you can find
a contiguous subset, you're doing it wrong.

On Dec 19, 10:26 pm, ajaxer calid...@gmail.com wrote:
 obviously, if you have to page a data set more than 5 items which
 is not ordered by __key__,

 you may find that the __key__  is of no use, because the filtered data
 is ordered not by key.
 but by the fields value, and for that reason you need to loop query as
 you may like to do.

 but you will encounter a timeout exception before you really finished
 the action.

 On Dec 19, 8:26 am, Andy Freeman ana...@earthlink.net wrote:



if the type of data is larger than 1 items, you need reindexing
   for this result.
   and recount each time for getting the proper item.

  What kind of reindexing are you talking about.

  Global reindexing is only required when you change the indices in
  app.yaml.  It doesn't occur when you add more entities and or have big
  entities.

  Of course, when you change an entity, it gets reindexed, but that's a
  constant cost.

  Surely you're not planning to change all your entities fairly often,
  are you?  (You're going to have problems if you try to maintain
  sequence numbers and do insertions, but that doesn't scale anyway.)

it seems you have not encountered such a problem.
   on this situation, the indexes on the fields helps nothing for the
   bulk of  data you have to be sorted is really big.

  Actually I have.  I've even done difference and at-least-#
  (intersection and union are special cases - at-least-# also handles
  majority), at-most-# (binary xor is the only common case that I came
  up with), and combinations thereof on paged queries.

  Yes, I know that offset is limited to 1000 but that's irrelevant
  because the paging scheme under discussion doesn't use offset.  It
  keeps track of where it is using __key__ and indexed data values.

  On Dec 16, 7:56 pm, ajaxer calid...@gmail.com wrote:

   of course the time is related to the type data you are fetching by one
   query.

   if the type of data is larger than 1 items, you need reindexing
   for this result.
   and recount each time for getting the proper item.

   it seems you have not encountered such a problem.
   on this situation, the indexes on the fields helps nothing for the
   bulk of  data you have to be sorted is really big.

   On Dec 17, 12:20 am, Andy Freeman ana...@earthlink.net wrote:

 it still can result in timout if the data is really big

How so?  If you don't request too many items with a page query, it
won't time out.  You will run into runtime.DeadlineExceededErrors if
you try to use too many page queries for a given request, but 

 of no much use to most of us if we really have big data to sort and
 page.

You do know that the sorting for the page queries is done with the
indexing and not user code, right?  Query time is independent of the
total amount of data and depends only on the size of the result set.
(Indexing time is constant per inserted/updated entity.)

On Dec 16, 12:13 am, ajaxer calid...@gmail.com wrote:

 it is too complicated for most of us.
 and it still can result in timout if the data is really big

 of no much use to most of us if we really have big data to sort and
 page.

 On Dec 15, 11:35 pm, Stephen sdea...@gmail.com wrote:

  On Dec 15, 8:04 am, ajaxer calid...@gmail.com wrote:

   also 1000 index limit makes it not possible to fetcher older data 
   on
   paging.

   for if we need an indexed page more than 1 items,
   it would cost us a lot of cpu time to calculate the base for GQL
   to fetch the data with index less than 1000.

  http://code.google.com/appengine/articles/paging.html






[google-appengine] Re: how to delete a table/entity?

2009-12-20 Thread ajaxer
You misunderstand.
If not, show me a site with statistics on many fields,
with more than 1000 pages, please.
Thanks.

On Dec 21, 9:06 am, Andy Freeman ana...@earthlink.net wrote:
 You misunderstand.

 If you have an ordering based on one or more indexed properties, you
 can page efficiently wrt that ordering, regardless of the number of
 data items.  (For the purposes of this discussion, __key__ is an
 indexed property, but you don't have to use it or can use it just to
 break ties.)

 If you're fetching a large number of items and sorting so you can find
 a contiguous subset, you're doing it wrong.

 On Dec 19, 10:26 pm, ajaxer calid...@gmail.com wrote:



  obviously, if you have to page a data set more than 5 items which
  is not ordered by __key__,

  you may find that the __key__  is of no use, because the filtered data
  is ordered not by key.
  but by the fields value, and for that reason you need to loop query as
  you may like to do.

  but you will encounter a timeout exception before you really finished
  the action.

  On Dec 19, 8:26 am, Andy Freeman ana...@earthlink.net wrote:

 if the type of data is larger than 1 items, you need reindexing
for this result.
and recount each time for getting the proper item.

   What kind of reindexing are you talking about.

   Global reindexing is only required when you change the indices in
   app.yaml.  It doesn't occur when you add more entities and or have big
   entities.

   Of course, when you change an entity, it gets reindexed, but that's a
   constant cost.

   Surely you're not planning to change all your entities fairly often,
   are you?  (You're going to have problems if you try to maintain
   sequence numbers and do insertions, but that doesn't scale anyway.)

 it seems you have not encountered such a problem.
on this situation, the indexes on the fields helps nothing for the
bulk of  data you have to be sorted is really big.

   Actually I have.  I've even done difference and at-least-#
   (intersection and union are special cases - at-least-# also handles
   majority), at-most-# (binary xor is the only common case that I came
   up with), and combinations thereof on paged queries.

   Yes, I know that offset is limited to 1000 but that's irrelevant
   because the paging scheme under discussion doesn't use offset.  It
   keeps track of where it is using __key__ and indexed data values.

   On Dec 16, 7:56 pm, ajaxer calid...@gmail.com wrote:

of course the time is related to the type data you are fetching by one
query.

if the type of data is larger than 1 items, you need reindexing
for this result.
and recount each time for getting the proper item.

it seems you have not encountered such a problem.
on this situation, the indexes on the fields helps nothing for the
bulk of  data you have to be sorted is really big.

On Dec 17, 12:20 am, Andy Freeman ana...@earthlink.net wrote:

  it still can result in timout if the data is really big

 How so?  If you don't request too many items with a page query, it
 won't time out.  You will run into runtime.DeadlineExceededErrors if
 you try to use too many page queries for a given request, but 

  of no much use to most of us if we really have big data to sort and
  page.

 You do know that the sorting for the page queries is done with the
 indexing and not user code, right?  Query time is independent of the
 total amount of data and depends only on the size of the result set.
 (Indexing time is constant per inserted/updated entity.)

 On Dec 16, 12:13 am, ajaxer calid...@gmail.com wrote:

  it is too complicated for most of us.
  and it still can result in timout if the data is really big

  of no much use to most of us if we really have big data to sort and
  page.

  On Dec 15, 11:35 pm, Stephen sdea...@gmail.com wrote:

   On Dec 15, 8:04 am, ajaxer calid...@gmail.com wrote:

also 1000 index limit makes it not possible to fetcher older 
data on
paging.

for if we need an indexed page more than 1 items,
it would cost us a lot of cpu time to calculate the base for GQL
to fetch the data with index less than 1000.

   http://code.google.com/appengine/articles/paging.html






[google-appengine] Re: why make adding a domain so difficult?

2009-12-20 Thread Nickolas Daskalou
Yeah, I agree. The process of adding a domain to your app should be
simplified.


On Dec 21, 9:42 am, Locke locke2...@gmail.com wrote:
 If I own the appid "appid" and the domain "domain.com", and I want to
 use www.domain.com with appid.appspot.com, then why the heck can't I
 simply prove my ownership by setting the CNAME appid-
 owner.www.domain.com and then pointing www.domain.com to ghs.google.com?

 Why make me jump through all the hoops with Google Apps, a service I
 have no intention of using?

 KISS!

 /me sees Sorry, you've reached a login page for a domain that isn't
 using Google Apps again and bangs his head against the keyboard
 nmsxzdwccfdxcsfsddxcsfvxfvdxcv





[google-appengine] Images API questions

2009-12-20 Thread Nickolas Daskalou
In Python, if we're using the composite() function from the Images
API, is the size of each image layer allowed to be <= 1MB, or does
the combined size of all the image layers have to be <= 1MB (eg. 10 x
100KB layers is allowed, but 11 x 100KB is not)?

Can we use the Blobstore when using composite()?

Exactly when is the LargeImageError exception thrown? Eg. the answers
to 1. and 2. below are Yes, but what about 3. and 4.?
1. If > 1MB of data is passed as the image_data before any transforms/
composites are done?
2. If the final output data is > 1MB?
3. Can the exception be thrown during the processing of the transforms/
composites, even if the final output data is <= 1MB? If Yes, what are
the conditions that cause this?
4. If Yes to 3: When using the Blobstore for image transforms, is the
upper limit for a LargeImageError exception higher than if the image
data is passed via the Image constructor?
