[appengine-java] Re: Recurring tasks

2010-10-25 Thread Didier Durand
Hi,

I see different purposes for the 2:

- cron tasks are the tasks I always want to run for sure
- tasks via the API are tasks that I schedule programmatically when needed,
triggered by events that I can't predict in advance.

I personally see 2 other purposes for API tasks:
  a) when I run a task and it comes close to the 30s limit, I pause
it, serialize its context and schedule a new task with this context. When
that task starts, it is in fact a restart with another 30s credit for
running.
  b) I also use tasks in the context of transactions if I want to be sure
that the action of the task is done (i.e. datastore writes) even if the
transaction fails - for example, storing errors in the datastore for
later analysis of a failing transaction / program, etc.
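A minimal sketch of pattern (a), re-enqueueing the remaining work via the Task Queue API. The /donor-worker URL and the "context" parameter are made-up names for the sketch; in SDK 1.4+ the package is com.google.appengine.api.taskqueue, while 2010-era SDKs shipped it as com.google.appengine.api.labs.taskqueue with slightly different builder method names:

import com.google.appengine.api.taskqueue.Queue;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskOptions;

public class CheckpointExample {
    // Call this shortly before the 30s deadline; the serialized context lets
    // the next task resume where this one stopped.
    void checkpointAndContinue(String serializedContext) {
        Queue queue = QueueFactory.getDefaultQueue();
        queue.add(TaskOptions.Builder
            .withUrl("/donor-worker")
            .param("context", serializedContext));
    }
}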

regards
didier


On Oct 25, 6:23 am, Vik vik@gmail.com wrote:
 Hie Thanks for the response.

 I am confused a bit. If cron job does the scheduling then what Task Queue
 does?

 Thankx and Regards

 Vik
 Founderwww.sakshum.comwww.sakshum.blogspot.com

 On Mon, Oct 25, 2010 at 8:51 AM, Didier Durand durand.did...@gmail.comwrote:

  Hi Vik,

  Tasks scheduled via cron.xml are the way I would go:
 http://code.google.com/appengine/docs/java/config/cron.html

  I would schedule a task every minute, make a query on the deadline for
  donors and then do what has to be done.

  didier

  On Oct 24, 7:11 pm, Vik vik@gmail.com wrote:
   Hie Guys

   Our application manages a list of blood donors. Time to time these blood
   donors are unreachable so the administrators can mark such blood donors
  as
   inactive.

   However, these blood donors should be active again after 1 day
   automatically. How should we achieve this? In regular J2EE APIs we can
   write scheduler classes to do the same.
   What is the option in GAE? Are there any limitations?

   I went through a bit and feel like task queues are the way. But I am not
   sure? If yes then for above scenario how should it be done?

   Thankx and Regards

   Vik
   Founderwww.sakshum.comwww.sakshum.blogspot.com




[appengine-java] Feature Request: Restore prepared query from a cursor

2010-10-25 Thread Mouseclicker
Hi App Engine team,

The Cursor stores the full state of a prepared query, so that it can check
whether it matches the query it is applied to and raise an exception if it
does not. Now the usual use case is that, during a paging operation, the
cursor is transferred between the client (e.g. an HTML page, JSP, ...) and the
server. The server restores the cursor from its string encoding and
applies it to the PreparedQuery.

As the server is stateless, the server side needs to restore the
prepared query somehow, either by caching it somewhere or by
reconstructing it from the application context (I take the
reconstruction approach).

Now I wonder if you could avoid caching/reconstructing the query if
you have the cursor, because apparently the cursor contains the full
state of the query. Wouldn't it be a trivial change in the API of the
SDK to introduce a method on the Cursor class like:

PreparedQuery getQuery()

Then you could have code like this on the server:

Cursor decodedCursor = Cursor.fromWebSafeString(encodedCursor);
PreparedQuery preparedQuery = decodedCursor.getQuery();
List<Entity> nextBatch =
preparedQuery.asQueryResultList(withLimit(20).cursor(decodedCursor));
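For comparison, a minimal sketch of the reconstruction approach with the current low-level API. The "Donor" kind and the sort property are made up for the sketch, and in 2010-era SDKs the FetchOptions method is cursor(...), later renamed startCursor(...):

import static com.google.appengine.api.datastore.FetchOptions.Builder.withLimit;

import com.google.appengine.api.datastore.Cursor;
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.FetchOptions;
import com.google.appengine.api.datastore.PreparedQuery;
import com.google.appengine.api.datastore.Query;
import com.google.appengine.api.datastore.QueryResultList;

public class PagingExample {
    // Returns one page of 20 entities, starting at the given web-safe cursor
    // (null means the first page).
    QueryResultList<Entity> fetchPage(String encodedCursor) {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        // The query has to be reconstructed from application context on every request...
        PreparedQuery pq = ds.prepare(new Query("Donor").addSort("name"));
        FetchOptions options = withLimit(20);
        if (encodedCursor != null) {
            // ...and only then can the decoded cursor be applied to it.
            options = options.cursor(Cursor.fromWebSafeString(encodedCursor));
        }
        return pq.asQueryResultList(options);
    }
}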

Not sure if I missed something but this seems to be trivial and it
would give so much convenience...

Comments?




[appengine-java] Re: 1.3.8 Console Logging Issue

2010-10-25 Thread andrew

Similar here.

Now only logs WARNING and ERROR.

Logging properties set to INFO and prior to 1.3.8 upgrade INFO logged
fine.

Have configuration procedures been changed or something?

Windows 7, 32-bit
Eclipse 3.5 and 3.6




[appengine-java] Re: How can i allow user to download a zip file?

2010-10-25 Thread Vaclav Bartacek
Just call the method addStream( name_of_the_folder ):
   OutputStream os = zipStreamOutput.addStream( "zipfolder/subfolder" );

Vaclav
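For comparison, a sketch of the same idea with the standard java.util.zip API, which also runs on App Engine; a directory shows up in the archive as an entry whose name ends with "/" (the file and folder names here are just examples):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipFolderExample {
    // Builds a small ZIP containing an explicit folder entry and one file inside it.
    static byte[] buildZip() throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ZipOutputStream zos = new ZipOutputStream(bos);

        // A name ending in "/" is treated as a directory entry by ZIP tools.
        zos.putNextEntry(new ZipEntry("zipfolder/subfolder/"));
        zos.closeEntry();

        zos.putNextEntry(new ZipEntry("zipfolder/subfolder/file1.txt"));
        zos.write("whatever".getBytes("UTF-8"));
        zos.closeEntry();

        zos.finish();
        return bos.toByteArray();
    }
}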

On Oct 23, 7:32 pm, Bit Liner bitli...@gmail.com wrote:
 How can I add a folder to the ZipStreamOutput?
 I cannot write bytes that represent a folder; is there a way to do
 it?

 On Sep 16, 9:07 am, Vaclav Bartacek vaclav.barta...@spolecne.cz
 wrote:

  Look at ZipStreamOutput class 
  from:http://code.google.com/p/audao/source/browse/#svn/trunk/modules/embed...

  The usage in GAE, or generally in servlets, is as simple as the following:

  private void responseAsZipStream( HttpServletResponse response )
  throws IOException {
          // prepare buffers:
          ByteArrayOutputStream bos = new ByteArrayOutputStream();
          ZipStreamOutput output = new ZipStreamOutput( bos );

          // now create the ZIP archive as you want:
          OutputStream os1 = output.addStream( "file1.txt" );
          // write to output stream 1, but do not close it -
          // ZipStreamOutput does it itself:
          os1.write( "whatever".getBytes() );
          os1.flush();

          OutputStream os2 = output.addStream( "file2.txt" );
          // write to output stream 2, but do not close it -
          // ZipStreamOutput does it itself:
          os2.write( "whatever".getBytes() );
          os2.flush();

          // finish the ZIP stream:
          output.finish();

          // now pass it as the HTTP response:
          byte[] data = bos.toByteArray();

          response.setContentType( "application/zip" );
          response.setContentLength( data.length );
          response.setHeader( "Content-Disposition", "inline; filename=filename.zip" );
          response.getOutputStream().write( data );

  }

  This works on gae - I use it at  audao.spoledge.com

  Vaclav

  On Sep 15, 3:19 am, Bit Liner bitli...@gmail.com wrote:

    My app dynamically creates two files, and users should then be able to
    download these files in one zip file.

    But I have problems implementing this.

    Any suggestions to help me?
    A library, whether GAE supports these operations, etc.

    (I have tried to use GaeVFS, but I have met problems: I cannot write
    the content of a file on the response, so I can download the file but
    its content is empty.)




[appengine-java] Re: Disappointment about JPA relationships :(

2010-10-25 Thread Simon
I agree the documentation isn't fabulous for JPA, in that the vast
majority of it focuses on the JDO implementation and you end up having
to work out which bits of the DataNucleus documentation apply and are
relevant for GAE.  However, to give the team a bit of credit, the
restrictions you are talking about are clearly documented at
http://code.google.com/appengine/docs/java/datastore/transactions.html.

Having been through the pain of trying a JPA-based implementation
and reading through these message groups, it is quite clear that JDO
and JPA are just a very poor fit for the underlying datastore - using
the low-level API, or one of the open-source libraries such as
Objectify, Twig or Slim3, is probably a safer bet.
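To give a flavour of the Objectify style, here is a minimal sketch; the Donor entity is invented for the example and the exact API differs a little between Objectify versions:

import javax.persistence.Id;

import com.googlecode.objectify.Objectify;
import com.googlecode.objectify.ObjectifyService;

public class ObjectifyExample {
    // Hypothetical entity, just for this sketch.
    public static class Donor {
        @Id Long id;
        String name;
        boolean active;
    }

    static { ObjectifyService.register(Donor.class); }

    static Donor saveAndReload() {
        Objectify ofy = ObjectifyService.begin();
        Donor d = new Donor();
        d.name = "Alice";
        ofy.put(d);                        // the datastore assigns d.id here
        return ofy.get(Donor.class, d.id); // fetch it back by key
    }
}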

On Oct 25, 4:01 am, Shawn Brown big.coffee.lo...@gmail.com wrote:
  I'm just here because i feel i need to rant a little. I came here expecting
  way too much.

 Been there with JDO -- the docs are not adequate.

 I don't know your exact requirements, but I suspect you'll find many on
 this list who found Objectify to be the simplest convenient interface
 to the Google App Engine datastore:
 http://code.google.com/p/objectify-appengine/

 Shawn




[appengine-java] Re: File size when downloading files is missing

2010-10-25 Thread rapher
So, I've played a bit and tested some things.

Before serving the blob key, I now set the headers of the response this
way:

res.setHeader("Content-Length", Long.toString(blobInfo.getSize()));
res.setHeader("Content-Size", Long.toString(blobInfo.getSize()));

Setting the headers works; the response says:

HTTP/1.1 200 OK
Content-Size: 66882
Content-Type: application/pdf
Date: Mon, 25 Oct 2010 08:10:15 GMT
Server: Google Frontend
Content-Length: 66882
Connection: close


But all my browsers (I've tested with Safari, Firefox and Chrome)
are still not able to determine the file size when downloading...

Do you have any idea what else I can try?




[appengine-java] Re: Uploading to blobstore gives OutOfMemoryError

2010-10-25 Thread zen
Despite all the documentation and examples, you have missed adding the
enctype!


Add this to the form: enctype="multipart/form-data" and you should have no
problems.




Re: [appengine-java] Re: Uploading to blobstore gives OutOfMemoryError

2010-10-25 Thread Adrian Petrescu
No, we had the enctype; that's what the uploadForm.setEncoding(FormPanel.
ENCODING_MULTIPART); was about. As several of us have discovered, it's the
name we were missing. The requirement to have the encoding set is
well-documented; the name, not so much.

And the point remains that even if it is documented, running out of memory is
NOT the right way to respond to it, because any malicious user can send a
POST request with no name.
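For reference, a minimal plain-HTML sketch of an upload form with both pieces in place; the action URL is whatever BlobstoreService.createUploadUrl(...) returned, and "myFile" is an arbitrary field name chosen for the sketch:

<form action="UPLOAD_URL_FROM_createUploadUrl" method="post"
      enctype="multipart/form-data">
  <!-- the name attribute is what the upload handler keys the BlobKey on -->
  <input type="file" name="myFile">
  <input type="submit" value="Upload">
</form>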

On Mon, Oct 25, 2010 at 2:42 AM, zen donny.bidda...@gmail.com wrote:

 Despite all the documentation and examples, you have missed adding the
 enctype!

 Add this to the form: enctype="multipart/form-data" and you should have no
 problems.




Re: [appengine-java] Google App Engine on Netbeans

2010-10-25 Thread Khor Yong Hao
You may refer to
http://java.wildstartech.com/Java-Platform-Enterprise-Edition/JavaServer-Faces/sun-javaserver-faces-reference-implementation/configuring-jsf-20-to-run-on-the-google-app-engine-using-netbeans

I succeeded last time in configuring JSF + Facelets on Google App Engine
using NetBeans 6.8, but failed with 6.9.




[appengine-java] Re: Testing with JDO

2010-10-25 Thread Daniel Blasco
Thank you Didier.

This is my BaseTest.java, and it works perfectly now:

package ...;

import static org.junit.Assert.assertEquals;

import java.util.Date;
import java.util.List;

import javax.jdo.PersistenceManager;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import com.declaraciones.PMF;
import com.declaraciones.shared.Declaracion.EstadoDeclaracion;
import com.declaraciones.shared.Movimiento;
import
com.google.appengine.tools.development.testing.LocalDatastoreServiceTestConfig;
import
com.google.appengine.tools.development.testing.LocalServiceTestHelper;

public class BaseTest {
    private final LocalServiceTestHelper dsHelper =
        new LocalServiceTestHelper(new LocalDatastoreServiceTestConfig());
    private final PersistenceManager pm = PMF.getPersistenceManager();

    @Before
    public void setUp() {
        System.out.println("Entering @Before for BaseTest");
        dsHelper.setUp();
    }

    @After
    public void tearDown() {
        System.out.println("Entering @After for BaseTest");
        this.dsHelper.tearDown();
    }

    // run this test twice to prove we're not leaking any state across tests
    @SuppressWarnings("unchecked")
    private void doTest() {
        String query = "select from " + Movimiento.class.getName();
        assertEquals(0, ((List<Movimiento>) pm.newQuery(query).execute()).size());
        Movimiento m1 = new Movimiento();
        pm.makePersistent(m1);
        assertEquals(1, ((List<Movimiento>) pm.newQuery(query).execute()).size());
        Movimiento m2 = new Movimiento();
        pm.makePersistent(m2);
        assertEquals(2, ((List<Movimiento>) pm.newQuery(query).execute()).size());
    }

    @Test
    public void testInsert1() {
        doTest();
    }

    @Test
    public void testInsert2() {
        doTest();
    }

}




[appengine-java] Create any kind of file on GAE/J then upload to Google docs.

2010-10-25 Thread Nurettin Omer Hamzaoglu
Hi,

Is it possible to create any kind of file on GAE/J and upload it to
Google Docs? I've asked a similar question about creating a PDF and
uploading it to Google Docs, but it seems nobody has tried it or succeeded
before. Does anyone have any suggestions or sample code?

Thanks.




[appengine-java] Re: Testing with JDO

2010-10-25 Thread Didier Durand
Daniel,
You're welcome.
didier





[appengine-java] Re: Recurring tasks

2010-10-25 Thread Didier Durand
Hi Vik,


I would also go with a cron job in your case.

Concerning the 30s limit, it depends on your computing requirements:

a) you can, and probably should, schedule via the API an independent task
for each separate donor who needs to be processed, after your query
(run via cron) has told you who needs to be processed today.

b) then, for each individual task, if it lasts more than 30s you
need to do what I described earlier (i.e. pause the current run after
storing its context and restart it in a new task).

regards
didier

On Oct 25, 12:35 pm, Vik vik@gmail.com wrote:
 Hie

 Hmm, so in my case I always know in advance that I need to activate few
 blood donors every day  which were deactivated yesterday.

 So, this qualifies for cron jobs. Isn't it ? However, does this 30s limit
 applies here as well?

 Thankx and Regards

 Vik
 Founderwww.sakshum.comwww.sakshum.blogspot.com

 On Mon, Oct 25, 2010 at 11:56 AM, Didier Durand 
 durand.did...@gmail.comwrote:

  Hi,

  I see different purposes for the 2:

  - cron tasks are the tasks I always want to do for sure
  - tasks via API are tasks that I schedule programmatically when needed
  and triggered by event that I can't predict in advance.

  I see personally 2 other purposes for API tasks:
   a) when I run a task and it comes close to the 30s limit, I pause
  it, serialize its context and schedule a task with this context. When
  the task starts, it's in fact a restart with another 30s credit for
  running
   b) I also tasks in context of transactions if I want to be sure that
  the action of the task is done (i.e datastore writes) even if the
  transactions fails  - for example, store errors in datastore for
  later analysis for a failing transaction / program, etc.

  regards
  didier

  On Oct 25, 6:23 am, Vik vik@gmail.com wrote:
   Hie Thanks for the response.

   I am confused a bit. If cron job does the scheduling then what Task Queue
   does?

   Thankx and Regards

   Vik
   Founderwww.sakshum.comwww.sakshum.blogspot.com

   On Mon, Oct 25, 2010 at 8:51 AM, Didier Durand durand.did...@gmail.com
  wrote:

Hi Vik,

Tasks scheduled via cron.xml is the way I would go:
   http://code.google.com/appengine/docs/java/config/cron.html

I would schedule a task every minute, make a query on the deadline for
donors and then do what has to be done.

didier

On Oct 24, 7:11 pm, Vik vik@gmail.com wrote:
 Hie Guys

 Our application manages a list of blood donors. Time to time these
  blood
 donors are unreachable so the administrators can mark such blood
  donors
as
 inactive.

 However, these blood donors should be active again after 1 day
 automatically. How should we achieve this? ?In regular J2EE apis we
  can
 write scheduler classes to do the same.
 What is the option in GAE? Are there any limitations?

 I went through a bit and feel like task queues are the way. But I am
  not
 sure? If yes then for above scenario how should it be done?

 Thankx and Regards

 Vik
 Founderwww.sakshum.comwww.sakshum.blogspot.com




Re: [appengine-java] Re: Recurring tasks

2010-10-25 Thread Vik
Hie

thanks a lot!!!

So, how about starting a cron job every day, say at 00:00, and if the time
consumed goes over 30 sec, then scheduling tasks to complete the
leftovers?
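A minimal cron.xml sketch of that idea; the /activate-donors URL is a hypothetical handler name:

<?xml version="1.0" encoding="UTF-8"?>
<cronentries>
  <cron>
    <url>/activate-donors</url>
    <description>Re-activate donors whose cool-off day has passed</description>
    <schedule>every day 00:00</schedule>
  </cron>
</cronentries>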


I have another question here. The data model for this is:

BloodDonor: name, blood_group, is_active, active_date


so the query to find the donors to be activated is going to be:

select from BloodDonor where is_active == false && active_date == today's
date


and the update is going to set such donors to is_active = true and active_date =
null.


Any idea how many such entries it is safe to assume will be processed within
30 secs?

Thankx and Regards

Vik
Founder
www.sakshum.com
www.sakshum.blogspot.com
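A hedged sketch of that daily job with the low-level datastore API; the kind and property names follow the model above, and a real job would page with cursors and/or fan out to one task per donor as discussed earlier in the thread:

import java.util.Date;

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.Query;
import com.google.appengine.api.datastore.Query.FilterOperator;

public class ActivateDonorsJob {
    void run(Date today) {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        Query q = new Query("BloodDonor")
            .addFilter("is_active", FilterOperator.EQUAL, false)
            .addFilter("active_date", FilterOperator.EQUAL, today);
        for (Entity donor : ds.prepare(q).asIterable()) {
            donor.setProperty("is_active", true);
            donor.setProperty("active_date", null);
            ds.put(donor); // one put per donor; could be batched or moved into a task
        }
    }
}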


On Mon, Oct 25, 2010 at 5:53 PM, Didier Durand durand.did...@gmail.comwrote:

 Hi Vik,


 I would also go with cron  job in your case.

 Concerning the 30s limit, it depends on your requirements in
 computing:

 a) you can and should probably schedule via api an independent  task
 for each separate donor who needs to be processed after your query
 (via cron) told you who needs to be processed today.

 b) then,for each individual task, if it lasts more than 30s then you
 need to do what I described earlier (i.e pause the current run after
 storing its context and restart it in a new task)

 regards
 didier

 On Oct 25, 12:35 pm, Vik vik@gmail.com wrote:
  Hie
 
  Hmm, so in my case I always know in advance that I need to activate few
  blood donors every day  which were deactivated yesterday.
 
  So, this qualifies for cron jobs. Isn't it ? However, does this 30s limit
  applies here as well?
 
  Thankx and Regards
 
  Vik
  Founderwww.sakshum.comwww.sakshum.blogspot.com
 
  On Mon, Oct 25, 2010 at 11:56 AM, Didier Durand durand.did...@gmail.com
 wrote:
 
   Hi,
 
   I see different purposes for the 2:
 
   - cron tasks are the tasks I always want to do for sure
   - tasks via API are tasks that I schedule programmatically when needed
   and triggered by event that I can't predict in advance.
 
   I see personally 2 other purposes for API tasks:
a) when I run a task and it comes close to the 30s limit, I pause
   it, serialize its context and schedule a task with this context. When
   the task starts, it's in fact a restart with another 30s credit for
   running
b) I also tasks in context of transactions if I want to be sure that
   the action of the task is done (i.e datastore writes) even if the
   transactions fails  - for example, store errors in datastore for
   later analysis for a failing transaction / program, etc.
 
   regards
   didier
 
   On Oct 25, 6:23 am, Vik vik@gmail.com wrote:
Hie Thanks for the response.
 
I am confused a bit. If cron job does the scheduling then what Task
 Queue
does?
 
Thankx and Regards
 
Vik
Founderwww.sakshum.comwww.sakshum.blogspot.com
 
On Mon, Oct 25, 2010 at 8:51 AM, Didier Durand 
 durand.did...@gmail.com
   wrote:
 
 Hi Vik,
 
 Tasks scheduled via cron.xml is the way I would go:
http://code.google.com/appengine/docs/java/config/cron.html
 
 I would schedule a task every minute, make a query on the deadline
 for
 donors and then do what has to be done.
 
 didier
 
 On Oct 24, 7:11 pm, Vik vik@gmail.com wrote:
  Hie Guys
 
  Our application manages a list of blood donors. Time to time
 these
   blood
  donors are unreachable so the administrators can mark such blood
   donors
 as
  inactive.
 
  However, these blood donors should be active again after 1 day
  automatically. How should we achieve this? ?In regular J2EE apis
 we
   can
  write scheduler classes to do the same.
  What is the option in GAE? Are there any limitations?
 
  I went through a bit and feel like task queues are the way. But I
 am
   not
  sure? If yes then for above scenario how should it be done?
 
  Thankx and Regards
 
  Vik
  Founderwww.sakshum.comwww.sakshum.blogspot.com
 

[appengine-java] Re: Channel API still not live...

2010-10-25 Thread stole
Hello everybody.

Following Ikai's instructions, I signed up as a Channel API trusted
tester last Friday (2010/10/22). The API has not yet been enabled for
my application. I was wondering whether anybody has gotten the API
activated, and how long it took to get activated. Ikai? Daniel?
Heiko?

Thanks for your time.

On Sep 20, 4:47 pm, Ikai Lan (Google) ikai.l+gro...@google.com
wrote:
 Channel API is not available yet. You can sign up for trusted tester here:

 https://spreadsheets.google.com/a/google.com/viewform?formkey=dGFxQ1A...







 On Mon, Sep 20, 2010 at 2:38 AM, Heiko Roth r...@egotec.com wrote:
  Hello there,

  We need channel api, too.
  Can we use it?

  Greetings,
  Heiko.




[appengine-java] Re: Recurring tasks

2010-10-25 Thread Didier Durand
Hi Vik,

To be fully transparent, I stopped using JDO and switched to
Objectify: much simpler, more efficient and more transparent than JDO (just
check how many issues on this forum involve JDO). The learning curve is very
quick.

With Objectify, if your entities are not too big in size and are well
indexed to match your queries, you can hope for many tens of task
starts (each with its own donor to process) in the course of 30s. I am
probably being conservative.

The advantage of starting additional tasks via the API is parallelism: all
your donors will be processed simultaneously, and the update job is
going to be finished much faster overall. Be careful though if you have
global counters updated by many tasks in parallel: you may run into
concurrency race issues. But you can post again by then if needed to
solve those new issues...

regards
didier

On Oct 25, 2:45 pm, Vik vik@gmail.com wrote:
 Hie

 thanks a lot!!!

 So, how about starting a corn job every day say at 00:00  and if time
 consumption goes more than 30 sec than scheduling tasks to complete
 leftovers.

 I have another question here: The data model for this is:

 BloodDonor_name, blood_group, is_active, active_date

 so query to find donors to be activated ids gonna be:

 select BloodDonor  where is_active == false  active_date ==  today's
 date

 and update gonna be setting such donors:  is_active=true and active_date =
 null.

 Any idea how many such entries is safe to consider will be processed within
 30 secs ?

 Thankx and Regards

 Vik
 Founderwww.sakshum.comwww.sakshum.blogspot.com

 On Mon, Oct 25, 2010 at 5:53 PM, Didier Durand durand.did...@gmail.comwrote:

  Hi Vik,

  I would also go with cron  job in your case.

  Concerning the 30s limit, it depends on your requirements in
  computing:

  a) you can and should probably schedule via api an independent  task
  for each separate donor who needs to be processed after your query
  (via cron) told you who needs to be processed today.

  b) then,for each individual task, if it lasts more than 30s then you
  need to do what I described earlier (i.e pause the current run after
  storing its context and restart it in a new task)

  regards
  didier

  On Oct 25, 12:35 pm, Vik vik@gmail.com wrote:
   Hie

   Hmm, so in my case I always know in advance that I need to activate few
   blood donors every day  which were deactivated yesterday.

   So, this qualifies for cron jobs. Isn't it ? However, does this 30s limit
   applies here as well?

   Thankx and Regards

   Vik
   Founderwww.sakshum.comwww.sakshum.blogspot.com

   On Mon, Oct 25, 2010 at 11:56 AM, Didier Durand durand.did...@gmail.com
  wrote:

Hi,

I see different purposes for the 2:

- cron tasks are the tasks I always want to do for sure
- tasks via API are tasks that I schedule programmatically when needed
and triggered by event that I can't predict in advance.

I see personally 2 other purposes for API tasks:
 a) when I run a task and it comes close to the 30s limit, I pause
it, serialize its context and schedule a task with this context. When
the task starts, it's in fact a restart with another 30s credit for
running
 b) I also tasks in context of transactions if I want to be sure that
the action of the task is done (i.e datastore writes) even if the
transactions fails  - for example, store errors in datastore for
later analysis for a failing transaction / program, etc.

regards
didier

On Oct 25, 6:23 am, Vik vik@gmail.com wrote:
 Hie Thanks for the response.

 I am confused a bit. If cron job does the scheduling then what Task
  Queue
 does?

 Thankx and Regards

 Vik
 Founderwww.sakshum.comwww.sakshum.blogspot.com

 On Mon, Oct 25, 2010 at 8:51 AM, Didier Durand 
  durand.did...@gmail.com
wrote:

  Hi Vik,

  Tasks scheduled via cron.xml is the way I would go:
 http://code.google.com/appengine/docs/java/config/cron.html

  I would schedule a task every minute, make a query on the deadline
  for
  donors and then do what has to be done.

  didier

  On Oct 24, 7:11 pm, Vik vik@gmail.com wrote:
   Hie Guys

   Our application manages a list of blood donors. Time to time
  these
blood
   donors are unreachable so the administrators can mark such blood
donors
  as
   inactive.

   However, these blood donors should be active again after 1 day
   automatically. How should we achieve this? ?In regular J2EE apis
  we
can
   write scheduler classes to do the same.
   What is the option in GAE? Are there any limitations?

   I went through a bit and feel like task queues are the way. But I
  am
not
   sure? If yes then for above scenario how should it be done?

   Thankx and Regards

   Vik
   Founderwww.sakshum.comwww.sakshum.blogspot.com


[appengine-java] Re: Disappointment about JPA relationships :(

2010-10-25 Thread Matthieu Bertin
That is very true. The page you point to does explain why I got
these exceptions, but I wouldn't have understood any of it before I ran
into them, and it isn't even clear whether that page is JDO
or JPA documentation.


Anyway, I've got it now and I'm back on track, but I would gladly
have skipped the last 10 days. ^^




[appengine-java] Re: 1.3.8 Console Logging Issue

2010-10-25 Thread Rud
I am seeing the same problem but thought it was due to the new GWT 2.1 RC1
logging capability.

See thread at 
http://groups.google.com/group/google-web-toolkit/browse_frm/thread/c5afa6655edec800#

I just tested my app with 1.3.7 and 1.3.8. With 1.3.7 it works; with
1.3.8 it does not. I am only testing on the desktop, not live on the web.

Another GWT developer reports that reverting GWT addresses the
problem. I don't know if that changes GAE or Jetty, either of which
may be part of the problem.

Rud
http://www.mysticlakesoftware.com


On Oct 25, 1:32 am, andrew aute...@gmail.com wrote:
 Similar here.

 Now only logs WARNING and ERROR.

 Logging properties set to INFO and prior to 1.3.8 upgrade INFO logged
 fine.

 Have configuration procedures been changed or something?

 Window 7 32bit
 Eclipse 3.5 and 3.6




[appengine-java] Re: Channel API still not live...

2010-10-25 Thread Robert Lancer
They will notify you if you get picked; they don't activate it for
everyone, just a few testers.


On Oct 25, 10:47 am, stole goran.stoj...@gmail.com wrote:
 Hello everybody.

 Following Ikai's instructions I signed up for Channel API trusted
 tester last Friday (2010/10/22). The API has not yet been enabled for
 my application. I was wondering if anybody has gotten the API
 activated and how long it took for it to get activated. Ikai? Daniel?
 Heiko?

 Thanks for your time.

 On Sep 20, 4:47 pm, Ikai Lan (Google) ikai.l+gro...@google.com
 wrote:







  Channel API is not available yet. You can sign up for trusted tester here:

 https://spreadsheets.google.com/a/google.com/viewform?formkey=dGFxQ1A...

  On Mon, Sep 20, 2010 at 2:38 AM, Heiko Roth r...@egotec.com wrote:
   Hello there,

   We need channel api, too.
   Can we use it?

   Greetings,
   Heiko.




[appengine-java] Re: 1.3.8 Console Logging Issue

2010-10-25 Thread vvorski
Same here with Python on a Mac. I have heard reports of the same with Java
on Windows. Seems to be a global 1.3.8 problem?

V/.

On Oct 24, 9:04 pm, jt john.j.tho...@gmail.com wrote:
 Hello,

 I upgraded my project from 1.3.4 to 1.3.8. After the upgrade, the
 console fails to log. If I switch back to 1.3.4, the console logs
 properly.

 I am on a 64-bit Windows Vista environment, running Eclipse 3.4.2 w/
 (MyEclipse and Instantiations GWT tools)

 Thanks




[appengine-java] OpenID Accounts with same emails and App Engine User class

2010-10-25 Thread trustamli
Hi,

For some reason my application handles OpenID accounts with the same
email as one user. It means that when I try to
access datastore entities via the User field of a new user, it also gets the
entities of all other users with the same email.

Is a User object the same as an email? The Python documentation says: "User
instances are unique and comparable. If two instances are equal, then
they represent the same user." (But I use Java.)

I think I'm doing something wrong here:

 Query query = pm.newQuery(Subscriber.class, "user == userParam");
 query.declareImports("import com.google.appengine.api.users.User");
 query.declareParameters("User userParam");

 @SuppressWarnings("unchecked")
 List<Subscriber> results = (List<Subscriber>) query.execute(user);
 if (results.size() != 0)
     Logger.log("User logged in", results.iterator().next().getId());

?

With this problem using OpenID is very unsafe. Anybody can create an
OpenID account with any email (for example with myOpenID), without
email verification, and then log in to my application and get data
related to all Users with this email.
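If the equality really does come down to the email, a common workaround is to key subscribers on the stable user ID rather than on the User object itself. A minimal sketch with a hypothetical helper, assuming getUserId() is populated for federated (OpenID) logins:

import com.google.appengine.api.users.User;
import com.google.appengine.api.users.UserService;
import com.google.appengine.api.users.UserServiceFactory;

public class StableUserIdExample {
    // Returns an identifier meant to stay the same for one account and to differ
    // across accounts, even when two accounts share an email address.
    static String currentUserKey() {
        UserService userService = UserServiceFactory.getUserService();
        User user = userService.getCurrentUser();
        // getUserId() is the stable per-account id; email and nickname are not.
        return user.getUserId();
    }
}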

I believe I'm doing something wrong, so I'll really hope someone will
open my eyes.

Thanks,
Best Regards.




[appengine-java] Re: mapreduce cron

2010-10-25 Thread Benjamin
Hi Jacek,
There is an example in the mapper project source code:
http://code.google.com/p/appengine-mapreduce/source/browse/trunk/java/example/com/google/appengine/demos/mapreduce/TestServlet.java

Hope it helps.

On Oct 23, 12:03 am, jacek.ambroziak jacek.ambroz...@gmail.com
wrote:
 How to start a MapReduce job as a cron scheduled task?




[appengine-java] Datastore viewer error + cant delete

2010-10-25 Thread Eran
Hi,
Can anyone help?

In the App Engine management console, pressing the datastore viewer
brings up an error screen with no details.
So I started searching for corrupt data, and after writing some code I
found possible instances in the datastore where items with duplicate
primary keys seem to have been committed.
I then wrote some code to try and delete them, calling
PersistenceManager.deletePersistent with these items, and while no
exceptions were thrown, they are still in the datastore.

Any ideas?
The application itself seems to be working fine, but I am afraid I am
missing something bigger here.

Thanks in advance




[appengine-java] SDK/Eclipse pains on Ubuntu

2010-10-25 Thread John
This is a two parter - first to express some frustration (well, quite
a lot really) and second, to ask for support/help.

It feels like every time I update my GAE Eclipse plugin, my dev
environment breaks.

The reason for the break varies; this time around my classpath is
completely broken - it starts with DataNucleus not working (jars
duplicated into the lib directory), fix that, then get ClassNotFound
exceptions at runtime (GAE classes not on the classpath at runtime), fix
that (library order), then find that the jars have been copied down to the
lib directory again, back to square one.  There are various other problems -
I've spent all but one hour of today trying to get my environment
working again following an update.

This seems to happen every time I do a GAE upgrade, but this time
around is more broken than previously.

At the moment, it's all still broken - I'm contemplating making a new
project and copying java source files across.

Is there a fundamental duh, stupid thing that I'm not doing, or have
not set up that makes this work?  Is anyone else experiencing such
pain?

I'm using a Eclipse/JEE Web Application Project with GAE, but not GWT
enabled.  It has previously worked fine (between SDK updates) and I've
previously been able to fix the break post-update - not this time
though.

Help much appreciated.




[appengine-java] jsf view id's expire too quick

2010-10-25 Thread scalpel4k
Hi,

I'm running an application on MyFaces 2.0.2. When I run the
application on the app server I constantly get
javax.faces.application.ViewExpiredException.
MyFaces stores view IDs in the normal servlet session store. Although
the session timeout is set to 30 minutes, the view IDs seem to be
invalidated after a minute or so.
When I run the application on my local development server everything
works fine.

Does anybody know what might be wrong?

bye Mich;
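One thing worth checking (a guess, not a confirmed diagnosis): on App Engine, servlet sessions are off by default and survive across instances only when enabled in appengine-web.xml, e.g. (app id and version are placeholders):

<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
  <application>your-app-id</application>
  <version>1</version>
  <sessions-enabled>true</sessions-enabled>
</appengine-web-app>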




Re: [appengine-java] DatastoreNeedIndexException

2010-10-25 Thread Ikai Lan (Google)
You need to look at your application and the type of queries you are
performing. Don't index properties you don't need to search on. Read more
about this here:

http://code.google.com/appengine/docs/python/datastore/queriesandindexes.html

--
Ikai Lan
Developer Programs Engineer, Google App Engine
Blogger: http://googleappengine.blogspot.com
Reddit: http://www.reddit.com/r/appengine
Twitter: http://twitter.com/app_engine



On Sun, Oct 24, 2010 at 12:16 AM, pman pollk...@gmail.com wrote:

 I now have problem with
 com.google.appengine.api.datastore.DatastoreNeedIndexException: no
 matching index found...

 And the admin Dashboard shows:
 ===
 Number of Indexes    100%    (199 of 200)
 ===


 The application is relatively simple, yet I got such a problem.  I
 wonder how many datastore indexes you guys have.

 Please share it here.

 P.S.: App Engine team, can you increase the quota for indexes...?






Re: [appengine-java] Re: Recommended maximum number of entities in an entity group

2010-10-25 Thread Ikai Lan (Google)
I was unclear in my last point: real world performance should be above 1
transaction per second, but you should engineer your entity group writes
with an expectation of 1 transaction/second for best results.

--
Ikai Lan
Developer Programs Engineer, Google App Engine
Blogger: http://googleappengine.blogspot.com
Reddit: http://www.reddit.com/r/appengine
Twitter: http://twitter.com/app_engine



On Sat, Oct 23, 2010 at 1:53 AM, nicanor.babula nicanor.bab...@gmail.comwrote:

 Acually, I forgot about the fact that while a transaction is going on,
 all entities involved are locked. Therefore, I considered changing my
 database structure in order to get more, but smaller entity groups
 (every parentEntity will have at most 20-30 childs). That way, will be
 easier and faster to use transactions.

  On 22 Ott, 19:54, Ikai Lan (Google) ikai.l+gro...@google.com wrote:
  Cyrille is right: have an expectation of 1 transaction/second, though
 real
  world performance should be below this. App Engine favors a model of
  eventual consistency. You don't get to scale by locking a ton of entities
 at
  once.
 
  --
  Ikai Lan
  Developer Programs Engineer, Google App Engine
  Blogger:http://googleappengine.blogspot.com
  Reddit:http://www.reddit.com/r/appengine
  Twitter:http://twitter.com/app_engine
 
 
 
  On Fri, Oct 22, 2010 at 3:54 AM, Cyrille Vincey crll...@gmail.com
 wrote:
   From my experience : I do NOT expect a better write performance than 1
   transaction/second when creating entities inside one given entity group
   (with 1 entity created in each transaction).
 
   In your case : if dataset creation is an offline process, you can rely
 on
   entity groups and parent/child data modelling, no matter how many child
   entities you want to store. But you will have to expect high datastore
   contention level.
 
   My suggestion : in your data model design, only use parent/child design
   when transactional features are REALLY required.
   If transactional requirement are not so high, prefer to break your data
   model into smaller entity groups.
 
   On 21/10/10 17:00, nicanor.babula nicanor.bab...@gmail.com wrote:
 
   Thank you all for your responses.
   @alesj
   No, I am not confusing entity group with the actual entities stored
   in the datastore.
 
   @Ian Marshall
   Actually no, because I already did that kind of analysis.
 
   I have to use transactions in order to maintain data consistency and
   therefore I have to define the right entity groups. I already thought
   of a solution, just that I have read in the docs that is not
   recommended to put too many entities in an entity group. So: What is
   the number of child entities a parent entity can have, past which the
   datastore becomes slow? Hundreds? Thousands? Millions?
   I also understand that all entities in an entity group are stored on
   the same node of the datastore's distributed system and therefore I
   understand that if the number of entities an entity group has is too
   big, the queries will become slow, because will be processed by a
   single node. Right?
   Again: Which is that number?
 
   Thanks and sorry if I bored you with my long email. ;)
 
   On 21 Ott, 14:59, Ian Marshall ianmarshall...@gmail.com wrote:
How about my comments below?
 
   
 http://www.google.com/url?url=http://groups.google.com/g/f907f736/t/f.
   ..
 
Do they help you?
 
On Oct 20, 6:39 pm, nicanor.babula nicanor.bab...@gmail.com
 wrote:
 
 Hi everbody,
 
 I have a question regarding the datastore best-practices.
 
 The appengine's official documentation says that is not a good
 practice to put too many entities in the same entity group. What
 does
 too many mean in this case? Hundreds? Thousands? Milions?
 
 Thanks in advance,
 Cristian Babula.
 

[appengine-java] DeadlineExceededException from URLFetchService

2010-10-25 Thread hector
Would it be possible to throw DeadlineExceededException from the
URLFetchService instead of IOException?

java.io.IOException: Timeout while fetching: https://...
at
com.google.appengine.api.urlfetch.URLFetchServiceImpl.convertApplicationException(URLFetchServiceImpl.java:
108)
at
com.google.appengine.api.urlfetch.URLFetchServiceImpl.fetch(URLFetchServiceImpl.java:
39)

Maybe I'm expecting the wrong behavior from the API.




Re: [appengine-java] DeadlineExceededException from URLFetchService

2010-10-25 Thread Don Schwarz
DeadlineExceededException isn't appropriate, as that exception
indicates that your overall request deadline has passed and you need
to finish what you are doing and return from the request.

java.io.IOException seems correct for this.  Are you just trying to
distinguish timeouts from other types of connection errors (e.g. the
remote server is slow vs. the remote server is down completely) ?  I
don't know how reliably we can tell the difference.

-- Don
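For what it's worth, a hedged sketch of catching the timeout around a fetch with an explicit deadline; withDeadline may not exist in the very oldest SDKs, and the 10-second value is just an assumption for the sketch:

import java.io.IOException;
import java.net.URL;

import com.google.appengine.api.urlfetch.FetchOptions;
import com.google.appengine.api.urlfetch.HTTPMethod;
import com.google.appengine.api.urlfetch.HTTPRequest;
import com.google.appengine.api.urlfetch.HTTPResponse;
import com.google.appengine.api.urlfetch.URLFetchService;
import com.google.appengine.api.urlfetch.URLFetchServiceFactory;

public class FetchWithTimeoutExample {
    static HTTPResponse fetchOrNull(String urlString) {
        URLFetchService fetcher = URLFetchServiceFactory.getURLFetchService();
        try {
            HTTPRequest request = new HTTPRequest(
                new URL(urlString), HTTPMethod.GET,
                FetchOptions.Builder.withDeadline(10.0));
            return fetcher.fetch(request);
        } catch (IOException e) {
            // The SDK reports fetch timeouts as an IOException ("Timeout while fetching: ...");
            // telling a timeout apart from other connection failures means inspecting the message.
            return null;
        }
    }
}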

On Mon, Oct 25, 2010 at 3:05 PM, hector hrov...@gmail.com wrote:
 Would it be possible to throw DeadlineExceededException from the
 URLFetchService instead of IOException?

 java.io.IOException: Timeout while fetching: https://...
        at
 com.google.appengine.api.urlfetch.URLFetchServiceImpl.convertApplicationException(URLFetchServiceImpl.java:
 108)
        at
 com.google.appengine.api.urlfetch.URLFetchServiceImpl.fetch(URLFetchServiceImpl.java:
 39)

 Maybe I'm expecting the wrong behavior from the API.




Re: [appengine-java] Re: Create entity and get ID

2010-10-25 Thread A. Stevko
Can't hurt...
I define transactions for all my datastore access paths.

That's the only way I can keep track of which entity is in context:
pm.currentTransaction().begin()
pm.currentTransaction().commit()

I'm also able to roll back datastore access and not have the overhead of a
commit cleanup:
pm.currentTransaction().rollback()
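A minimal sketch of that pattern for the original question (persist, commit, then read the datastore-assigned id); the Meeting class is the one defined further down in the thread, and this follows Ian Marshall's note below that with IdGeneratorStrategy.IDENTITY the id might not be allocated until the transaction commits:

import javax.jdo.PersistenceManager;
import javax.jdo.Transaction;

public class CreateAndGetIdExample {
    static Long createMeeting(PersistenceManager pm) {
        Meeting meeting = new Meeting();
        Transaction tx = pm.currentTransaction();
        try {
            tx.begin();
            meeting = pm.makePersistent(meeting);
            tx.commit();               // the IDENTITY value is reliable only after commit
        } finally {
            if (tx.isActive()) {
                tx.rollback();         // clean up if the commit never happened
            }
        }
        return meeting.getId();
    }
}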


On Mon, Oct 25, 2010 at 3:45 PM, Cosmin Stefan 
cosmin.stefandob...@gmail.com wrote:

 Hey,

 The thing is I don't have a transaction. And I am also using
 IdentityType.DATASTORE, as I need the id to be generated by the
 datastore.

 Anyway, this is what I currently have (and it doesn't work properly):

 @PersistenceCapable(detachable=true)
 @Embeddable
 public class GMeeting
 {
 ...
     /** The id. */
     @PrimaryKey
     @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
     private Long id;
 ...
 }

 and the code that uses it:
 {
 ...
     Meeting meeting = new Meeting();
     // Adding the new meeting
     pm.makePersistent(meeting);
     meeting = pm.detachCopy(meeting);
     Long meetingID = meeting.getId();
 }

 Anyway, this still doesn't work. Do I need to use a transaction?

 Thanks,
 Cosmin
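
One way to answer that question is to try the persist inside an explicit
transaction and only read the id after commit, along the lines of the sketch
below (a sketch only: it reuses the GMeeting class from the snippet above,
assumes the earlier Meeting/GMeeting naming refers to the same entity, and
omits the surrounding method and error handling):

// Sketch: wraps the snippet above in a transaction so the id is
// guaranteed to be assigned before it is read.
javax.jdo.Transaction tx = pm.currentTransaction();
GMeeting detached = null;
try {
    tx.begin();
    GMeeting meeting = new GMeeting();
    pm.makePersistent(meeting);
    tx.commit();                        // with IdGeneratorStrategy.IDENTITY the id is set by commit
    detached = pm.detachCopy(meeting);  // the detached copy keeps the assigned id
} finally {
    if (tx.isActive()) {
        tx.rollback();
    }
}
Long meetingID = (detached == null) ? null : detached.getId();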


 On Oct 23, 2:09 am, A. Stevko andy.ste...@gmail.com wrote:
  In order to make the object accessible after the commit() you need to
  make it detachable.
 
  @PersistenceCapable(identityType = IdentityType.APPLICATION,
  detachable=true)
  public class Meeting
 
  Then used this...
  pm.makePersistent(meeting);
  Long meetingID=meeting.getId();
 
  On Fri, Oct 22, 2010 at 3:19 PM, Cosmin Stefan 
 
  cosmin.stefandob...@gmail.com wrote:
   And I really want to emphasize that this doesn't happen EVERY time.
 
   Thanks and sorry for replying again!
 
   On Oct 23, 1:17 am, Cosmin Stefan cosmin.stefandob...@gmail.com
   wrote:
Hey,
 
Unfortunately, the first solution you suggested does not work and I
don't really need a transaction for what I want.
 
The thing is that I somehow need to persist the new entity and find
out its key. How can I do this? Does anybody have an idea?
 
Thanks a lot,
Cosmin
 
On Oct 22, 9:18 am, Ian Marshall ianmarshall...@gmail.com wrote:
 
 Have you tried replacing
 
   pm.makePersistent(meeting);
   Long meetingID=meeting.getId();
 
 with
 
   meeting = pm.makePersistent(meeting);
   Long meetingID=meeting.getId();
 
 and see if this solves your problem. You might also want to commit
 your transaction if you are doing this persistence in a transaction
 before testing your meeting's ID (the ID might not get allocated
 until
 transaction committal).
 
 On Oct 20, 11:19 pm, Cosmin Stefan cosmin.stefandob...@gmail.com
 wrote:
 
  Hey,
 
  I have encountered a weird case while trying to create a new
 entity
   in
  the database:
 
  I have this code:
 
  Meeting meeting=new Meeting(...)
  pm.makePersistent(meeting);
  Long meetingID=meeting.getId();
 
  if(meetingID==null)
  throw new Exception(meetingID is null...);
 
  and in the Meeting Class:
  /** The id. */
 @PrimaryKey
 @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
 private Long id;
  so the id is Long.
 
  The problem is that sometimes the above code throws an exception,
   some
  other times it doesn't... What am I doing wrong?
 
  Thanks!
 



-- 
You received this message because you are subscribed to the Google Groups 
Google App Engine for Java group.
To post to this group, send email to google-appengine-j...@googlegroups.com.
To unsubscribe from this group, send email to 
google-appengine-java+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/google-appengine-java?hl=en.



Re: [appengine-java] Re: Recurring tasks

2010-10-25 Thread Vik
Hie

Ok thanks.. I have heard of Objectify for the first time. Not sure how
much effort it will take to migrate the current JDO implementation to it. Any
advice?


Thankx and Regards

Vik
Founder
www.sakshum.com
www.sakshum.blogspot.com


On Mon, Oct 25, 2010 at 8:24 PM, Didier Durand durand.did...@gmail.comwrote:

 Hi Vik,

 To be fully transparent, I stopped using JDO and switched to
 Objectify: much simpler, more efficient and more transparent than JDO (just
 check how many issues on this forum are about JDO). The learning curve is
 very quick.

 With Objectify, if your entities are not too big in size and well
 indexed to match your queries, you can hope for many tens of task
 starts (each with its own donor to process) in the course of 30s. I am
 probably being conservative.

 The advantage of starting additional tasks via the API is parallelism: all
 your donors will be processed simultaneously, so the update job will
 finish much faster overall. Be careful though if you have
 global counters updated by many tasks in parallel: you may run into
 concurrency race issues. But you can post again then if needed to
 solve those new issues...

 regards
 didier
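
The fan-out described above would look roughly like the sketch below: a
cron-triggered servlet queries the donors due today and enqueues one task per
donor. It assumes the hypothetical BloodDonor entity from this thread,
Objectify-style queries, and the task queue API as packaged in newer SDKs
(com.google.appengine.api.taskqueue; older SDKs keep it under a labs package
with slightly different builder names):

import java.io.IOException;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.google.appengine.api.taskqueue.Queue;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskOptions;
import com.googlecode.objectify.Objectify;
import com.googlecode.objectify.ObjectifyService;

// Invoked by cron (configured in cron.xml); fans out one task per donor due today.
public class ActivateDonorsCronServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Assumes ObjectifyService.register(BloodDonor.class) has been called at startup.
        Objectify ofy = ObjectifyService.begin();
        Queue queue = QueueFactory.getDefaultQueue();
        for (BloodDonor donor : ofy.query(BloodDonor.class)
                                   .filter("is_active", false)
                                   .filter("active_date", today())) {
            // One task per donor: the updates run in parallel and each task
            // handler gets its own 30s budget.
            queue.add(TaskOptions.Builder
                .withUrl("/tasks/activateDonor")
                .param("donorId", String.valueOf(donor.getId())));
        }
    }

    private static java.util.Date today() {
        // Illustrative only: normalise to midnight, matching however active_date is stored.
        java.util.Calendar cal = java.util.Calendar.getInstance();
        cal.set(java.util.Calendar.HOUR_OF_DAY, 0);
        cal.set(java.util.Calendar.MINUTE, 0);
        cal.set(java.util.Calendar.SECOND, 0);
        cal.set(java.util.Calendar.MILLISECOND, 0);
        return cal.getTime();
    }
}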

 On Oct 25, 2:45 pm, Vik vik@gmail.com wrote:
  Hie
 
  thanks a lot!!!
 
  So, how about starting a cron job every day, say at 00:00, and if time
  consumption goes over 30 sec, then scheduling tasks to complete the
  leftovers.
 
  I have another question here: The data model for this is:
 
  BloodDonor_name, blood_group, is_active, active_date
 
  so the query to find the donors to be activated is going to be:
 
  select BloodDonor where is_active == false && active_date == today's date
 
  and the update is going to set such donors to: is_active = true and
  active_date = null.
 
  Any idea how many such entries it is safe to assume will be processed
  within 30 secs?
 
  Thankx and Regards
 
  Vik
  Founderwww.sakshum.comwww.sakshum.blogspot.com
 
  On Mon, Oct 25, 2010 at 5:53 PM, Didier Durand durand.did...@gmail.com
 wrote:
 
   Hi Vik,
 
   I would also go with a cron job in your case.
 
   Concerning the 30s limit, it depends on your requirements in
   computing:
 
   a) you can and should probably schedule, via the API, an independent task
   for each separate donor who needs to be processed, after your query
   (via cron) has told you who needs to be processed today.
 
   b) then, for each individual task, if it lasts more than 30s you
   need to do what I described earlier (i.e. pause the current run after
   storing its context and restart it in a new task)
 
   regards
   didier
 
   On Oct 25, 12:35 pm, Vik vik@gmail.com wrote:
Hie
 
 Hmm, so in my case I always know in advance that I need to activate a
 few blood donors every day which were deactivated yesterday.
 
 So, this qualifies for cron jobs, doesn't it? However, does this 30s
 limit apply here as well?
 
Thankx and Regards
 
Vik
Founderwww.sakshum.comwww.sakshum.blogspot.com
 
[appengine-java] How to increase the number of indexes

2010-10-25 Thread GS
Hi,
   I have currently used 185 out of 200 indexes (billed account). Is there a
way I can increase the maximum number of indexes from 200 to a higher value?

Thanks in advance

George

-- 
You received this message because you are subscribed to the Google Groups 
Google App Engine for Java group.
To post to this group, send email to google-appengine-j...@googlegroups.com.
To unsubscribe from this group, send email to 
google-appengine-java+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/google-appengine-java?hl=en.



[appengine-java] Re: Recurring tasks

2010-10-25 Thread Didier Durand
Hi Vik,

Hard to tell without knowing your app and its size and complexity.

What I would suggest is to read the well-written docs and practice a
little bit (probably less than 10 hours total), and then you can
decide.

I personally found the Objectify verbs very close to those of JDO, so
migrating was more a matter of syntax changes than a full redesign of my code.

regards
didier
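
For a feel of how close the two APIs are, a small side-by-side sketch;
Objectify 2-style calls are assumed, BloodDonor is the hypothetical entity from
this thread, and ObjectifyService.register(BloodDonor.class) is assumed to have
been called once at startup:

import java.util.List;

import javax.jdo.PersistenceManager;

import com.googlecode.objectify.Objectify;
import com.googlecode.objectify.ObjectifyService;

public class MigrationSketch {

    @SuppressWarnings("unchecked")
    public void compare(PersistenceManager pm, BloodDonor donor, long donorId) {
        // JDO, as used elsewhere in this thread:
        pm.makePersistent(donor);                                   // persist
        List<BloodDonor> due =
            (List<BloodDonor>) pm.newQuery(BloodDonor.class, "is_active == false").execute();

        // Roughly the same operations with Objectify:
        Objectify ofy = ObjectifyService.begin();
        ofy.put(donor);                                             // persist
        List<BloodDonor> due2 = ofy.query(BloodDonor.class)
                                   .filter("is_active", false)
                                   .list();                         // query
        BloodDonor one = ofy.get(BloodDonor.class, donorId);        // fetch by id
    }
}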

On Oct 26, 2:02 am, Vik vik@gmail.com wrote:
 Hie

 Ok thanks.. I have heard of Objectify for the first time. Not sure how
 much effort it will take to migrate the current JDO implementation to it. Any
 advice?

 Thankx and Regards

 Vik
 Founderwww.sakshum.comwww.sakshum.blogspot.com


[appengine-java] recommendation on using maxmind geoip

2010-10-25 Thread asianCoolz
May I know what is the preferable way of using MaxMind GeoIP: upload
the binary file with the application, or bulk-upload the CSV files into the
datastore (Bigtable)? Will the former technique slow down application startup time?
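
For what it's worth, the binary-file route usually means shipping the .dat file
inside the WAR and opening it lazily, so the load cost is paid on an instance's
first lookup rather than at deploy time. A rough sketch, assuming the legacy
MaxMind Java API (com.maxmind.geoip.LookupService) and a database file bundled
under WEB-INF; the path and class name are illustrative:

import java.io.IOException;

import com.maxmind.geoip.LookupService;

public class GeoIpLookup {

    // Lazily initialised and reused for the lifetime of the instance.
    private static LookupService service;

    public static synchronized String countryCode(String ip) throws IOException {
        if (service == null) {
            // Read-only file deployed with the application; the path is relative
            // to the application root on App Engine.
            service = new LookupService("WEB-INF/GeoIP.dat", LookupService.GEOIP_MEMORY_CACHE);
        }
        return service.getCountry(ip).getCode();
    }
}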

-- 
You received this message because you are subscribed to the Google Groups 
Google App Engine for Java group.
To post to this group, send email to google-appengine-j...@googlegroups.com.
To unsubscribe from this group, send email to 
google-appengine-java+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/google-appengine-java?hl=en.



[appengine-java] Re: DeadlineExceededException from URLFetchService

2010-10-25 Thread hector
That's correct, I'm using AppEngine to run a service registry that
proxies to other services.  I would like to differentiate between
services that are unavailable and services that are not responding
fast enough.

I've also noticed that if the service I'm calling from GAE is down, I
get an IOException, instead of a 404 in the response... which is what
I would get if I were calling the service from the browser.

On Oct 25, 1:22 pm, Don Schwarz schwa...@google.com wrote:
 DeadlineExceededException isn't appropriate, as that exception
 indicates that your overall request deadline has passed and you need
 to finish what you are doing and return from the request.

 java.io.IOException seems correct for this.  Are you just trying to
 distinguish timeouts from other types of connection errors (e.g. the
 remote server is slow vs. the remote server is down completely) ?  I
 don't know how reliably we can tell the difference.

 -- Don

 On Mon, Oct 25, 2010 at 3:05 PM, hector hrov...@gmail.com wrote:
  Would it be possible to throw DeadlineExceededException from the
  URLFetchService instead of IOException?

  java.io.IOException: Timeout while fetching: https://...
         at com.google.appengine.api.urlfetch.URLFetchServiceImpl.convertApplicationException(URLFetchServiceImpl.java:108)
         at com.google.appengine.api.urlfetch.URLFetchServiceImpl.fetch(URLFetchServiceImpl.java:39)

  Maybe I'm expecting the wrong behavior from the API.


-- 
You received this message because you are subscribed to the Google Groups 
Google App Engine for Java group.
To post to this group, send email to google-appengine-j...@googlegroups.com.
To unsubscribe from this group, send email to 
google-appengine-java+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/google-appengine-java?hl=en.



[appengine-java] Re: SDK/Eclipse pains on Ubuntu

2010-10-25 Thread Ed Murphy
I use Windows and I have the same problem when I upgrade and sometimes
when I create a new workspace. (I tried to use Ubuntu, but can't get my
Adobe Flex plugin and the Google plugin to both play nice.) It seems
the plugin update changes the SDK version, but does not copy the new
jars; it leaves the old ones or sometimes none. My workaround is
basically to make the plugin re-copy the SDK jars into the project. I
do this by going into Preferences and choosing Google. I then change
the SDK to a different version. Sometimes I also need to change the
default SDK, then choose a different one. The idea is to get the plugin
to realize that it must copy in the SDK jars. Usually I have to make it
copy the _old_ jars in, then I can change to the new SDK and have it
copy the new jars in. It works after that.


Also, your workspace may just be corrupt. So, if the above does not
work, create a new workspace and populate it from your source control,
then do the above if necessary. Hope this helps.

-- 
You received this message because you are subscribed to the Google Groups 
Google App Engine for Java group.
To post to this group, send email to google-appengine-j...@googlegroups.com.
To unsubscribe from this group, send email to 
google-appengine-java+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/google-appengine-java?hl=en.