Re: [google-appengine] Re: Handling of multiple tasks at the same time

2016-03-20 Thread Hemanth Kumar
The error says something like the process has already been taken by
another request.


Re: [google-appengine] Re: Handling of multiple tasks at the same time

2016-03-20 Thread Nickolas Daskalou
Why are most of the tasks failing? What is the most common error when a
task fails?



Re: [google-appengine] Re: Handling of multiple tasks at the same time

2016-03-20 Thread Hemanth Kumar
Hi Nick,

My queue configuration looks like this:


<queue-entries>
  <queue>
    <name>appannie-queue</name>
    <rate>60/s</rate>
    <bucket-size>100</bucket-size>
    <max-concurrent-requests>50</max-concurrent-requests>
    <retry-parameters>
      <task-retry-limit>0</task-retry-limit>
    </retry-parameters>
  </queue>
</queue-entries>

But with this configuration, the problem is that most of the tasks are
failing, and the number of records I am getting in BigQuery is far too
low: about 30 lakh (3 million) instead of 240 lakh (24 million). I am not
able to figure out how to resolve this.
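
One thing worth noting about the configuration above: with
<task-retry-limit>0</task-retry-limit>, a task that fails once is never
retried, so its 3,000 records are simply lost. A sketch of the same queue
with a small retry budget — the element names follow the standard
queue.xml schema, and the values are illustrative, not a recommendation
taken from this thread:

<queue-entries>
  <queue>
    <name>appannie-queue</name>
    <rate>60/s</rate>
    <bucket-size>100</bucket-size>
    <max-concurrent-requests>50</max-concurrent-requests>
    <retry-parameters>
      <!-- Retry a failed task a few times, with backoff, before dropping it. -->
      <task-retry-limit>5</task-retry-limit>
      <min-backoff-seconds>1</min-backoff-seconds>
      <max-backoff-seconds>60</max-backoff-seconds>
    </retry-parameters>
  </queue>
</queue-entries>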


Re: [google-appengine] Re: Handling of multiple tasks at the same time

2016-03-20 Thread Nickolas Daskalou
Hi Hemanth,

Is your task queue set up to allow enough concurrency and/or execution rate?

See: https://cloud.google.com/appengine/docs/python/config/queue

Nick




[google-appengine] Re: Handling of multiple tasks at the same time

2016-03-20 Thread Hemanth Kumar
Hi Nick,

This includes a call to the API to fetch the response and then an insert
into BigQuery. For the batch insert I am using TableRow, but I can only
form a batch of 3,000 records at a time before calling BigQuery. As I
wrote earlier, we have split our whole logic into a set of tasks. The
total count is around 5,000 tasks, and each task gives me 3,000 records.
The biggest challenge for me is how to reduce the time it takes to
complete all of these tasks.

Earlier, before breaking the job into multiple tasks, I created only one
task, but executing the whole job took a long time because we have to hit
the REST API almost 5,000 times to get the responses. That was causing
timeout exceptions, so to overcome them we split the whole job into 5,000
tasks. Now it no longer throws timeout exceptions, but executing all of
these tasks still takes a long time.

Please suggest how I can overcome this problem. It would be very helpful.
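
Since the thread never shows the insert code itself, here is a minimal
sketch of a batched streaming insert with the Java API client of that
era, assuming an already-authorized Bigquery service object; the project,
dataset, and table IDs are placeholders:

import com.google.api.services.bigquery.Bigquery;
import com.google.api.services.bigquery.model.TableDataInsertAllRequest;
import com.google.api.services.bigquery.model.TableDataInsertAllResponse;
import com.google.api.services.bigquery.model.TableRow;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class BatchInsert {
    // Sends one batch of TableRows to BigQuery in a single insertAll()
    // call, instead of one HTTP request per row.
    static TableDataInsertAllResponse insertBatch(Bigquery bigquery,
            List<TableRow> batch) throws IOException {
        List<TableDataInsertAllRequest.Rows> rows = new ArrayList<>();
        for (TableRow row : batch) {
            // TableRow is a Map, so it can serve as the JSON payload directly.
            rows.add(new TableDataInsertAllRequest.Rows().setJson(row));
        }
        TableDataInsertAllRequest request =
                new TableDataInsertAllRequest().setRows(rows);
        // "my-project", "my_dataset" and "my_table" are placeholders.
        return bigquery.tabledata()
                .insertAll("my-project", "my_dataset", "my_table", request)
                .execute();
    }
}

At 3,000 rows per call, one insertAll() per task stays well within the
streaming limits Nick quotes below.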



On Saturday, March 19, 2016 at 12:56:50 AM UTC+5:30, Nick (Cloud Platform 
Support) wrote:
>
> Hey Hemanth,
>
> There remain, after your post, some questions as to how you've
> implemented your system. A rate of 47,000 inserts per hour is
> approximately 13 per second. That seems very low: the maximum rate for
> streaming inserts is 100,000 rows per second per table, and 1,000,000
> rows per second per project.
>
> Correct me if I'm wrong, but it looks like this number (47,000 in 1
> hour) reflects the throughput of the entire pipeline, from API call to
> sending the BigQuery insert(), not simply the rate at which inserts
> could theoretically take place.
>
> Ultimately, the best way to insert BigQuery rows is not with isolated
> HTTP requests, which carry a lot of overhead, but with batch inserts. It
> might be worth looking into ways to aggregate the records in a layer
> after the task pipeline but before BigQuery, which would allow you to
> send big batched inserts.
>
> Let me know your thoughts on this, and best wishes,
>
> Nick
> Cloud Platform Community Support 
>
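
One concrete shape for the aggregation layer Nick describes is a pull
queue: the worker tasks enqueue their records as payloads, and a separate
drainer leases a large chunk at a time and sends one batched insert. A
rough sketch under those assumptions — the queue name "bq-buffer" and the
payload encoding are made up for illustration:

import com.google.appengine.api.taskqueue.Queue;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskHandle;
import com.google.appengine.api.taskqueue.TaskOptions;
import java.util.List;
import java.util.concurrent.TimeUnit;

public class BigQueryBuffer {
    // Worker tasks call this instead of hitting BigQuery themselves.
    // "bq-buffer" must be declared with <mode>pull</mode> in queue.xml.
    static void buffer(byte[] recordPayload) {
        Queue queue = QueueFactory.getQueue("bq-buffer");
        queue.add(TaskOptions.Builder
                .withMethod(TaskOptions.Method.PULL)
                .payload(recordPayload));
    }

    // A cron-driven drainer leases up to 500 buffered payloads for 60
    // seconds, writes them to BigQuery as one batch, then deletes them.
    static void drain() {
        Queue queue = QueueFactory.getQueue("bq-buffer");
        List<TaskHandle> leased = queue.leaseTasks(60, TimeUnit.SECONDS, 500);
        // ... decode each payload into a TableRow and send a single
        // batched insertAll(), e.g. the sketch earlier in this thread ...
        for (TaskHandle task : leased) {
            queue.deleteTask(task);
        }
    }
}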
> On Friday, March 18, 2016 at 8:46:59 AM UTC-4, Hemanth Kumar wrote:
>>
>>
>> Hi all,
>>
>> I urgently need some help.
>>
>> I am facing a slowness problem while writing data to BigQuery.
>>
>> My problem is that I have 5,000 tasks, and each task interacts with a
>> REST API. The REST API gives me a JSON response, which I parse, and the
>> same response then has to be written to BigQuery.
>>
>> Every JSON response gives me a list of around 3,000 JSON arrays.
>> If I calculate the total data that will get inserted into Google
>> BigQuery, it is around 1500.
>> Inserting 47,000 records takes 1 hour. If I have to insert lakhs of
>> records, performance suffers badly.
>>
>> Please give a good suggestion to reduce the time this takes, and how I
>> can solve it on Google App Engine.
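
For context, the fan-out of the job onto a push queue is itself
straightforward; a hedged sketch in Java — the /worker URL and the chunk
parameter are illustrative, not taken from the thread:

import com.google.appengine.api.taskqueue.Queue;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskOptions;

public class FanOut {
    // Enqueues one push-queue task per chunk of work; App Engine then
    // POSTs each task to /worker, which calls the REST API for its chunk
    // and writes the resulting rows to BigQuery.
    static void enqueueAll(int taskCount) {
        Queue queue = QueueFactory.getQueue("appannie-queue");
        for (int i = 0; i < taskCount; i++) {
            queue.add(TaskOptions.Builder
                    .withUrl("/worker")
                    .param("chunk", Integer.toString(i)));
        }
    }
}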


[google-appengine] Hi, I have a working website in Python web2py that runs in the Google App Engine Standard Environment and I want to move the site to a Google App Engine VM

2016-03-20 Thread 'Jon Parrott' via Google App Engine
Hi, can you give us more details? Are you getting any error messages? What step 
are you stuck on?



[google-appengine] Hi, I have a working website in Python web2py that runs in the Google App Engine Standard Environment and I want to move the site to a Google App Engine VM

2016-03-20 Thread yar michl
Hi, I have a working website in Python web2py that runs in the Google App
Engine Standard Environment. I want to move the site to a Google App
Engine VM so that I can use the scipy package. Can it be done? I tried to
follow the migrating-an-existing-app guide but was unsuccessful.

Thanks for the help.
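
For reference, in the Managed VMs setup of that period the switch was
made in app.yaml; a minimal sketch, assuming the Python compatibility
runtime (the handler script is a placeholder for web2py's own GAE
handler), with scipy then listed in requirements.txt so it gets
pip-installed on the VM:

# app.yaml (sketch): opts the app into Managed VMs.
runtime: python27
api_version: 1
threadsafe: true
vm: true

handlers:
- url: /.*
  script: gaehandler.wsgiapp   # placeholder: point this at web2py's GAE handler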



[google-appengine] Re: New Cloud Console

2016-03-20 Thread johnP

Glad this issue is getting lots of attention. Another major issue for me
is the Instances viewer. The original console made it easy to hit refresh
to see how many requests each instance had served, and how many errors
each instance had logged.

With the new console, I do not know how to refresh the view. Hitting
refresh does not help. Exiting and re-entering the screen does not help.
Searching for another 30 minutes to see if I missed the refresh button on
the screen does not help (am I missing it after all these hours of
searching? Am I really dyslexic?)

In general, it seems like the original console was built by engineers who
used App Engine, while the new console was built by engineers who were
told to 'make it better' but don't use App Engine at all. Some really
basic stuff is missing.




On Tuesday, February 23, 2016 at 3:46:19 PM UTC-8, johnP wrote:
>
> Just got an email that the old App Engine console will be shut off in 6
> weeks. The new Cloud Console does not have a "migrate traffic" option in
> the Versions screen. This was my absolute favorite new feature in a
> long, long time. Will it be available in the new console?
>



[google-appengine] Re: Help a noob create an Endpoints project in Eclipse

2016-03-20 Thread 'Zeehad (Cloud Platform Support)' via Google App Engine
Hello Jim,

It seems like this tutorial is what you may be looking for.

Cheers!



[google-appengine] Re: App Engine SDK to log in with username and password while deploying

2016-03-20 Thread John Flint
The OAuth tokens are cached in a file named .appcfg_oauth2_tokens in your
profile directory. I just renamed this file and logged on as a different
user, which creates a new token file. Subsequently I just rename
whichever file I want back into place before deploying from the command
line. There is also a --oauth2_access_token= flag for the update command,
but I have not tried setting it.
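
A sketch of that workflow on the command line (the .alice/.bob suffixes
are made up for illustration):

# Stash the token cache of the currently authorized account.
mv ~/.appcfg_oauth2_tokens ~/.appcfg_oauth2_tokens.alice

# With no cached token, the next deploy walks through the OAuth login
# again and writes a fresh ~/.appcfg_oauth2_tokens for the new account.
appcfg.py update myapp/

# Keep that one too, and swap the first account back into place.
mv ~/.appcfg_oauth2_tokens ~/.appcfg_oauth2_tokens.bob
mv ~/.appcfg_oauth2_tokens.alice ~/.appcfg_oauth2_tokens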

On Saturday, 19 March 2016 13:44:57 UTC+11, Dmitry V. wrote:
>
> In the old version of the SDK you had to type your e-mail and password
> each time you uploaded the application. The new version stores your
> credentials, so it is not easy to switch accounts. Each time I have to
> press "clear deployment credentials", log in as another user on Google,
> and so on. Is there a way to bring back the old way of authorizing, or
> to do it from the command line?
>
