I use macOS, but App Engine runs Linux. Maybe that's why the numbers differ? 
My Node.js version, v8.11.2, is the same as on App Engine.

On Wednesday, June 13, 2018 at 4:52:56 PM UTC+4, mr.e...@gmail.com wrote:
>
> Locally I get these values:
> { 
>   rss: 82931712,
>   heapTotal: 45592576,
>   heapUsed: 39920088,
>   external: 43999 
> }
>
> but the deployed version has these values:
>
> { 
>   rss: 125652992, 
>   heapTotal: 52449280, 
>   heapUsed: 35882816, 
>   external: 60351
> } 
>
>
> Locally I tested as follows:
> 1. Build the app (the same step used before deploying).
> 2. Change to the directory containing the built app.
> 3. Run "npm install".
> 4. Run "NODE_ENV=production npm start", which executes "node --optimize_for_size 
> --max_old_space_size=100 --gc_interval=100 index.js". App Engine runs the same 
> command to start the app.
>
> Because of this memory overhead I can't use the smallest instance class, 
> which is limited to 128 MB of RAM. I often see these messages in the logs:
> Exceeded soft private memory limit of 128 MB with 129 MB after servicing 4 
> requests total. Consider setting a larger instance class in app.yaml.
> After handling this request, the process that handled this request was 
> found to be using too much memory and was terminated. This is likely to 
> cause a new process to be used for the next request to your application. If 
> you see this message frequently, you may have a memory leak in your 
> application.
>
> Why does the deployed version have this RAM overhead?
>

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to google-appengine+unsubscr...@googlegroups.com.
To post to this group, send email to google-appengine@googlegroups.com.
Visit this group at https://groups.google.com/group/google-appengine.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/google-appengine/509b6cb3-6768-4972-a6f0-801b5460a67a%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
