Hello Ricardo,

You probably have a default limit set on your installation or project, or a memory limit of 512 MB (which is 536,870,912 bytes) set on your deployment. You could raise the memory limit on the deployment itself; please take a look at the following doc for more information and examples: https://docs.okd.io/latest/dev_guide/compute_resources.html
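As a minimal sketch (assuming your DeploymentConfig is named "cacti", adjust the name and values to your own deployment), you could do something like:

    # Raise the memory request/limit on the deployment config via the CLI
    # (the name "cacti" is only an example)
    oc set resources dc/cacti --requests=memory=512Mi --limits=memory=1Gi

    # The equivalent resources stanza in the container spec would look like:
    # resources:
    #   requests:
    #     memory: "512Mi"
    #   limits:
    #     memory: "1Gi"

If the config change trigger is enabled on the DeploymentConfig, this should roll out a new deployment with the higher limit.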

With regards,
--
Ferry Manders


Ricardo Mendes wrote on 07/06/2019 14:15:

Hi all,

I am fairly new to OpenShift and I’ve been fiddling around with it this past week.

Yesterday I came across the following issue:

While deploying an image of Cacti (an open-source monitoring tool based on RRDtool), I got an exhausted-memory error:


*Fatal error*: Allowed memory size of 536870912 bytes exhausted (tried to allocate 20480 bytes) in */data/www/cacti/lib/database.php* on line *252*

My servers' memory usage is:

Master: 70% in use

Infra: 35% in use

Compute/node: 20% in use

I can also confirm the pod is running on the compute node, and Cacti actually has a low memory footprint, so do I have to further increase the memory available on the masters?

Thanks!

Best regards,

Ricardo Mendes



_______________________________________________
users mailing list
users@lists.openshift.redhat.com
http://lists.openshift.redhat.com/openshiftmm/listinfo/users
