Re: vm.swappiness value for Spark on Kubernetes

2021-02-16 Thread Sean Owen
You probably don't want swapping in any environment. Some tasks will grind to a halt under memory pressure rather than just fail quickly. You would want to simply provision more memory.

On Tue, Feb 16, 2021, 7:57 AM Jahar Tyagi wrote:
> Hi,
>
> We have recently migrated from Spark 2.4.4 to Spark

vm.swappiness value for Spark on Kubernetes

2021-02-16 Thread Jahar Tyagi
Hi,

We have recently migrated from Spark 2.4.4 to Spark 3.0.1, and we use Spark both on virtual machines/bare metal as a standalone deployment and as a Kubernetes deployment. There is a kernel parameter named 'vm.swappiness', and we keep its value at '1' in the standard deployment. Now since we are
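For context, `vm.swappiness` is a standard Linux kernel tunable, and the value `1` mentioned above means "swap only as a last resort without disabling swap entirely." A minimal sketch of how such a value is typically inspected and applied on a node (the file path under `/etc/sysctl.d/` is an illustrative choice, not something mandated by Spark or Kubernetes):

```shell
# Inspect the current value for the running kernel (no root needed).
cat /proc/sys/vm/swappiness

# Apply the value to the running kernel (requires root); takes effect
# immediately but is lost on reboot.
sudo sysctl -w vm.swappiness=1

# Persist the setting across reboots; the filename is a hypothetical
# example, any *.conf file under /etc/sysctl.d/ works.
echo 'vm.swappiness = 1' | sudo tee /etc/sysctl.d/99-swappiness.conf
```

Note that on Kubernetes this is a node-level setting: it must be applied on each worker node (or via a privileged DaemonSet / node image), not inside the Spark executor pods, since containers share the host kernel.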