Nuno Azevedo created SPARK-25552:
------------------------------------

             Summary: Upgrade from Spark 1.6.3 to 2.3.0 seems to make jobs use 
about 50% more memory
                 Key: SPARK-25552
                 URL: https://issues.apache.org/jira/browse/SPARK-25552
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.3.0
         Environment: AWS Kubernetes, Spark Embedded
            Reporter: Nuno Azevedo


After upgrading from Spark 1.6.3 to 2.3.0, our jobs started to need about 50% 
more memory to run.

For instance, we had a job that was running fine on Spark 1.6.3 with 50 GB of 
memory.

!image-2018-09-27-11-00-28-697.png|width=580,height=330!

After upgrading to Spark 2.3.0, the same job failed with an out-of-memory 
error when run with the same 50 GB of memory.

!image-2018-09-27-11-02-52-164.png|width=580,height=265!

Then we kept incrementing the memory until the job was able to run, which it 
finally did with 70 GB.

!image-2018-09-27-11-04-06-484.png|width=580,height=265!

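For reference, this is roughly where the memory setting lives; a minimal sketch assuming the job builds its own SparkSession (the app name and builder shape are illustrative, not our actual code):

{code}
// Illustrative only: raising executor memory when creating the session.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("OurJob") // placeholder name
  .config("spark.executor.memory", "70g") // 50g was enough on Spark 1.6.3
  .getOrCreate()
{code}
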
The Spark upgrade was the only change in our environment. After looking into 
what might be causing this, we noticed that the Kryo serializer is the main 
culprit for the rise in memory consumption.
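
In case it helps with reproduction, these are the Kryo-related settings involved; a minimal sketch showing the stock Spark 2.3.0 defaults, not a recommended fix:

{code}
// Illustrative Kryo configuration (default values shown), not a fix.
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryoserializer.buffer", "64k")     // initial serialization buffer, per core
  .set("spark.kryoserializer.buffer.max", "64m") // upper bound on buffer growth
  .set("spark.kryo.unsafe", "false")             // 2.x option: unsafe-based IO when true
{code}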


