Hello All,

Currently our batch ETL jobs run on Spark 1.6.0, and we are planning to upgrade to Spark 2.1.0.
With minor code changes (such as configuration updates and switching to SparkSession), we were able to run the existing jobs on Spark 2.1.0. However, we noticed that job completion times are much better on Spark 1.6.0 than on Spark 2.1.0. For instance, Job A completed in 50 s on Spark 1.6.0, but with the same input it took 1.5 min on Spark 2.1.0.

Are there any specific factors that need to be considered when switching from Spark 1.6.0 to Spark 2.1.0?

Thanks & Regards,
Gokula Krishnan (Gokul)
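For context, the entry-point change mentioned above is sketched below. This is a minimal illustration, not the actual job code; the application name "JobA" is a placeholder:

```scala
import org.apache.spark.sql.SparkSession

// Spark 1.6.x used separate entry points:
//   val conf = new SparkConf().setAppName("JobA")
//   val sc   = new SparkContext(conf)
//   val sqlContext = new SQLContext(sc)

// Spark 2.x unifies them into a single SparkSession:
val spark = SparkSession.builder()
  .appName("JobA")          // placeholder name
  .getOrCreate()

// The underlying SparkContext is still available for RDD-based code:
val sc = spark.sparkContext
```

When comparing timings across the two versions, it may also be worth confirming that the same executor memory, core counts, and shuffle partition settings (e.g. `spark.sql.shuffle.partitions`) carry over, since defaults and configuration keys changed between the releases.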