> Do I need to recompile my application with 3.2 dependencies or application 
> compiled with 3.0.1 will work fine on 3.2 ?



Yes, you need to recompile your application with the 3.2 dependencies.



And here is how to compile conditionally for Apache Spark 3.1.x and Apache 
Spark >= 3.2.x:



import com.thoughtworks.enableIf
import org.apache.spark.sql.catalyst.analysis.UnresolvedFunction

object XYZ {
  // Kept only when a Spark 3.2.x spark-catalyst jar is on the classpath
  @enableIf(classpathMatches(".*spark-catalyst_2\\.\\d+-3\\.2\\..*".r))
  private def getFuncName(f: UnresolvedFunction): String = {
    // Spark 3.2.x: the function name is exposed as nameParts: Seq[String]
    f.nameParts.last
  }

  // Kept only when a Spark 3.1.x spark-catalyst jar is on the classpath
  @enableIf(classpathMatches(".*spark-catalyst_2\\.\\d+-3\\.1\\..*".r))
  private def getFuncName(f: UnresolvedFunction): String = {
    // Spark 3.1.x: the function name is a FunctionIdentifier
    f.name.funcName
  }
}
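
As a rough usage sketch, both version-specific bodies are reached through the 
single getFuncName entry point. The SQL text, the local SparkSession, and 
exposing getFuncName outside XYZ are assumptions made only for illustration:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.analysis.UnresolvedFunction

val spark = SparkSession.builder().master("local[*]").getOrCreate()
// Parse (without analyzing) a statement so the function reference stays unresolved
val plan = spark.sessionState.sqlParser.parsePlan("SELECT upper(name) FROM people")
// Collect referenced function names via the version-agnostic helper
val funcNames = plan.expressions.flatMap(_.collect {
  case f: UnresolvedFunction => XYZ.getFuncName(f)
})
// funcNames == Seq("upper") on both Spark 3.1.x and 3.2.x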



For more details, see 
https://github.com/ThoughtWorksInc/enableIf.scala#enable-different-code-for-apache-spark-31x-and-32x
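
For completeness, the build wiring this relies on looks roughly like the 
following; the artifact coordinates and the version are assumptions, so check 
the linked README for the exact ones:

// build.sbt sketch (coordinates/version are placeholders; see the enableIf.scala README)
libraryDependencies += "com.thoughtworks.enableIf" %% "enableif" % "<latest version>"

// Macro annotations such as @enableIf must be switched on:
//   Scala 2.13: pass -Ymacro-annotations to scalac
//   Scala 2.12: use the macro paradise compiler plugin instead
scalacOptions += "-Ymacro-annotations"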



---- On Fri, 2022-04-08 01:27:42 Pralabh Kumar <pralabhku...@gmail.com> wrote 
----



Hi Spark community,

I have a quick question. I am planning to migrate from Spark 3.0.1 to Spark 3.2.



Do I need to recompile my application with 3.2 dependencies or application 
compiled with 3.0.1 will work fine on 3.2 ?





Regards

Pralabh kumar
