Oh, I got it. I thought Spark would pick up the local Scala version.
----- Original Message -----
From: Sean Owen <sro...@gmail.com>
To: ckgppl_...@sina.cn
Cc: user <user@spark.apache.org>
Subject: Re: Spark got incorrect scala version while using spark 3.2.1 and spark 3.2.2
Date: August 26, 2022, 21:08

Spark is built with and ships with a copy of Scala. It doesn't use your local 
version.
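
A minimal sketch of what is going on (illustrative; version numbers are examples): scala.util.Properties reads its version from the scala-library jar on the classpath, which under spark-submit or spark-shell is the copy Spark bundles in its jars/ directory, not the Scala installation on the OS.

// Illustrative sketch, not from either message above.
// scala.util.Properties loads /library.properties from the scala-library jar
// that is on the classpath, so under spark-submit or spark-shell it reports
// Spark's bundled Scala, regardless of what is installed on the OS.
object BundledScalaVersion {
  def main(args: Array[String]): Unit = {
    println(scala.util.Properties.versionString)        // e.g. "version 2.13.5" on Spark 3.2.x
    println(scala.util.Properties.versionNumberString)  // e.g. "2.13.5"
    // Where the Scala library classes were actually loaded from; with spark-submit
    // this should point at the scala-library jar under $SPARK_HOME/jars.
    println(scala.Option.getClass.getProtectionDomain.getCodeSource.getLocation)
  }
}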
On Fri, Aug 26, 2022 at 2:55 AM <ckgppl_...@sina.cn> wrote:
Hi all,
I found a strange thing. I am running the Spark 3.2.1 prebuilt distribution in local mode, and my OS Scala version is 2.13.7. But when I run spark-submit and then check the Spark UI, the web page shows that my Scala version is 2.13.5. When I use spark-shell, it also shows that my Scala version is 2.13.5. Then I tried Spark 3.2.2, and it also shows Scala 2.13.5. I checked the code, and it seems that SparkEnv gets the Scala version from "scala.util.Properties.versionString". I am not sure why it shows a different Scala version. Is it a bug or not?
Thanks
Liang
