[jira] [Updated] (SPARK-27911) PySpark Packages should automatically choose correct scala version

2020-03-17 Thread Dongjoon Hyun (Jira)
[ https://issues.apache.org/jira/browse/SPARK-27911?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-27911:
Affects Version/s: (was: 3.0.0) 3.1.0

> PySpark Packages should automatically choose correct scala version

[jira] [Updated] (SPARK-27911) PySpark Packages should automatically choose correct scala version

2019-07-16 Thread Dongjoon Hyun (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-27911?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-27911:
Affects Version/s: (was: 2.4.3) 3.0.0

> PySpark Packages should automatically choose correct scala version

[jira] [Updated] (SPARK-27911) PySpark Packages should automatically choose correct scala version

2019-05-31 Thread Michael Armbrust (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-27911?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-27911:
Description: Today, users of pyspark (and Scala) need to manually specify the version
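The issue the description refers to: Maven coordinates for Scala libraries embed the Scala binary version in the artifact name, so today a user passing `--packages` (or `spark.jars.packages`) must hard-code a suffix like `_2.12` that matches the Scala version their Spark build uses. A minimal sketch of the manual step, using a hypothetical coordinate and an assumed Scala binary version for illustration:

```python
# Sketch only: the coordinate format "group:artifact_<scalaBinaryVersion>:version"
# is what users must assemble by hand today. The Scala binary version below is
# an assumption; it would ideally be detected from the running Spark build
# instead of being hard-coded, which is what this JIRA asks for.
scala_binary_version = "2.12"  # must match the Scala version Spark was built with

# Example coordinate for the Kafka SQL connector (illustrative choice):
coord = f"org.apache.spark:spark-sql-kafka-0-10_{scala_binary_version}:3.0.0"
print(coord)
```

A mismatched suffix (e.g. a `_2.11` artifact on a Scala 2.12 build) typically fails only at runtime with class-loading errors, which is why automatic selection would be an improvement.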