[spark] branch branch-3.5 updated: [SPARK-44510][UI] Update dataTables to 1.13.5 and remove some unreached png files
This is an automated email from the ASF dual-hosted git repository.

yao pushed a commit to branch branch-3.5
in repository https://gitbox.apache.org/repos/asf/spark.git

The following commit(s) were added to refs/heads/branch-3.5 by this push:
     new a8e2977fd9b  [SPARK-44510][UI] Update dataTables to 1.13.5 and remove some unreached png files

a8e2977fd9b is described below

commit a8e2977fd9b6e3224b014e0b0572a4d7b83c1106
Author: Kent Yao
AuthorDate: Sat Jul 22 13:11:04 2023 +0800

    [SPARK-44510][UI] Update dataTables to 1.13.5 and remove some unreached png files

    ### What changes were proposed in this pull request?

    This PR updates dataTables from 1.13.2 to 1.13.5 along with the related license files, and removes some sort-direction images that are no longer referenced.

    FYI, https://cdn.datatables.net/releases.html

    ### Why are the changes needed?

    Updating web resources and cleanup.

    ### Does this PR introduce _any_ user-facing change?

    No.

    ### How was this patch tested?

    Built and tested locally.

    ![image](https://github.com/apache/spark/assets/8326978/87a899d2-a45a-4aba-8c65-ab694919923d)

    Closes #42108 from yaooqinn/SPARK-44510.
Authored-by: Kent Yao
Signed-off-by: Kent Yao
(cherry picked from commit 6409284963d6c5edecc374db4027dee7f6f490c1)
Signed-off-by: Kent Yao
---
 .../ui/static/dataTables.bootstrap4.1.13.2.min.css |   1 -
 .../ui/static/dataTables.bootstrap4.1.13.2.min.js  |   4
 .../ui/static/dataTables.bootstrap4.1.13.5.min.css |   1 +
 .../ui/static/dataTables.bootstrap4.1.13.5.min.js  |   4
 .../org/apache/spark/ui/static/images/sort_asc.png | Bin 160 -> 0 bytes
 .../spark/ui/static/images/sort_asc_disabled.png   | Bin 148 -> 0 bytes
 .../apache/spark/ui/static/images/sort_both.png    | Bin 201 -> 0 bytes
 .../apache/spark/ui/static/images/sort_desc.png    | Bin 158 -> 0 bytes
 .../spark/ui/static/images/sort_desc_disabled.png  | Bin 146 -> 0 bytes
 .../ui/static/jquery.dataTables.1.13.2.min.js      |   4
 .../ui/static/jquery.dataTables.1.13.5.min.js      |   4
 .../main/scala/org/apache/spark/ui/UIUtils.scala   |   6 ++---
 dev/.rat-excludes                                  |   6 ++---
 licenses-binary/LICENSE-datatables.txt             |  25 +
 licenses/LICENSE-datatables.txt                    |  25 +
 15 files changed, 57 insertions(+), 23 deletions(-)

diff --git a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.css b/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.css
deleted file mode 100644
index b9c16ca78a0..000
--- a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.css
+++ /dev/null
@@ -1 +0,0 @@
-:root{--dt-row-selected: 2, 117, 216;--dt-row-selected-text: 255, 255, 255;--dt-row-selected-link: 9, 10, 11}table.dataTable td.dt-control{text-align:center;cursor:pointer}table.dataTable td.dt-control:before{height:1em;width:1em;margin-top:-9px;display:inline-block;color:white;border:.15em solid white;border-radius:1em;box-shadow:0 0 .2em #444;box-sizing:content-box;text-align:center;text-indent:0 !important;font-family:"Courier New",Courier,monospace;line-height:1em;content:"+";backgro [...]
diff --git a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.js b/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.js
deleted file mode 100644
index 2937bc3c90c..000
--- a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.js
+++ /dev/null
@@ -1,4 +0,0 @@
-/*! DataTables Bootstrap 4 integration
- * ©2011-2017 SpryMedia Ltd - datatables.net/license
- */
-!function(t){"function"==typeof define&&define.amd?define(["jquery","datatables.net"],function(e){return t(e,window,document)}):"object"==typeof exports?module.exports=function(e,a){return e=e||window,(a=a||("undefined"!=typeof window?require("jquery"):require("jquery")(e))).fn.dataTable||require("datatables.net")(e,a),t(a,0,e.document)}:t(jQuery,window,document)}(function(x,e,n,r){"use strict";var s=x.fn.dataTable;return x.extend(!0,s.defaults,{dom:"<'row'<'col-sm-12 col-md-6'l><'col-sm [...]
\ No newline at end of file
diff --git a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.5.min.css b/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.5.min.css
new file mode 100644
index 000..6db36f6e75d
--- /dev/null
+++ b/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.5.min.css
@@ -0,0 +1 @@
+:root{--dt-row-selected: 2, 117, 216;--dt-row-selected-text: 255, 255, 255;--dt-row-selected-link: 9, 10, 11;--dt-row-stripe: 0, 0, 0;--dt-row-hover: 0, 0, 0;--dt-column-ordering: 0, 0, 0;--dt-html-background: white}:root.dark{--dt-html-background: rgb(33, 37, 41)}t
[spark] branch master updated: [SPARK-44510][UI] Update dataTables to 1.13.5 and remove some unreached png files
This is an automated email from the ASF dual-hosted git repository.

yao pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git

The following commit(s) were added to refs/heads/master by this push:
     new 6409284963d  [SPARK-44510][UI] Update dataTables to 1.13.5 and remove some unreached png files

6409284963d is described below

commit 6409284963d6c5edecc374db4027dee7f6f490c1
Author: Kent Yao
AuthorDate: Sat Jul 22 13:11:04 2023 +0800

    [SPARK-44510][UI] Update dataTables to 1.13.5 and remove some unreached png files

    ### What changes were proposed in this pull request?

    This PR updates dataTables from 1.13.2 to 1.13.5 along with the related license files, and removes some sort-direction images that are no longer referenced.

    FYI, https://cdn.datatables.net/releases.html

    ### Why are the changes needed?

    Updating web resources and cleanup.

    ### Does this PR introduce _any_ user-facing change?

    No.

    ### How was this patch tested?

    Built and tested locally.

    ![image](https://github.com/apache/spark/assets/8326978/87a899d2-a45a-4aba-8c65-ab694919923d)

    Closes #42108 from yaooqinn/SPARK-44510.
Authored-by: Kent Yao
Signed-off-by: Kent Yao
---
 .../ui/static/dataTables.bootstrap4.1.13.2.min.css |   1 -
 .../ui/static/dataTables.bootstrap4.1.13.2.min.js  |   4
 .../ui/static/dataTables.bootstrap4.1.13.5.min.css |   1 +
 .../ui/static/dataTables.bootstrap4.1.13.5.min.js  |   4
 .../org/apache/spark/ui/static/images/sort_asc.png | Bin 160 -> 0 bytes
 .../spark/ui/static/images/sort_asc_disabled.png   | Bin 148 -> 0 bytes
 .../apache/spark/ui/static/images/sort_both.png    | Bin 201 -> 0 bytes
 .../apache/spark/ui/static/images/sort_desc.png    | Bin 158 -> 0 bytes
 .../spark/ui/static/images/sort_desc_disabled.png  | Bin 146 -> 0 bytes
 .../ui/static/jquery.dataTables.1.13.2.min.js      |   4
 .../ui/static/jquery.dataTables.1.13.5.min.js      |   4
 .../main/scala/org/apache/spark/ui/UIUtils.scala   |   6 ++---
 dev/.rat-excludes                                  |   6 ++---
 licenses-binary/LICENSE-datatables.txt             |  25 +
 licenses/LICENSE-datatables.txt                    |  25 +
 15 files changed, 57 insertions(+), 23 deletions(-)

diff --git a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.css b/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.css
deleted file mode 100644
index b9c16ca78a0..000
--- a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.css
+++ /dev/null
@@ -1 +0,0 @@
-:root{--dt-row-selected: 2, 117, 216;--dt-row-selected-text: 255, 255, 255;--dt-row-selected-link: 9, 10, 11}table.dataTable td.dt-control{text-align:center;cursor:pointer}table.dataTable td.dt-control:before{height:1em;width:1em;margin-top:-9px;display:inline-block;color:white;border:.15em solid white;border-radius:1em;box-shadow:0 0 .2em #444;box-sizing:content-box;text-align:center;text-indent:0 !important;font-family:"Courier New",Courier,monospace;line-height:1em;content:"+";backgro [...]
diff --git a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.js b/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.js
deleted file mode 100644
index 2937bc3c90c..000
--- a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.2.min.js
+++ /dev/null
@@ -1,4 +0,0 @@
-/*! DataTables Bootstrap 4 integration
- * ©2011-2017 SpryMedia Ltd - datatables.net/license
- */
-!function(t){"function"==typeof define&&define.amd?define(["jquery","datatables.net"],function(e){return t(e,window,document)}):"object"==typeof exports?module.exports=function(e,a){return e=e||window,(a=a||("undefined"!=typeof window?require("jquery"):require("jquery")(e))).fn.dataTable||require("datatables.net")(e,a),t(a,0,e.document)}:t(jQuery,window,document)}(function(x,e,n,r){"use strict";var s=x.fn.dataTable;return x.extend(!0,s.defaults,{dom:"<'row'<'col-sm-12 col-md-6'l><'col-sm [...]
\ No newline at end of file
diff --git a/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.5.min.css b/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.5.min.css
new file mode 100644
index 000..6db36f6e75d
--- /dev/null
+++ b/core/src/main/resources/org/apache/spark/ui/static/dataTables.bootstrap4.1.13.5.min.css
@@ -0,0 +1 @@
+:root{--dt-row-selected: 2, 117, 216;--dt-row-selected-text: 255, 255, 255;--dt-row-selected-link: 9, 10, 11;--dt-row-stripe: 0, 0, 0;--dt-row-hover: 0, 0, 0;--dt-column-ordering: 0, 0, 0;--dt-html-background: white}:root.dark{--dt-html-background: rgb(33, 37, 41)}table.dataTable td.dt-control{text-align:center;cursor:pointer}table.dataTable td.dt-control:before{display:i
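The minified integration file replaced above opens with a UMD-style loader shim, so the same file works under an AMD loader, under CommonJS/Node, and as a plain browser `<script>`. Here is a simplified sketch of that dispatch pattern (illustrative only — the real file passes `jQuery`, `window`, and `document` through to the factory):

```javascript
// Simplified UMD-style dispatch, as used by dataTables.bootstrap4.*.min.js:
// detect whichever module system is present and hand the factory to it.
function umd(factory) {
  if (typeof define === "function" && define.amd) {
    // AMD loader (e.g. RequireJS): declare dependencies explicitly.
    define(["jquery"], factory);
    return undefined;
  }
  if (typeof module === "object" && module.exports) {
    // CommonJS / Node: export the factory's result.
    module.exports = factory();
    return module.exports;
  }
  // Plain browser <script> tag: rely on globals already being loaded.
  return factory();
}

const api = umd(function () {
  // The real integration receives jQuery here and extends $.fn.dataTable
  // defaults with Bootstrap 4 markup; this stub just returns a marker.
  return { version: "1.13.5" };
});

console.log(api.version); // prints "1.13.5" under Node (CommonJS branch)
```

Run under Node, the CommonJS branch fires; in a browser without a module loader, the final branch runs the factory against globals.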
[spark] branch master updated: [SPARK-44341][SQL][PYTHON][FOLLOWUP] Move the base trait WindowEvaluatorFactoryBase to a single file
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git

The following commit(s) were added to refs/heads/master by this push:
     new bdeae870674  [SPARK-44341][SQL][PYTHON][FOLLOWUP] Move the base trait WindowEvaluatorFactoryBase to a single file

bdeae870674 is described below

commit bdeae87067452bb41f4776c4ab444a9d9645fdfc
Author: Jiaan Geng
AuthorDate: Fri Jul 21 21:21:15 2023 +0800

    [SPARK-44341][SQL][PYTHON][FOLLOWUP] Move the base trait WindowEvaluatorFactoryBase to a single file

    ### What changes were proposed in this pull request?

    https://github.com/apache/spark/pull/41939 defined the computing logic through the PartitionEvaluator API and used it in `WindowExec` and `WindowInPandasExec`. Following the comment at https://github.com/apache/spark/pull/41939#discussion_r1270194752, this PR moves the base trait `WindowEvaluatorFactoryBase` to a single file.

    ### Why are the changes needed?

    Improve the code.

    ### Does this PR introduce _any_ user-facing change?

    No. Just updates the inner implementation.

    ### How was this patch tested?

    N/A

    Closes #42106 from beliefer/SPARK-44341_followup.
Authored-by: Jiaan Geng
Signed-off-by: Wenchen Fan
---
 .../execution/window/WindowEvaluatorFactory.scala  | 268 +
 ...tory.scala => WindowEvaluatorFactoryBase.scala} | 130 +-
 2 files changed, 2 insertions(+), 396 deletions(-)

diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/window/WindowEvaluatorFactory.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/window/WindowEvaluatorFactory.scala
index 913f8762c79..fb4ea7f35c0 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/window/WindowEvaluatorFactory.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/window/WindowEvaluatorFactory.scala
@@ -17,278 +17,12 @@
 package org.apache.spark.sql.execution.window

-import scala.collection.mutable
-import scala.collection.mutable.ArrayBuffer
-
 import org.apache.spark.{PartitionEvaluator, PartitionEvaluatorFactory}
 import org.apache.spark.sql.catalyst.InternalRow
-import org.apache.spark.sql.catalyst.expressions.{Add, AggregateWindowFunction, Ascending, Attribute, BoundReference, CurrentRow, DateAdd, DateAddYMInterval, DecimalAddNoOverflowCheck, Descending, Expression, FrameLessOffsetWindowFunction, FrameType, IdentityProjection, IntegerLiteral, JoinedRow, MutableProjection, NamedExpression, OffsetWindowFunction, PythonFuncExpression, RangeFrame, RowFrame, RowOrdering, SortOrder, SpecificInternalRow, SpecifiedWindowFrame, TimeAdd, TimestampAddYMIn [...]
-import org.apache.spark.sql.catalyst.expressions.aggregate.AggregateExpression
+import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression, JoinedRow, NamedExpression, SortOrder, SpecificInternalRow, UnsafeProjection, UnsafeRow}
 import org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray
 import org.apache.spark.sql.execution.metric.SQLMetric
 import org.apache.spark.sql.internal.SQLConf
-import org.apache.spark.sql.types.{CalendarIntervalType, DateType, DayTimeIntervalType, DecimalType, IntegerType, TimestampNTZType, TimestampType, YearMonthIntervalType}
-import org.apache.spark.util.collection.Utils
-
-trait WindowEvaluatorFactoryBase {
-  def windowExpression: Seq[NamedExpression]
-  def partitionSpec: Seq[Expression]
-  def orderSpec: Seq[SortOrder]
-  def childOutput: Seq[Attribute]
-  def spillSize: SQLMetric
-
-  /**
-   * Create the resulting projection.
-   *
-   * This method uses Code Generation. It can only be used on the executor side.
-   *
-   * @param expressions unbound ordered function expressions.
-   * @return the final resulting projection.
-   */
-  protected def createResultProjection(expressions: Seq[Expression]): UnsafeProjection = {
-    val references = expressions.zipWithIndex.map { case (e, i) =>
-      // Results of window expressions will be on the right side of child's output
-      BoundReference(childOutput.size + i, e.dataType, e.nullable)
-    }
-    val unboundToRefMap = Utils.toMap(expressions, references)
-    val patchedWindowExpression = windowExpression.map(_.transform(unboundToRefMap))
-    UnsafeProjection.create(
-      childOutput ++ patchedWindowExpression,
-      childOutput)
-  }
-
-  /**
-   * Create a bound ordering object for a given frame type and offset. A bound ordering object is
-   * used to determine which input row lies within the frame boundaries of an output row.
-   *
-   * This method uses Code Generation. It can only be used on the executor side.
-   *
-   * @param frame to evaluate. This can either be a Row or Range frame.
-   * @param bound with respect to the row.
-   * @param timeZone the session local timezone for time related calculations.
-   * @return a bound orderi
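The change is a pure code move: the JVM window evaluator and the pandas-based one share this trait, so keeping it in its own file separates the shared contract from either implementation. The pattern can be sketched as follows (illustrative only — placeholder `Seq[String]` types stand in for Spark's `Expression`/`SortOrder`, and the pandas-side factory name is an assumption, not Spark's exact signature):

```scala
// WindowEvaluatorFactoryBase.scala — the shared contract lives in its own file.
trait WindowEvaluatorFactoryBase {
  def windowExpression: Seq[String]
  def partitionSpec: Seq[String]
  def orderSpec: Seq[String]

  // A shared helper that every concrete factory inherits unchanged.
  def describe: String =
    s"partitions=[${partitionSpec.mkString(", ")}], order=[${orderSpec.mkString(", ")}]"
}

// WindowEvaluatorFactory.scala — the JVM implementation mixes the trait in ...
class WindowEvaluatorFactory(
    val windowExpression: Seq[String],
    val partitionSpec: Seq[String],
    val orderSpec: Seq[String]) extends WindowEvaluatorFactoryBase

// ... and the pandas-side factory (in its own file) would do the same,
// reusing describe and the rest of the base trait without duplication.
```

For example, `new WindowEvaluatorFactory(Nil, Seq("a"), Seq("b")).describe` yields `partitions=[a], order=[b]` from the inherited helper.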
[GitHub] [spark-website] HyukjinKwon commented on pull request #468: Fix the link of developer-tools docs
HyukjinKwon commented on PR #468:
URL: https://github.com/apache/spark-website/pull/468#issuecomment-1645395205

   Merged to asf-site.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
[GitHub] [spark-website] HyukjinKwon closed pull request #468: Fix the link of developer-tools docs
HyukjinKwon closed pull request #468: Fix the link of developer-tools docs
URL: https://github.com/apache/spark-website/pull/468
[spark-website] branch asf-site updated: Fix the link of developer-tools docs
This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git

The following commit(s) were added to refs/heads/asf-site by this push:
     new 52374d27b1  Fix the link of developer-tools docs

52374d27b1 is described below

commit 52374d27b1d7c67047310983a20c4ccb2296d141
Author: panbingkun
AuthorDate: Fri Jul 21 19:56:33 2023 +0900

    Fix the link of developer-tools docs

    The pr aims to update some outdated links in `developer-tools` docs, including:
    - https://www.scala-sbt.org/0.13/docs/Testing.html -> https://www.scala-sbt.org/1.x/docs/Testing.html
      Because Spark's dependency on the sbt version was upgraded to 1.x a long time ago.
    - https://github.com/typesafehub/migration-manager -> https://github.com/lightbend/mima
      Avoids a redirect.
    - http://www.yourkit.com/docs/80/help/agent.jsp -> https://www.yourkit.com/docs/java/help/agent.jsp
      Dead link.

    Author: panbingkun

    Closes #468 from panbingkun/update_doc.
---
 developer-tools.md        | 6 +++---
 site/developer-tools.html | 6 +++---
 2 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/developer-tools.md b/developer-tools.md
index fdbf88796e..e0a1844ae7 100644
--- a/developer-tools.md
+++ b/developer-tools.md
@@ -120,7 +120,7 @@ If you'd prefer, you can run all of these commands on the command line (but this
 $ build/sbt "core/testOnly *DAGSchedulerSuite -- -z SPARK-12345"
 ```

-For more about how to run individual tests with sbt, see the [sbt documentation](https://www.scala-sbt.org/0.13/docs/Testing.html).
+For more about how to run individual tests with sbt, see the [sbt documentation](https://www.scala-sbt.org/1.x/docs/Testing.html).

 Testing with Maven

@@ -242,7 +242,7 @@ build/sbt "testOnly org.apache.spark.rdd.SortingSuite"

 Binary compatibility

-To ensure binary compatibility, Spark uses [MiMa](https://github.com/typesafehub/migration-manager).
+To ensure binary compatibility, Spark uses [MiMa](https://github.com/lightbend/mima).

 Ensuring binary compatibility

@@ -548,4 +548,4 @@ Please see the full YourKit documentation for the full list of profiler agent
 When running Spark tests through SBT, add `javaOptions in Test += "-agentpath:/path/to/yjp"`
 to `SparkBuild.scala` to launch the tests with the YourKit profiler agent enabled.
 The platform-specific paths to the profiler agents are listed in the
-<a href="http://www.yourkit.com/docs/80/help/agent.jsp">YourKit documentation</a>.
+<a href="https://www.yourkit.com/docs/java/help/agent.jsp">YourKit documentation</a>.

diff --git a/site/developer-tools.html b/site/developer-tools.html
index 4f178fc922..a43786ff91 100644
--- a/site/developer-tools.html
+++ b/site/developer-tools.html
@@ -233,7 +233,7 @@ $ build/mvn package -DskipTests -pl core
 $ build/sbt "core/testOnly *DAGSchedulerSuite -- -z SPARK-12345"

-For more about how to run individual tests with sbt, see the <a href="https://www.scala-sbt.org/0.13/docs/Testing.html">sbt documentation</a>.
+For more about how to run individual tests with sbt, see the <a href="https://www.scala-sbt.org/1.x/docs/Testing.html">sbt documentation</a>.

 Testing with Maven

@@ -352,7 +352,7 @@ sufficient to run a test from the command line:

 Binary compatibility

-To ensure binary compatibility, Spark uses <a href="https://github.com/typesafehub/migration-manager">MiMa</a>.
+To ensure binary compatibility, Spark uses <a href="https://github.com/lightbend/mima">MiMa</a>.

 Ensuring binary compatibility

@@ -655,7 +655,7 @@ cluster with the same name, your security group settings will be re-used.
 When running Spark tests through SBT, add <code>javaOptions in Test += "-agentpath:/path/to/yjp"</code>
 to <code>SparkBuild.scala</code> to launch the tests with the YourKit profiler agent enabled.
 The platform-specific paths to the profiler agents are listed in the
-<a href="http://www.yourkit.com/docs/80/help/agent.jsp">YourKit documentation</a>.
+<a href="https://www.yourkit.com/docs/java/help/agent.jsp">YourKit documentation</a>.
[GitHub] [spark-website] ulysses-you commented on pull request #467: Add Xiduo You to committers
ulysses-you commented on PR #467:
URL: https://github.com/apache/spark-website/pull/467#issuecomment-1645240873

   thank you @LuciferYang !
[GitHub] [spark-website] LuciferYang commented on pull request #468: Fix the link of developer-tools docs
LuciferYang commented on PR #468:
URL: https://github.com/apache/spark-website/pull/468#issuecomment-1645221127

   hmm... @panbingkun need update corresponding html file ...
[GitHub] [spark-website] panbingkun opened a new pull request, #468: Fix the link of developer-tools docs
panbingkun opened a new pull request, #468:
URL: https://github.com/apache/spark-website/pull/468

   The pr aims to update some outdated links in `developer-tools` docs, include:
[GitHub] [spark-website] LuciferYang commented on pull request #467: Add Xiduo You to committers
LuciferYang commented on PR #467:
URL: https://github.com/apache/spark-website/pull/467#issuecomment-1645181971

   Congrats ~
[spark] branch master updated: [SPARK-44506][BUILD] Upgrade mima-core & sbt-mima-plugin to 1.1.3
This is an automated email from the ASF dual-hosted git repository.

yangjie01 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git

The following commit(s) were added to refs/heads/master by this push:
     new e6649ebfd44  [SPARK-44506][BUILD] Upgrade mima-core & sbt-mima-plugin to 1.1.3

e6649ebfd44 is described below

commit e6649ebfd44f824ceabfcb472392e0bfca2d088f
Author: panbingkun
AuthorDate: Fri Jul 21 15:11:05 2023 +0800

    [SPARK-44506][BUILD] Upgrade mima-core & sbt-mima-plugin to 1.1.3

    ### What changes were proposed in this pull request?

    The pr aims to upgrade `mima-core` & `sbt-mima-plugin` from 1.1.2 to 1.1.3.

    ### Why are the changes needed?

    - Release note: https://github.com/lightbend/mima/releases/tag/1.1.3
    - This version adapts to scala-2.12.18 & scala-2.13.11, which are also used by Spark currently:
      - Update scala-library, scala-reflect to 2.12.18 by scala-steward in https://github.com/lightbend/mima/pull/764
      - Update scala-library, scala-reflect to 2.13.11 by scala-steward in https://github.com/lightbend/mima/pull/765

    ### Does this PR introduce _any_ user-facing change?

    No.

    ### How was this patch tested?

    Pass GA.

    Closes #42102 from panbingkun/SPARK-44506.
Authored-by: panbingkun
Signed-off-by: yangjie01
---
 pom.xml             | 2 +-
 project/plugins.sbt | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/pom.xml b/pom.xml
index cf8a1cf5178..cb8e8864462 100644
--- a/pom.xml
+++ b/pom.xml
@@ -289,7 +289,7 @@
     32.0.1-jre
     1.0.1
     1.56.0
-    1.1.2
+    1.1.3
     6.0.53
     128m

diff --git a/project/plugins.sbt b/project/plugins.sbt
index ce4f7afb743..ba52a201e66 100644
--- a/project/plugins.sbt
+++ b/project/plugins.sbt
@@ -31,7 +31,7 @@ addSbtPlugin("com.github.sbt" % "sbt-eclipse" % "6.0.0")

 addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "1.0.0")

-addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "1.1.2")
+addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "1.1.3")

 addSbtPlugin("com.github.sbt" % "sbt-unidoc" % "0.5.0")