wuyi created SPARK-32984:
Summary: Improve showing the differences between approved and
actual plans
Key: SPARK-32984
URL: https://issues.apache.org/jira/browse/SPARK-32984
Project: Spark
Issue Type
[
https://issues.apache.org/jira/browse/SPARK-32937?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17199125#comment-17199125
]
wuyi commented on SPARK-32937:
--
I'm looking at it. Thanks for reporting!
> DecomissionSuit
[
https://issues.apache.org/jira/browse/SPARK-32913?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32913:
-
Summary: Improve ExecutorDecommissionInfo and ExecutorDecommissionState for
different use cases (was: Improve D
wuyi created SPARK-32913:
Summary: Improve DecommissionInfo and DecommissionState for
different use cases
Key: SPARK-32913
URL: https://issues.apache.org/jira/browse/SPARK-32913
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-32898?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17197341#comment-17197341
]
wuyi commented on SPARK-32898:
--
I think the issue is (for executorRunTimeMs): before a task
[
https://issues.apache.org/jira/browse/SPARK-32898?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32898:
-
Description:
This might be because of incorrectly calculating executorRunTimeMs in
Executor.scala
The function
wuyi created SPARK-32878:
Summary: Avoid scheduling TaskSetManager which has no pending tasks
Key: SPARK-32878
URL: https://issues.apache.org/jira/browse/SPARK-32878
Project: Spark
Issue Type: Improv
wuyi created SPARK-32857:
Summary: Flaky o.a.s.scheduler.BarrierTaskContextSuite.throw
exception if the number of barrier() calls are not the same on every task
Key: SPARK-32857
URL: https://issues.apache.org/jira/browse/
wuyi created SPARK-32850:
Summary: Simplify the RPC message flow of decommission
Key: SPARK-32850
URL: https://issues.apache.org/jira/browse/SPARK-32850
Project: Spark
Issue Type: Sub-task
Co
wuyi created SPARK-32736:
Summary: Avoid caching the removed decommissioned executors in
TaskSchedulerImpl
Key: SPARK-32736
URL: https://issues.apache.org/jira/browse/SPARK-32736
Project: Spark
Issu
wuyi created SPARK-32717:
Summary: Add an AQEOptimizer for AdaptiveSparkPlanExec
Key: SPARK-32717
URL: https://issues.apache.org/jira/browse/SPARK-32717
Project: Spark
Issue Type: Improvement
wuyi created SPARK-32653:
Summary: Decommissioned host/executor should be considered as
inactive in TaskSchedulerImpl
Key: SPARK-32653
URL: https://issues.apache.org/jira/browse/SPARK-32653
Project: Spark
wuyi created SPARK-32651:
Summary: decommission switch configuration should have the highest
hierarchy
Key: SPARK-32651
URL: https://issues.apache.org/jira/browse/SPARK-32651
Project: Spark
Issue Ty
[
https://issues.apache.org/jira/browse/SPARK-32616?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32616:
-
Issue Type: Bug (was: Improvement)
> Window operators should be added deterministically
> --
wuyi created SPARK-32616:
Summary: Window operators should be added deterministically
Key: SPARK-32616
URL: https://issues.apache.org/jira/browse/SPARK-32616
Project: Spark
Issue Type: Improvement
[
https://issues.apache.org/jira/browse/SPARK-32600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32600:
-
Summary: Unify task name in some logs between driver and executor (was:
Unify task name in some places between
wuyi created SPARK-32600:
Summary: Unify task name in some places between driver and executor
Key: SPARK-32600
URL: https://issues.apache.org/jira/browse/SPARK-32600
Project: Spark
Issue Type: Improv
wuyi created SPARK-32518:
Summary: CoarseGrainedSchedulerBackend.maxNumConcurrentTasks
should consider all kinds of resources
Key: SPARK-32518
URL: https://issues.apache.org/jira/browse/SPARK-32518
Project: S
wuyi created SPARK-32466:
Summary: Add support to catch SparkPlan regression based on TPC-DS
queries
Key: SPARK-32466
URL: https://issues.apache.org/jira/browse/SPARK-32466
Project: Spark
Issue Type:
wuyi created SPARK-32459:
Summary: UDF regression of WrappedArray support caused by
SPARK-31826
Key: SPARK-32459
URL: https://issues.apache.org/jira/browse/SPARK-32459
Project: Spark
Issue Type:
wuyi created SPARK-32372:
Summary: "Resolved attribute(s) XXX missing" after dedup conflict
references
Key: SPARK-32372
URL: https://issues.apache.org/jira/browse/SPARK-32372
Project: Spark
Issue Ty
[
https://issues.apache.org/jira/browse/SPARK-32307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17159020#comment-17159020
]
wuyi commented on SPARK-32307:
--
Hi [~dongjoon] Please see my response here
https://github.
[
https://issues.apache.org/jira/browse/SPARK-32307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32307:
-
Description:
{code:java}
spark.udf.register("key", udf((m: Map[String, String]) => m.keys.head.toInt))
Seq(Map("
wuyi created SPARK-32307:
Summary: Aggregation that uses a map type input UDF as a group
expression can fail
Key: SPARK-32307
URL: https://issues.apache.org/jira/browse/SPARK-32307
Project: Spark
Issue Ty
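The truncated {code:java} snippet above registers a UDF that extracts an int key from a map, presumably for use as the group expression. A plain-Python analogue of that repro's intent (the rows and names below are illustrative sample data, not from the ticket):

```python
from collections import defaultdict

# Plain-Python analogue of the truncated Scala repro: a "UDF" that pulls an
# int key out of a map, used as the grouping expression. The rows below are
# made-up sample data, not from the ticket.
def key_udf(m):
    return int(next(iter(m)))  # analogous to m.keys.head.toInt

rows = [{"1": "one"}, {"1": "uno"}, {"2": "two"}]
counts = defaultdict(int)
for m in rows:
    counts[key_udf(m)] += 1  # group by the extracted scalar, not the map itself
```

In Spark the grouping happens on the UDF's result; the ticket reports that this aggregation path could fail when the UDF's input column is a map type.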
[
https://issues.apache.org/jira/browse/SPARK-32287?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32287:
-
Component/s: Tests
> Flaky test: ExecutorAllocationManagerSuite.add executors default profile
>
[
https://issues.apache.org/jira/browse/SPARK-32287?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17156441#comment-17156441
]
wuyi commented on SPARK-32287:
--
cc [~tgraves] [~hyukjin.kwon]
> Flaky test: ExecutorAlloca
[
https://issues.apache.org/jira/browse/SPARK-32287?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32287:
-
Description:
This test becomes flaky in GitHub Actions, see:
https://github.com/apache/spark/pull/29072/checks
wuyi created SPARK-32287:
Summary: Flaky test: ExecutorAllocationManagerSuite.add executors
default profile
Key: SPARK-32287
URL: https://issues.apache.org/jira/browse/SPARK-32287
Project: Spark
Iss
[
https://issues.apache.org/jira/browse/SPARK-32250?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17155201#comment-17155201
]
wuyi commented on SPARK-32250:
--
[~hyukjin.kwon] Thanks for pinging me. I'll take a look later.
[
https://issues.apache.org/jira/browse/SPARK-32238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32238:
-
Issue Type: Bug (was: Improvement)
> Use Utils.getSimpleName to avoid hitting Malformed class name in ScalaUDF
wuyi created SPARK-32238:
Summary: "Malformed class name" error from ScalaUDF
Key: SPARK-32238
URL: https://issues.apache.org/jira/browse/SPARK-32238
Project: Spark
Issue Type: Improvement
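The subject quoted above points to Utils.getSimpleName as the fix: Class.getSimpleName can throw java.lang.InternalError("Malformed class name") for deeply nested, compiler-generated Scala classes, so a safe variant derives the name from the fully-qualified one instead. A hedged Python sketch of that parsing idea (an illustration, not Spark's actual implementation; the input strings are hypothetical):

```python
def simple_name(qualified: str) -> str:
    """Derive a display name from a fully-qualified JVM class name by
    stripping the package and compiler-generated '$' suffixes.

    A sketch of the idea behind Spark's Utils.getSimpleName fallback,
    not its actual code; the example inputs below are hypothetical."""
    last = qualified.split(".")[-1]
    parts = [p for p in last.split("$") if p and not p.isdigit()]
    return parts[-1] if parts else last

# e.g. simple_name("java.lang.String") -> "String"
#      simple_name("com.example.Outer$Udf$1") -> "Udf"
```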
[
https://issues.apache.org/jira/browse/SPARK-32120?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17153579#comment-17153579
]
wuyi commented on SPARK-32120:
--
oh yeah, we've documented it. Thanks for the reminder.
> Sing
[
https://issues.apache.org/jira/browse/SPARK-32120?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17151555#comment-17151555
]
wuyi commented on SPARK-32120:
--
We've removed coordination logic in SPARK-30969. It should
[
https://issues.apache.org/jira/browse/SPARK-32120?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17151550#comment-17151550
]
wuyi commented on SPARK-32120:
--
[~EnricoMi] Thanks for reporting! I am looking into this.
[
https://issues.apache.org/jira/browse/SPARK-32154?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32154:
-
Summary: Use ExpressionEncoder for the return type of ScalaUDF to convert
to catalyst type (was: Use Expression
[
https://issues.apache.org/jira/browse/SPARK-32154?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32154:
-
Summary: Use ExpressionEncoder for the return type of ScalaUDF to serialize
to catalyst type (was: Use Expressi
wuyi created SPARK-32154:
Summary: Use ExpressionEncoder to serialize to catalyst type for
the return type of ScalaUDF
Key: SPARK-32154
URL: https://issues.apache.org/jira/browse/SPARK-32154
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-32091?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32091:
-
Description:
When removing blocks (e.g. RDD, broadcast, shuffle), BlockManagerMasterEndpoint
will make RPC calls
wuyi created SPARK-32091:
Summary: Ignore timeout error when removing blocks on the lost
executor
Key: SPARK-32091
URL: https://issues.apache.org/jira/browse/SPARK-32091
Project: Spark
Issue Type: Imp
wuyi created SPARK-32090:
Summary: UserDefinedType.equals() is not symmetric
Key: SPARK-32090
URL: https://issues.apache.org/jira/browse/SPARK-32090
Project: Spark
Issue Type: Improvement
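Symmetry here refers to the general equals contract: a.equals(b) must imply b.equals(a). A minimal Python illustration of how an isinstance-based check breaks that contract across a subclass (class names are hypothetical, and this mirrors JVM-style equals methods rather than Python's == operator, which resolves subclass operands differently):

```python
class UserType:
    def equals(self, other):
        # Accepts any UserType, including subclasses.
        return isinstance(other, UserType)

class ExtendedUserType(UserType):
    def equals(self, other):
        # Only accepts the subclass, so symmetry with UserType breaks.
        return isinstance(other, ExtendedUserType)

a, b = UserType(), ExtendedUserType()
# a.equals(b) is True but b.equals(a) is False: the contract is violated.
```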
[
https://issues.apache.org/jira/browse/SPARK-32087?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-32087:
-
Summary: Allow UserDefinedType to use encoder to deserialize rows in
ScalaUDF as well (was: Allow UserDenfinedT
wuyi created SPARK-32087:
Summary: Allow UserDenfinedType to use encoder to deserialize rows
in ScalaUDF as well
Key: SPARK-32087
URL: https://issues.apache.org/jira/browse/SPARK-32087
Project: Spark
wuyi created SPARK-32077:
Summary: Support host-local shuffle data reading with external
shuffle service disabled
Key: SPARK-32077
URL: https://issues.apache.org/jira/browse/SPARK-32077
Project: Spark
wuyi created SPARK-32055:
Summary: Unify getReader and getReaderForRange in ShuffleManager
Key: SPARK-32055
URL: https://issues.apache.org/jira/browse/SPARK-32055
Project: Spark
Issue Type: Improveme
[
https://issues.apache.org/jira/browse/SPARK-32037?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17141742#comment-17141742
]
wuyi commented on SPARK-32037:
--
+1 for healthy/unhealthy
> Rename blacklisting feature to
wuyi created SPARK-32031:
Summary: Fix the wrong references of PartialMerge/Final
AggregateExpression
Key: SPARK-32031
URL: https://issues.apache.org/jira/browse/SPARK-32031
Project: Spark
Issue Typ
[
https://issues.apache.org/jira/browse/SPARK-31970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31970:
-
Summary: Make MDC configuration step be consistent between setLocalProperty
and log4j.properties (was: Make MDC
wuyi created SPARK-31970:
Summary: Make MDC configuration step be consistent between
setLocalProperty and log4j
Key: SPARK-31970
URL: https://issues.apache.org/jira/browse/SPARK-31970
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-31946?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17130015#comment-17130015
]
wuyi commented on SPARK-31946:
--
oh, I mean, for the **decommission** feature, we should reg
[
https://issues.apache.org/jira/browse/SPARK-31946?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17130015#comment-17130015
]
wuyi edited comment on SPARK-31946 at 6/10/20, 3:27 AM:
oh, I me
[
https://issues.apache.org/jira/browse/SPARK-31946?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17129982#comment-17129982
]
wuyi commented on SPARK-31946:
--
The message is good. But the failure should not be expected
[
https://issues.apache.org/jira/browse/SPARK-31946?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31946:
-
Description:
{code:java}
20/06/09 22:54:54 WARN SignalUtils: Failed to register SIGPWR handler -
disabling de
[
https://issues.apache.org/jira/browse/SPARK-31946?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17129947#comment-17129947
]
wuyi commented on SPARK-31946:
--
cc [~holden]
> Failed to register SIGPWR handler on MacOS
wuyi created SPARK-31946:
Summary: Failed to register SIGPWR handler on MacOS
Key: SPARK-31946
URL: https://issues.apache.org/jira/browse/SPARK-31946
Project: Spark
Issue Type: Sub-task
Com
[
https://issues.apache.org/jira/browse/SPARK-31922?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31922:
-
Summary: "RpcEnv already stopped" error when exit spark-shell with
local-cluster mode (was: "RpcEnv already sto
[
https://issues.apache.org/jira/browse/SPARK-31922?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31922:
-
Summary: "RpcEnv already stopped" Error when exit spark-shell with
local-cluster mode (was: TransportRequestHan
[
https://issues.apache.org/jira/browse/SPARK-31922?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31922:
-
Affects Version/s: 2.4.6
> TransportRequestHandler Error when exit spark-shell with local-cluster mode
> ---
wuyi created SPARK-31922:
Summary: TransportRequestHandler Error when exit spark-shell with
local-cluster mode
Key: SPARK-31922
URL: https://issues.apache.org/jira/browse/SPARK-31922
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-31922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17127370#comment-17127370
]
wuyi commented on SPARK-31922:
--
I am working on this.
> TransportRequestHandler Error when
[
https://issues.apache.org/jira/browse/SPARK-31921?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31921:
-
Description:
When starting spark-shell using local cluster mode, e.g. ./bin/spark-shell
--master "local-cluster
wuyi created SPARK-31921:
Summary: Wrong warning of "WARN Master: App app-xxx requires more
resource than any of Workers could have."
Key: SPARK-31921
URL: https://issues.apache.org/jira/browse/SPARK-31921
Pr
wuyi created SPARK-31837:
Summary: Shift to the new highest locality level, if one exists, during
recomputeLocality
Key: SPARK-31837
URL: https://issues.apache.org/jira/browse/SPARK-31837
Project: Spark
I
wuyi created SPARK-31826:
Summary: Support composed type of case class for typed Scala UDF
Key: SPARK-31826
URL: https://issues.apache.org/jira/browse/SPARK-31826
Project: Spark
Issue Type: Improveme
[
https://issues.apache.org/jira/browse/SPARK-31784?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31784:
-
Description:
{code:java}
test("share messages with allGather() call") {
val conf = new SparkConf()
.se
[
https://issues.apache.org/jira/browse/SPARK-31784?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31784:
-
Environment:
{code:java}
{code}
was:
{code:java}
test("share messages with allGather() call") {
va
wuyi created SPARK-31784:
Summary: Fix test BarrierTaskContextSuite."share messages with
allGather() call"
Key: SPARK-31784
URL: https://issues.apache.org/jira/browse/SPARK-31784
Project: Spark
Issu
[
https://issues.apache.org/jira/browse/SPARK-31750?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31750:
-
Summary: Eliminate UpCast if child's dataType is DecimalType (was:
Eliminate UpCast if chid's dataType is Decim
wuyi created SPARK-31750:
Summary: Eliminate UpCast if chid's dataType is DecimalType
Key: SPARK-31750
URL: https://issues.apache.org/jira/browse/SPARK-31750
Project: Spark
Issue Type: Improvement
[
https://issues.apache.org/jira/browse/SPARK-31682?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31682:
-
Parent: (was: SPARK-30098)
Issue Type: Improvement (was: Sub-task)
> Turn on spark.sql.legacy.crea
[
https://issues.apache.org/jira/browse/SPARK-31682?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31682:
-
Parent: SPARK-31085
Issue Type: Sub-task (was: Improvement)
> Turn on spark.sql.legacy.createHiveTable
wuyi created SPARK-31682:
Summary: Turn on
spark.sql.legacy.createHiveTableByDefault.enabled by default
Key: SPARK-31682
URL: https://issues.apache.org/jira/browse/SPARK-31682
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-31620?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17104008#comment-17104008
]
wuyi commented on SPARK-31620:
--
I'm working on this now.
> TreeNodeException: Binding attr
[
https://issues.apache.org/jira/browse/SPARK-20628?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17100793#comment-17100793
]
wuyi commented on SPARK-20628:
--
Hi [~holden] is this ticket resolved by
[https://github.co
[
https://issues.apache.org/jira/browse/SPARK-31651?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31651:
-
Summary: Improve handling of the case where different barrier sync types appear
in a single sync (was: Improve handling f
[
https://issues.apache.org/jira/browse/SPARK-31651?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31651:
-
Description:
Currently, we use cleanupBarrierStage when detecting different barrier sync
types in a single sync
wuyi created SPARK-31651:
Summary: Improve handling for the case of different barrier sync
types in a single sync
Key: SPARK-31651
URL: https://issues.apache.org/jira/browse/SPARK-31651
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-31650?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31650:
-
Issue Type: Bug (was: Test)
> SQL UI doesn't show metrics and whole stage codegen in AQE
>
wuyi created SPARK-31650:
Summary: SQL UI doesn't show metrics and whole stage codegen in AQE
Key: SPARK-31650
URL: https://issues.apache.org/jira/browse/SPARK-31650
Project: Spark
Issue Type: Test
[
https://issues.apache.org/jira/browse/SPARK-31650?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31650:
-
Attachment: before_aqe_ui.png
> SQL UI doesn't show metrics and whole stage codegen in AQE
> ---
[
https://issues.apache.org/jira/browse/SPARK-31643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31643:
-
Description:
[https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/122273/testReport/org.apache.sp
wuyi created SPARK-31643:
Summary: Flaky o.a.s.scheduler.BarrierTaskContextSuite.barrier
task killed, interrupt
Key: SPARK-31643
URL: https://issues.apache.org/jira/browse/SPARK-31643
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-31572?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi resolved SPARK-31572.
--
Resolution: Won't Fix
We shall use the debug level to improve the logs.
> Improve task logs at executor side
> --
wuyi created SPARK-31572:
Summary: Improve task logs at executor side
Key: SPARK-31572
URL: https://issues.apache.org/jira/browse/SPARK-31572
Project: Spark
Issue Type: Improvement
Componen
[
https://issues.apache.org/jira/browse/SPARK-31529?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31529:
-
Description: The formatted explain included extra whitespaces, and even the
number of spaces differs betwe
[
https://issues.apache.org/jira/browse/SPARK-31529?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31529:
-
Summary: Remove extra whitespaces in the formatted explain (was: Remove
redundant whitespaces in the formatted
wuyi created SPARK-31529:
Summary: Remove redundant whitespaces in the formatted explain
Key: SPARK-31529
URL: https://issues.apache.org/jira/browse/SPARK-31529
Project: Spark
Issue Type: Improvement
wuyi created SPARK-31521:
Summary: The fetch size is not correct when merging blocks into a
merged block
Key: SPARK-31521
URL: https://issues.apache.org/jira/browse/SPARK-31521
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-31509?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31509:
-
Affects Version/s: (was: 3.1.0)
2.4.0
> Recommend user to disable delay scheduling fo
[
https://issues.apache.org/jira/browse/SPARK-31509?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31509:
-
Description:
Currently, barrier taskset can only be scheduled when all tasks are launched at
the same time. As
[
https://issues.apache.org/jira/browse/SPARK-31509?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wuyi updated SPARK-31509:
-
Description:
Currently, barrier taskset can only be scheduled when all tasks are launched at
same time. As a re
wuyi created SPARK-31509:
Summary: Recommend user to disable delay scheduling for barrier
taskset
Key: SPARK-31509
URL: https://issues.apache.org/jira/browse/SPARK-31509
Project: Spark
Issue Type: I
wuyi created SPARK-31504:
Summary: Output fields in formatted Explain should have a deterministic
order.
Key: SPARK-31504
URL: https://issues.apache.org/jira/browse/SPARK-31504
Project: Spark
Issue Type:
wuyi created SPARK-31495:
Summary: Support formatted explain for Adaptive Query Execution
Key: SPARK-31495
URL: https://issues.apache.org/jira/browse/SPARK-31495
Project: Spark
Issue Type: Improvemen
wuyi created SPARK-31487:
Summary: Move slots check of barrier job from DAGScheduler to
TaskSchedulerImpl
Key: SPARK-31487
URL: https://issues.apache.org/jira/browse/SPARK-31487
Project: Spark
Issue
wuyi created SPARK-31485:
Summary: Barrier stage can hang if only partial tasks launched
Key: SPARK-31485
URL: https://issues.apache.org/jira/browse/SPARK-31485
Project: Spark
Issue Type: Bug