reviews
Messages by Thread
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
[PR] [SPARK-56409] Consolidate `Netty` exclusions into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56409] Consolidate `Netty` exclusions into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56409] Consolidate `Netty` exclusions into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56409] Consolidate `Netty` exclusions into root `build.gradle` [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-55276][DOCS] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS][SDP] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS][SDP] Document how SDP datasets are stored and refreshed [spark]
via GitHub
[I] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [I] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [I] Document how SDP datasets are stored and refreshed [spark]
via GitHub
[PR] [WIP][SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
[PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
[PR] [SQL] Add ParquetFormatVersion enum and writer version option [spark]
via GitHub
Re: [PR] [SQL] Add ParquetFormatVersion enum and writer version option [spark]
via GitHub
[PR] [SPARK-56408] Introduce `OwnerResourceDecorator` and use it instead `(Driver|Cluster)Decorator` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56408] Introduce `OwnerResourceDecorator` and use it instead `(Driver|Cluster)Decorator` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56408] Introduce `OwnerResourceDecorator` and use it instead `(Driver|Cluster)Decorator` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56408] Introduce `OwnerResourceDecorator` and use it instead `(Driver|Cluster)Decorator` [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
[PR] [SPARK-56406][SS] Stream-stream join v4: skip writing secondary index if the operator will not evict from that side [spark]
via GitHub
Re: [PR] [SPARK-56406][SS] Stream-stream join v4: skip writing secondary index if the operator will not evict from that side [spark]
via GitHub
Re: [PR] [SPARK-56406][SS] Stream-stream join v4: skip writing secondary index if the operator will not evict from that side [spark]
via GitHub
Re: [PR] [SPARK-56406][SS] Stream-stream join v4: skip writing secondary index if the operator will not evict from that side [spark]
via GitHub
Re: [PR] [SPARK-56406][SS] Stream-stream join v4: skip writing secondary index if the operator will not evict from that side [spark]
via GitHub
[PR] [SPARK-XXXXX][PYTHON][TESTS] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
[PR] [SPARK-55910][SQL][TESTS] Merge `SQLTestUtils` into `QueryTest` [spark]
via GitHub
Re: [PR] [SPARK-55748][SQL] Use `DSv2` for `avro|csv|json|kafka|orc|parquet|text` by default [spark]
via GitHub
Re: [PR] [SPARK-55324][PYTHON] Make convert_numpy support ArrayType [spark]
via GitHub
Re: [PR] [SPARK-55324][PYTHON] Make convert_numpy support ArrayType [spark]
via GitHub
[PR] update [spark]
via GitHub
Re: [PR] [SPARK-56019][SQL] Close JDBC connection on task kill to unblock native socket reads [spark]
via GitHub
Re: [PR] [SPARK-56019][SQL] Close JDBC connection on task kill to unblock native socket reads [spark]
via GitHub
Re: [PR] [SPARK-56019][SQL] Close JDBC connection on task kill to unblock native socket reads [spark]
via GitHub
Re: [PR] [SPARK-56019][SQL] Close JDBC connection on task kill to unblock native socket reads [spark]
via GitHub
[PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
[PR] [SPARK-56396] Support `dependencyUpdates` Gradle task [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56396] Support `dependencyUpdates` Gradle task [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56396] Support `dependencyUpdates` Gradle task [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56396] Support `dependencyUpdates` Gradle task [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56396] Support `dependencyUpdates` Gradle task [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-56405] Enable `Gradle` configuration cache [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56405] Enable `Gradle` configuration cache [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56405] Enable `Gradle` configuration cache [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56405] Enable `Gradle` configuration cache [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-33737][K8S] Support getting pod state using Informers + Listers [spark]
via GitHub
Re: [PR] [SPARK-33737][K8S] Support getting pod state using Informers + Listers [spark]
via GitHub
[PR] [SPARK-56401] Use `setup-gradle` GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56401] Use `setup-gradle` GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56401] Use `setup-gradle` GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56401] Use `setup-gradle` GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56401] Use `setup-gradle` v6.1.0 GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56401] Use `setup-gradle` v6.1.0 GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56401] Use `setup-gradle` v6.1.0 GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56401] Use `setup-gradle` v6.1.0 GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56401] Use `setup-gradle` v6.1.0 GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56401] Use `setup-gradle` v6.1.0 GitHub Actions in CIs [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
Re: [PR] [SPARK-56403] Refactor kafka test so it's skipped when dependency is not available [spark]
via GitHub
[PR] [SPARK-56400][SS] Apply rangeScan API in transformWithState Timer/TTL [spark]
via GitHub
Re: [PR] [SPARK-56400][SS] Apply rangeScan API in transformWithState Timer/TTL [spark]
via GitHub
Re: [PR] [SPARK-56400][SS] Apply rangeScan API in transformWithState Timer/TTL [spark]
via GitHub
Re: [PR] [SPARK-56400][SS] Apply rangeScan API in transformWithState Timer/TTL [spark]
via GitHub
Re: [PR] [SPARK-56400][SS] Apply rangeScan API in transformWithState Timer/TTL [spark]
via GitHub
[PR] [SPARK-56394] Upgrade the minimum K8s version to v1.34 [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56394] Upgrade the minimum K8s version to v1.34 [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56394] Upgrade the minimum K8s version to v1.34 [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56394] Upgrade the minimum K8s version to v1.34 [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56394] Upgrade the minimum K8s version to v1.34 [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-56397] Upgrade `ICU4J` to 78.3 [spark]
via GitHub
Re: [PR] [SPARK-56397][BUILD] Upgrade `ICU4J` to 78.3 [spark]
via GitHub
Re: [PR] [SPARK-56397][BUILD] Upgrade `ICU4J` to 78.3 [spark]
via GitHub
[PR] [SPARK-56393] Drop K8s v1.33 Support [spark]
via GitHub
Re: [PR] [SPARK-56393][K8S][DOCS] Drop K8s v1.33 Support [spark]
via GitHub
Re: [PR] [SPARK-56393][K8S][DOCS] Drop K8s v1.33 Support [spark]
via GitHub
Re: [PR] [SPARK-56393][K8S][DOCS] Drop K8s v1.33 Support [spark]
via GitHub
Re: [PR] [SPARK-56393][K8S][DOCS] Drop K8s v1.33 Support [spark]
via GitHub
[PR] [SPARK-56388][CONNECT] Add XML support to Spark Connect Parse protocol [spark]
via GitHub
[PR] [SPARK-56392][SQL] Make Sample.seed Optional to distinguish user-specified vs random seeds [spark]
via GitHub
Re: [PR] [SPARK-56392][SQL] Make Sample.seed Optional to distinguish user-specified vs random seeds [spark]
via GitHub
Re: [PR] [SPARK-56392][SQL] Make Sample.seed Optional to distinguish user-specified vs random seeds [spark]
via GitHub
[PR] Implement `SET` command in single-pass analyzer [spark]
via GitHub
Re: [PR] [SPARK-56147][SQL] `spark-sql` cli correctly handles SQL Scripting compound blocks [spark]
via GitHub
Re: [PR] [SPARK-56147][SQL] `spark-sql` cli correctly handles SQL Scripting compound blocks [spark]
via GitHub
Re: [PR] [SPARK-56147][SQL] `spark-sql` cli correctly handles SQL Scripting compound blocks [spark]
via GitHub
Re: [PR] [SPARK-56147][SQL] `spark-sql` cli correctly handles SQL Scripting compound blocks [spark]
via GitHub
Re: [PR] [SPARK-56147][SQL] `spark-sql` cli correctly handles SQL Scripting compound blocks [spark]
via GitHub
[PR] [SPARK-56391] [SQL] Support unwrap string type to date type in UnwrapCastInBinaryComparison [spark]
via GitHub
[PR] Bump addressable from 2.8.7 to 2.9.0 in /docs [spark]
via GitHub
Re: [PR] Bump addressable from 2.8.7 to 2.9.0 in /docs [spark]
via GitHub
Re: [PR] Bump addressable from 2.8.7 to 2.9.0 in /docs [spark]
via GitHub
[I] Spark 4.0.0 Hadoop 3.4.1 client JARs incompatible with Hive Metastore 3.1.x [spark]
via GitHub
Re: [I] Spark 4.0.0 Hadoop 3.4.1 client JARs incompatible with Hive Metastore 3.1.x [spark]
via GitHub
Re: [I] Spark 4.0.0 Hadoop 3.4.1 client JARs incompatible with Hive Metastore 3.1.x [spark]
via GitHub
Re: [PR] [SPARK-54651][CORE] Driver should delete temporary files when exiting during commit process [spark]
via GitHub
Re: [PR] [SPARK-54774][CORE] Submit failed should keep same exit code with app exit code in K8s mode [spark]
via GitHub
Re: [PR] [SPARK-54774][CORE] Submit failed should keep same exit code with app exit code in K8s mode [spark]
via GitHub
Re: [PR] [SPARK-54774][CORE] Submit failed should keep same exit code with app exit code in K8s mode [spark]
via GitHub
Re: [PR] [SPARK-54774][CORE] Submit failed should keep same exit code with app exit code in K8s mode [spark]
via GitHub
[PR] [SPARK-56385][SQL] Track pushed filter expressions on DataSourceV2ScanRelation [spark]
via GitHub
Re: [PR] [SPARK-56385][SQL] Track pushed filter expressions on DataSourceV2ScanRelation [spark]
via GitHub
Re: [PR] [SPARK-56385][SQL] Track pushed filter expressions on DataSourceV2ScanRelation [spark]
via GitHub
Re: [PR] [SPARK-56385][SQL] Track pushed filter expressions on DataSourceV2ScanRelation [spark]
via GitHub
Re: [PR] [SPARK-56385][SQL] Track pushed filter expressions on DataSourceV2ScanRelation [spark]
via GitHub
Re: [PR] [SPARK-56385][SQL] Track pushed filter expressions on DataSourceV2ScanRelation [spark]
via GitHub