[
https://issues.apache.org/jira/browse/SPARK-42529?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-42529:
-
Summary: Support Cube and Rollup (was: Support Cube, Rollup, Pivot)
> Support Cube and Rollup
>
Rui Wang created SPARK-42529:
Summary: Support Cube, Rollup, Pivot
Key: SPARK-42529
URL: https://issues.apache.org/jira/browse/SPARK-42529
Project: Spark
Issue Type: Sub-task
Components:
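For context on this sub-task: CUBE and ROLLUP expand a single GROUP BY into multiple grouping sets. A minimal pure-Python sketch of which sets each produces (illustrative only, not Spark client code; function names are mine):

```python
from itertools import combinations

def rollup_sets(cols):
    """ROLLUP(a, b) groups by each prefix: (a, b), (a,), ()."""
    return [tuple(cols[:i]) for i in range(len(cols), -1, -1)]

def cube_sets(cols):
    """CUBE(a, b) groups by every subset of the columns."""
    sets = []
    for r in range(len(cols), -1, -1):
        sets.extend(combinations(cols, r))
    return sets

print(rollup_sets(["a", "b"]))   # [('a', 'b'), ('a',), ()]
print(len(cube_sets(["a", "b"])))  # 4
```

So ROLLUP over n columns yields n+1 grouping sets, while CUBE yields 2^n.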
Rui Wang created SPARK-42520:
Summary: Spark Connect Scala Client: Window
Key: SPARK-42520
URL: https://issues.apache.org/jira/browse/SPARK-42520
Project: Spark
Issue Type: Task
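As background for the Window task: a window function computes a value per row over an ordered partition of the input. A rough pure-Python sketch of row_number() semantics (illustrative only; not the Scala client implementation):

```python
from collections import defaultdict

def row_number_over(rows, partition_key, order_key):
    """Assign a 1-based row_number within each partition, ordered by order_key."""
    parts = defaultdict(list)
    for row in rows:
        parts[row[partition_key]].append(row)
    out = []
    for part in parts.values():
        for i, row in enumerate(sorted(part, key=lambda r: r[order_key]), start=1):
            out.append({**row, "row_number": i})
    return out

rows = [{"dept": "a", "pay": 2}, {"dept": "a", "pay": 1}, {"dept": "b", "pay": 3}]
for row in row_number_over(rows, "dept", "pay"):
    print(row)
```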
[
https://issues.apache.org/jira/browse/SPARK-42467?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17691264#comment-17691264
]
Rui Wang commented on SPARK-42467:
--
Yes, we are going to need to support cube/rollup/grouping sets along with
Rui Wang created SPARK-42481:
Summary: Implement agg.{max,min,mean,count,avg,sum}
Key: SPARK-42481
URL: https://issues.apache.org/jira/browse/SPARK-42481
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-42468:
Summary: Implement agg by (String, String)*
Key: SPARK-42468
URL: https://issues.apache.org/jira/browse/SPARK-42468
Project: Spark
Issue Type: Sub-task
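The `(String, String)*` form in this sub-task (and the named aggregates in SPARK-42481) pairs a column name with an aggregate-function name; the analogous PySpark form is a dict, e.g. `df.groupBy("dept").agg({"pay": "max"})`. A pure-Python sketch of that string-based dispatch (illustrative, not the client implementation):

```python
from statistics import mean

# Map aggregate-function names to plain Python reducers.
AGGS = {"max": max, "min": min, "avg": mean, "mean": mean, "sum": sum, "count": len}

def agg_by_name(rows, pairs):
    """Apply (column, function-name) pairs over a list of row dicts."""
    out = {}
    for col, fn_name in pairs:
        out[f"{fn_name}({col})"] = AGGS[fn_name]([row[col] for row in rows])
    return out

rows = [{"pay": 1}, {"pay": 3}]
print(agg_by_name(rows, [("pay", "max"), ("pay", "avg")]))
```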
Rui Wang created SPARK-42467:
Summary: Spark Connect Scala Client: GroupBy and Aggregation
Key: SPARK-42467
URL: https://issues.apache.org/jira/browse/SPARK-42467
Project: Spark
Issue Type: Task
Rui Wang created SPARK-42453:
Summary: Implement function max in Scala client
Key: SPARK-42453
URL: https://issues.apache.org/jira/browse/SPARK-42453
Project: Spark
Issue Type: Task
Rui Wang created SPARK-42443:
Summary: Remove unused object in DataFrameAggregateSuite
Key: SPARK-42443
URL: https://issues.apache.org/jira/browse/SPARK-42443
Project: Spark
Issue Type: Task
[
https://issues.apache.org/jira/browse/SPARK-41918?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=1764#comment-1764
]
Rui Wang commented on SPARK-41918:
--
I did some tests locally and found the following:
If I
[
https://issues.apache.org/jira/browse/SPARK-41918?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17655545#comment-17655545
]
Rui Wang edited comment on SPARK-41918 at 1/6/23 6:35 PM:
--
[~grundprinzip-db]
[
https://issues.apache.org/jira/browse/SPARK-41918?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17655545#comment-17655545
]
Rui Wang commented on SPARK-41918:
--
[~grundprinzip-db]
I am a bit confused about the renaming and what
[
https://issues.apache.org/jira/browse/SPARK-41911?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17655537#comment-17655537
]
Rui Wang commented on SPARK-41911:
--
We will know better where we need the versions during the process
Rui Wang created SPARK-41912:
Summary: Subquery should not validate CTE
Key: SPARK-41912
URL: https://issues.apache.org/jira/browse/SPARK-41912
Project: Spark
Issue Type: Bug
[
https://issues.apache.org/jira/browse/SPARK-41755?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41755:
-
Summary: Reorder fields to use consecutive field numbers (was: Reorder the
relation IDs)
> Reorder
[
https://issues.apache.org/jira/browse/SPARK-41909?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang resolved SPARK-41909.
--
Resolution: Duplicate
https://issues.apache.org/jira/browse/SPARK-41755
> Update proto fields to use
[
https://issues.apache.org/jira/browse/SPARK-41908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41908:
-
Description: We may revisit Catalog proto design and refactor it such that
it becomes a breaking
[
https://issues.apache.org/jira/browse/SPARK-41910?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41910:
-
Description: Every field in proto3 has a default value. We should revisit
existing proto field to
[
https://issues.apache.org/jira/browse/SPARK-41911?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41911:
-
Description: We may need this to help maintain compatibility. Depending on
the concrete protocol
Rui Wang created SPARK-41911:
Summary: Add version fields to Connect proto
Key: SPARK-41911
URL: https://issues.apache.org/jira/browse/SPARK-41911
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41910:
Summary: Remove `optional` notation in proto
Key: SPARK-41910
URL: https://issues.apache.org/jira/browse/SPARK-41910
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41909:
Summary: Update proto fields to use increasing field numbers and
avoid holes
Key: SPARK-41909
URL: https://issues.apache.org/jira/browse/SPARK-41909
Project: Spark
Rui Wang created SPARK-41908:
Summary: Catalog API refactoring
Key: SPARK-41908
URL: https://issues.apache.org/jira/browse/SPARK-41908
Project: Spark
Issue Type: Sub-task
Components:
Rui Wang created SPARK-41691:
Summary: Implement row count based shuffle read for global limit
Key: SPARK-41691
URL: https://issues.apache.org/jira/browse/SPARK-41691
Project: Spark
Issue Type:
[
https://issues.apache.org/jira/browse/SPARK-41537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-41537:
Assignee: Rui Wang
> Protobuf backwards compatibility testing
>
[
https://issues.apache.org/jira/browse/SPARK-41512?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41512:
-
Description:
h3. Problem Statement
In the current Spark optimizer, a single-partition shuffle might be
Rui Wang created SPARK-41513:
Summary: Implement an Accumulator to collect per-mapper row count
metrics
Key: SPARK-41513
URL: https://issues.apache.org/jira/browse/SPARK-41513
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-41512?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17646898#comment-17646898
]
Rui Wang commented on SPARK-41512:
--
cc [~cloud_fan]
> Row count based shuffle read to optimize global
Rui Wang created SPARK-41512:
Summary: Row count based shuffle read to optimize global limit
after a single partition shuffle (optionally with input partition sorted)
Key: SPARK-41512
URL:
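The core idea, as I read the summary: once each map task's output row count is known, a global LIMIT after a single-partition shuffle can stop reading map outputs as soon as the cumulative count covers the limit. A hedged pure-Python sketch of that selection logic (names are mine, not Spark's):

```python
def map_outputs_to_read(row_counts, limit):
    """Return how many map outputs must be read so the cumulative
    row count covers a global LIMIT."""
    total = 0
    for i, count in enumerate(row_counts, start=1):
        total += count
        if total >= limit:
            return i
    return len(row_counts)  # limit exceeds total rows: read everything

print(map_outputs_to_read([100, 50, 200], 120))  # 2: 100 + 50 >= 120
```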
Rui Wang created SPARK-41444:
Summary: Support read.json
Key: SPARK-41444
URL: https://issues.apache.org/jira/browse/SPARK-41444
Project: Spark
Issue Type: Sub-task
Components: Connect
[
https://issues.apache.org/jira/browse/SPARK-41349?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17644463#comment-17644463
]
Rui Wang commented on SPARK-41349:
--
Keeping this issue open given that there is Python-side work
[
https://issues.apache.org/jira/browse/SPARK-41349?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reopened SPARK-41349:
--
> Implement `DataFrame.hint`
> --
>
> Key: SPARK-41349
>
Rui Wang created SPARK-41412:
Summary: Implement `Cast`
Key: SPARK-41412
URL: https://issues.apache.org/jira/browse/SPARK-41412
Project: Spark
Issue Type: Sub-task
Components: Connect
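For context: Spark's non-ANSI CAST returns null when a value cannot be converted rather than raising an error. A tiny pure-Python sketch of that null-on-failure behavior (illustrative only; the actual sub-task is wiring `Cast` through the Connect proto):

```python
def cast_to_int(value):
    """Mimic non-ANSI CAST(value AS INT): None on failure instead of an error."""
    if value is None:
        return None
    try:
        return int(value)
    except (TypeError, ValueError):
        return None

print([cast_to_int(v) for v in ["3", "x", None]])  # [3, None, None]
```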
Rui Wang created SPARK-41399:
Summary: Refactor column related tests to test_connect_column
Key: SPARK-41399
URL: https://issues.apache.org/jira/browse/SPARK-41399
Project: Spark
Issue Type:
Rui Wang created SPARK-41384:
Summary: Should use SQLExpression for str arguments in Projection
Key: SPARK-41384
URL: https://issues.apache.org/jira/browse/SPARK-41384
Project: Spark
Issue Type:
[
https://issues.apache.org/jira/browse/SPARK-41305?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reopened SPARK-41305:
--
> Connect Proto Completeness
> --
>
> Key: SPARK-41305
>
[
https://issues.apache.org/jira/browse/SPARK-41371?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17643098#comment-17643098
]
Rui Wang commented on SPARK-41371:
--
https://github.com/apache/spark/pull/38890
> Improve Documentation
[
https://issues.apache.org/jira/browse/SPARK-41371?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang resolved SPARK-41371.
--
Fix Version/s: 3.4.0
Resolution: Fixed
> Improve Documentation for Command proto
>
Rui Wang created SPARK-41373:
Summary: Rename CAST_WITH_FUN_SUGGESTION to
CAST_WITH_FUNC_SUGGESTION
Key: SPARK-41373
URL: https://issues.apache.org/jira/browse/SPARK-41373
Project: Spark
Issue
Rui Wang created SPARK-41372:
Summary: Support DataFrame TempView
Key: SPARK-41372
URL: https://issues.apache.org/jira/browse/SPARK-41372
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41371:
Summary: Improve Documentation for Command proto
Key: SPARK-41371
URL: https://issues.apache.org/jira/browse/SPARK-41371
Project: Spark
Issue Type: Sub-task
[
https://issues.apache.org/jira/browse/SPARK-41349?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17642185#comment-17642185
]
Rui Wang commented on SPARK-41349:
--
cc [~dengziming] if you are interested in
> Implement
[
https://issues.apache.org/jira/browse/SPARK-40970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-40970:
-
Summary: Support List[Column] for Join's on argument. (was: Support
List[ColumnRef] for Join's on
[
https://issues.apache.org/jira/browse/SPARK-40586?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17642120#comment-17642120
]
Rui Wang commented on SPARK-40586:
--
Contributions are welcome!
> Decouple plan transformation and
[
https://issues.apache.org/jira/browse/SPARK-40586?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-40586:
Assignee: (was: Rui Wang)
> Decouple plan transformation and validation on server side
>
[
https://issues.apache.org/jira/browse/SPARK-41352?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang deleted SPARK-41352:
-
> Support DataFrame.hint
> --
>
> Key: SPARK-41352
>
[
https://issues.apache.org/jira/browse/SPARK-41352?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41352:
-
Description:
We have hint in proto now:
[
https://issues.apache.org/jira/browse/SPARK-41352?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17642118#comment-17642118
]
Rui Wang commented on SPARK-41352:
--
This JIRA is open to pick up.
> Support DataFrame.hint
>
Rui Wang created SPARK-41352:
Summary: Support DataFrame.hint
Key: SPARK-41352
URL: https://issues.apache.org/jira/browse/SPARK-41352
Project: Spark
Issue Type: Sub-task
Components:
[
https://issues.apache.org/jira/browse/SPARK-41347?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41347:
-
Priority: Critical (was: Major)
> Add Cast to Expression proto
>
>
>
Rui Wang created SPARK-41347:
Summary: Add Cast to Expression proto
Key: SPARK-41347
URL: https://issues.apache.org/jira/browse/SPARK-41347
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41345:
Summary: Add Hint to Connect Proto
Key: SPARK-41345
URL: https://issues.apache.org/jira/browse/SPARK-41345
Project: Spark
Issue Type: Sub-task
Components:
Rui Wang created SPARK-41343:
Summary: Move FunctionName parsing to server side
Key: SPARK-41343
URL: https://issues.apache.org/jira/browse/SPARK-41343
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41335:
Summary: Support IsNull and IsNotNull in Column
Key: SPARK-41335
URL: https://issues.apache.org/jira/browse/SPARK-41335
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41330:
Summary: Improve Documentation for Take, Tail, Limit and Offset
Key: SPARK-41330
URL: https://issues.apache.org/jira/browse/SPARK-41330
Project: Spark
Issue Type:
Rui Wang created SPARK-41329:
Summary: Solve circular import between Column and _typing/functions
Key: SPARK-41329
URL: https://issues.apache.org/jira/browse/SPARK-41329
Project: Spark
Issue
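One standard Python fix for such import cycles is to import the partner module only under `typing.TYPE_CHECKING`, so type checkers still see the annotation target while the runtime never executes the cyclic import. A minimal sketch (module and class names are placeholders, not the actual pyspark layout):

```python
from __future__ import annotations  # annotations stay lazy strings
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Seen by mypy/pyright only; never imported at runtime, so no cycle.
    from sql_functions import lit  # placeholder for the cyclic partner module

class Column:
    """Minimal stand-in for a Column class that only needs the other
    module for type annotations, not at runtime."""
    def __init__(self, name: str) -> None:
        self.name = name

    def alias(self, name: str) -> Column:
        return Column(name)

print(Column("id").alias("user_id").name)  # user_id
```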
Rui Wang created SPARK-41328:
Summary: Add logical and string API to Column
Key: SPARK-41328
URL: https://issues.apache.org/jira/browse/SPARK-41328
Project: Spark
Issue Type: Sub-task
[
https://issues.apache.org/jira/browse/SPARK-41312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41312:
-
Priority: Critical (was: Major)
> Implement DataFrame.withColumnRenamed
>
Rui Wang created SPARK-41312:
Summary: Implement DataFrame.withColumnRenamed
Key: SPARK-41312
URL: https://issues.apache.org/jira/browse/SPARK-41312
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41310:
Summary: Implement DataFrame.toDF
Key: SPARK-41310
URL: https://issues.apache.org/jira/browse/SPARK-41310
Project: Spark
Issue Type: Sub-task
Components:
Rui Wang created SPARK-41308:
Summary: Improve `DataFrame.count()`
Key: SPARK-41308
URL: https://issues.apache.org/jira/browse/SPARK-41308
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41307:
Summary: Improve Connect Expression proto documentation
Key: SPARK-41307
URL: https://issues.apache.org/jira/browse/SPARK-41307
Project: Spark
Issue Type: Sub-task
[
https://issues.apache.org/jira/browse/SPARK-41307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang deleted SPARK-41307:
-
> Improve Connect Expression proto documentation
> --
>
>
Rui Wang created SPARK-41306:
Summary: Improve Connect Expression proto documentation
Key: SPARK-41306
URL: https://issues.apache.org/jira/browse/SPARK-41306
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41305:
Summary: Connect Proto Completeness
Key: SPARK-41305
URL: https://issues.apache.org/jira/browse/SPARK-41305
Project: Spark
Issue Type: Umbrella
Rui Wang created SPARK-41304:
Summary: Add missing docs for DataFrame API
Key: SPARK-41304
URL: https://issues.apache.org/jira/browse/SPARK-41304
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41280:
Summary: Implement DataFrame.tail
Key: SPARK-41280
URL: https://issues.apache.org/jira/browse/SPARK-41280
Project: Spark
Issue Type: Sub-task
Components:
[
https://issues.apache.org/jira/browse/SPARK-41265?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang resolved SPARK-41265.
--
Fix Version/s: 3.4.0
Resolution: Fixed
> Check and upgrade
[
https://issues.apache.org/jira/browse/SPARK-41265?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-41265:
Assignee: Bjørn Jørgensen
> Check and upgrade buf.build/protocolbuffers/plugins/python to 3.19.5
[
https://issues.apache.org/jira/browse/SPARK-41264?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-41264:
Assignee: Ruifeng Zheng
> Make Literal support more datatypes
>
[
https://issues.apache.org/jira/browse/SPARK-41263?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-41263:
Assignee: Ruifeng Zheng
> Upgrade buf to v1.9.0
> -
>
> Key:
[
https://issues.apache.org/jira/browse/SPARK-41227?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-41227:
Assignee: Xinrong Meng
> Implement `DataFrame.crossJoin`
> ---
>
>
[
https://issues.apache.org/jira/browse/SPARK-40970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-40970:
Assignee: Rui Wang
> Support List[ColumnRef] for Join's on argument.
>
[
https://issues.apache.org/jira/browse/SPARK-40586?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-40586:
Assignee: Rui Wang
> Decouple plan transformation and validation on server side
>
[
https://issues.apache.org/jira/browse/SPARK-41256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-41256:
Assignee: Rui Wang
> Implement DataFrame.withColumn(s)
> -
>
>
[
https://issues.apache.org/jira/browse/SPARK-41268?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-41268:
Assignee: Rui Wang
> Refactor "Column" for API Compatibility
>
[
https://issues.apache.org/jira/browse/SPARK-41270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-41270:
Assignee: Rui Wang
> Add Catalog tableExists and namespaceExists in Connect proto
>
[
https://issues.apache.org/jira/browse/SPARK-41278?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang reassigned SPARK-41278:
Assignee: Rui Wang
> Clean up unused QualifiedAttribute in Expression.proto
>
Rui Wang created SPARK-41278:
Summary: Clean up unused QualifiedAttribute in Expression.proto
Key: SPARK-41278
URL: https://issues.apache.org/jira/browse/SPARK-41278
Project: Spark
Issue Type:
Rui Wang created SPARK-41270:
Summary: Add Catalog tableExists and namespaceExists in Connect
proto
Key: SPARK-41270
URL: https://issues.apache.org/jira/browse/SPARK-41270
Project: Spark
Issue
Rui Wang created SPARK-41268:
Summary: Refactor "Column" for API Compatibility
Key: SPARK-41268
URL: https://issues.apache.org/jira/browse/SPARK-41268
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41256:
Summary: Implement DataFrame.withColumn(s)
Key: SPARK-41256
URL: https://issues.apache.org/jira/browse/SPARK-41256
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41250:
Summary: DataFrame.to_pandas should not return optional pandas
dataframe
Key: SPARK-41250
URL: https://issues.apache.org/jira/browse/SPARK-41250
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-41230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41230:
-
Summary: Remove `str` from Aggregate expression type (was: Remove `str`
from Aggregate)
> Remove
[
https://issues.apache.org/jira/browse/SPARK-41230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41230:
-
Summary: Remove `str` from Aggregate (was: Remove `str` from Class
Aggregate in Plan.py)
> Remove
[
https://issues.apache.org/jira/browse/SPARK-41227?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17637593#comment-17637593
]
Rui Wang commented on SPARK-41227:
--
+1 to having this match the existing PySpark API.
> Implement
Rui Wang created SPARK-41230:
Summary: Remove `str` from Class Aggregate in Plan.py
Key: SPARK-41230
URL: https://issues.apache.org/jira/browse/SPARK-41230
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41201:
Summary: Implement `DataFrame.SelectExpr` in Python client
Key: SPARK-41201
URL: https://issues.apache.org/jira/browse/SPARK-41201
Project: Spark
Issue Type:
[
https://issues.apache.org/jira/browse/SPARK-41115?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rui Wang updated SPARK-41115:
-
Summary: Add ClientType to proto to indicate which client sends a request
(was: Add ClientType Enum to
Rui Wang created SPARK-41164:
Summary: Update relations.proto to follow Connect Proto
development guidance
Key: SPARK-41164
URL: https://issues.apache.org/jira/browse/SPARK-41164
Project: Spark
Rui Wang created SPARK-41127:
Summary: Implement DataFrame.CreateGlobalView in Python client
Key: SPARK-41127
URL: https://issues.apache.org/jira/browse/SPARK-41127
Project: Spark
Issue Type:
Rui Wang created SPARK-41122:
Summary: Explain API can support different modes
Key: SPARK-41122
URL: https://issues.apache.org/jira/browse/SPARK-41122
Project: Spark
Issue Type: Sub-task
Rui Wang created SPARK-41116:
Summary: Input relation can be optional for Project in Connect
proto
Key: SPARK-41116
URL: https://issues.apache.org/jira/browse/SPARK-41116
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-41114?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17632573#comment-17632573
]
Rui Wang commented on SPARK-41114:
--
cc [~ruifengz]
> Support local data for LocalRelation
>
Rui Wang created SPARK-41115:
Summary: Add ClientType Enum to proto to indicate which client
sends a request
Key: SPARK-41115
URL: https://issues.apache.org/jira/browse/SPARK-41115
Project: Spark
Rui Wang created SPARK-41110:
Summary: Implement `DataFrame.sparkSession` in Python client
Key: SPARK-41110
URL: https://issues.apache.org/jira/browse/SPARK-41110
Project: Spark
Issue Type:
Rui Wang created SPARK-41105:
Summary: Adopt the `optional` keyword from proto3, which offers
`hasXXX` to differentiate whether a field is set or unset
Key: SPARK-41105
URL: https://issues.apache.org/jira/browse/SPARK-41105
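Background on the `optional` keyword mentioned here: in proto3, a plain scalar field has no presence tracking, so "unset" is indistinguishable from the type's default value, while `optional` generates explicit presence (a `has_...`/`HasField` accessor in generated code). A hedged fragment, with made-up message and field names:

```proto
syntax = "proto3";

message Example {
  // Plain proto3 scalar: no presence; 0 and "unset" are indistinguishable.
  int32 plain = 1;

  // `optional` adds explicit presence: generated code can check has_maybe().
  optional int32 maybe = 2;
}
```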
[
https://issues.apache.org/jira/browse/SPARK-41104?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17631946#comment-17631946
]
Rui Wang commented on SPARK-41104:
--
Looks like Hive only enforces `NOT NULL` since Hive 3.0.0