[jira] [Resolved] (SPARK-41721) Enable doctests in pyspark.sql.connect.catalog
[ https://issues.apache.org/jira/browse/SPARK-41721?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hyukjin Kwon resolved SPARK-41721. -- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39224 [https://github.com/apache/spark/pull/39224] > Enable doctests in pyspark.sql.connect.catalog > -- > > Key: SPARK-41721 > URL: https://issues.apache.org/jira/browse/SPARK-41721 > Project: Spark > Issue Type: Sub-task > Components: Connect, Tests >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major > Fix For: 3.4.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41721) Enable doctests in pyspark.sql.connect.catalog
[ https://issues.apache.org/jira/browse/SPARK-41721?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hyukjin Kwon reassigned SPARK-41721: Assignee: Hyukjin Kwon > Enable doctests in pyspark.sql.connect.catalog > -- > > Key: SPARK-41721 > URL: https://issues.apache.org/jira/browse/SPARK-41721 > Project: Spark > Issue Type: Sub-task > Components: Connect, Tests >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major
[jira] [Commented] (SPARK-41717) Implement the command logic for print and _repr_html_
[ https://issues.apache.org/jira/browse/SPARK-41717?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652184#comment-17652184 ] Apache Spark commented on SPARK-41717: -- User 'HyukjinKwon' has created a pull request for this issue: https://github.com/apache/spark/pull/39237 > Implement the command logic for print and _repr_html_ > - > > Key: SPARK-41717 > URL: https://issues.apache.org/jira/browse/SPARK-41717 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major > Fix For: 3.4.0 > > > We should deduplicate the implementation of print and _repr_html_ at > pyspark.sql.connect.plan.LogicalPlan
[jira] [Resolved] (SPARK-41529) Implement SparkSession.stop
[ https://issues.apache.org/jira/browse/SPARK-41529?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng resolved SPARK-41529. --- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39231 [https://github.com/apache/spark/pull/39231] > Implement SparkSession.stop > --- > > Key: SPARK-41529 > URL: https://issues.apache.org/jira/browse/SPARK-41529 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major > Fix For: 3.4.0
[jira] [Assigned] (SPARK-41529) Implement SparkSession.stop
[ https://issues.apache.org/jira/browse/SPARK-41529?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng reassigned SPARK-41529: - Assignee: Hyukjin Kwon > Implement SparkSession.stop > --- > > Key: SPARK-41529 > URL: https://issues.apache.org/jira/browse/SPARK-41529 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major
[jira] [Commented] (SPARK-41068) Implement `DataFrame.stat.corr`
[ https://issues.apache.org/jira/browse/SPARK-41068?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652181#comment-17652181 ] Apache Spark commented on SPARK-41068: -- User 'beliefer' has created a pull request for this issue: https://github.com/apache/spark/pull/39236 > Implement `DataFrame.stat.corr` > --- > > Key: SPARK-41068 > URL: https://issues.apache.org/jira/browse/SPARK-41068 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major
[jira] [Assigned] (SPARK-41068) Implement `DataFrame.stat.corr`
[ https://issues.apache.org/jira/browse/SPARK-41068?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41068: Assignee: Apache Spark (was: Ruifeng Zheng) > Implement `DataFrame.stat.corr` > --- > > Key: SPARK-41068 > URL: https://issues.apache.org/jira/browse/SPARK-41068 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Apache Spark >Priority: Major
[jira] [Assigned] (SPARK-41068) Implement `DataFrame.stat.corr`
[ https://issues.apache.org/jira/browse/SPARK-41068?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41068: Assignee: Ruifeng Zheng (was: Apache Spark) > Implement `DataFrame.stat.corr` > --- > > Key: SPARK-41068 > URL: https://issues.apache.org/jira/browse/SPARK-41068 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major
[jira] [Commented] (SPARK-41729) Assign name to _LEGACY_ERROR_TEMP_0011
[ https://issues.apache.org/jira/browse/SPARK-41729?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652177#comment-17652177 ] Apache Spark commented on SPARK-41729: -- User 'LuciferYang' has created a pull request for this issue: https://github.com/apache/spark/pull/39235 > Assign name to _LEGACY_ERROR_TEMP_0011 > --- > > Key: SPARK-41729 > URL: https://issues.apache.org/jira/browse/SPARK-41729 > Project: Spark > Issue Type: Sub-task > Components: Spark Core >Affects Versions: 3.4.0 >Reporter: Yang Jie >Priority: Minor
[jira] [Assigned] (SPARK-41729) Assign name to _LEGACY_ERROR_TEMP_0011
[ https://issues.apache.org/jira/browse/SPARK-41729?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41729: Assignee: Apache Spark > Assign name to _LEGACY_ERROR_TEMP_0011 > --- > > Key: SPARK-41729 > URL: https://issues.apache.org/jira/browse/SPARK-41729 > Project: Spark > Issue Type: Sub-task > Components: Spark Core >Affects Versions: 3.4.0 >Reporter: Yang Jie >Assignee: Apache Spark >Priority: Minor
[jira] [Assigned] (SPARK-41729) Assign name to _LEGACY_ERROR_TEMP_0011
[ https://issues.apache.org/jira/browse/SPARK-41729?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41729: Assignee: (was: Apache Spark) > Assign name to _LEGACY_ERROR_TEMP_0011 > --- > > Key: SPARK-41729 > URL: https://issues.apache.org/jira/browse/SPARK-41729 > Project: Spark > Issue Type: Sub-task > Components: Spark Core >Affects Versions: 3.4.0 >Reporter: Yang Jie >Priority: Minor
[jira] [Updated] (SPARK-41729) Assign name to _LEGACY_ERROR_TEMP_0011
[ https://issues.apache.org/jira/browse/SPARK-41729?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-41729: - Parent: SPARK-37935 Issue Type: Sub-task (was: Improvement) > Assign name to _LEGACY_ERROR_TEMP_0011 > --- > > Key: SPARK-41729 > URL: https://issues.apache.org/jira/browse/SPARK-41729 > Project: Spark > Issue Type: Sub-task > Components: Spark Core >Affects Versions: 3.4.0 >Reporter: Yang Jie >Priority: Minor
[jira] [Commented] (SPARK-41068) Implement `DataFrame.stat.corr`
[ https://issues.apache.org/jira/browse/SPARK-41068?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652173#comment-17652173 ] jiaan.geng commented on SPARK-41068: I will try. > Implement `DataFrame.stat.corr` > --- > > Key: SPARK-41068 > URL: https://issues.apache.org/jira/browse/SPARK-41068 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major
[jira] [Created] (SPARK-41729) Assign name to _LEGACY_ERROR_TEMP_0011
Yang Jie created SPARK-41729: Summary: Assign name to _LEGACY_ERROR_TEMP_0011 Key: SPARK-41729 URL: https://issues.apache.org/jira/browse/SPARK-41729 Project: Spark Issue Type: Improvement Components: Spark Core Affects Versions: 3.4.0 Reporter: Yang Jie
[jira] [Assigned] (SPARK-41728) Implement `unwrap_udt` function
[ https://issues.apache.org/jira/browse/SPARK-41728?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41728: Assignee: Apache Spark > Implement `unwrap_udt` function > --- > > Key: SPARK-41728 > URL: https://issues.apache.org/jira/browse/SPARK-41728 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Apache Spark >Priority: Major
[jira] [Commented] (SPARK-41728) Implement `unwrap_udt` function
[ https://issues.apache.org/jira/browse/SPARK-41728?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652163#comment-17652163 ] Apache Spark commented on SPARK-41728: -- User 'zhengruifeng' has created a pull request for this issue: https://github.com/apache/spark/pull/39234 > Implement `unwrap_udt` function > --- > > Key: SPARK-41728 > URL: https://issues.apache.org/jira/browse/SPARK-41728 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major
[jira] [Assigned] (SPARK-41728) Implement `unwrap_udt` function
[ https://issues.apache.org/jira/browse/SPARK-41728?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41728: Assignee: (was: Apache Spark) > Implement `unwrap_udt` function > --- > > Key: SPARK-41728 > URL: https://issues.apache.org/jira/browse/SPARK-41728 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major
[jira] [Created] (SPARK-41728) Implement `unwrap_udt` function
Ruifeng Zheng created SPARK-41728: - Summary: Implement `unwrap_udt` function Key: SPARK-41728 URL: https://issues.apache.org/jira/browse/SPARK-41728 Project: Spark Issue Type: Sub-task Components: Connect, PySpark Affects Versions: 3.4.0 Reporter: Ruifeng Zheng
[jira] [Assigned] (SPARK-41676) Protobuf serializer for StreamingQueryData
[ https://issues.apache.org/jira/browse/SPARK-41676?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41676: Assignee: (was: Apache Spark) > Protobuf serializer for StreamingQueryData > -- > > Key: SPARK-41676 > URL: https://issues.apache.org/jira/browse/SPARK-41676 > Project: Spark > Issue Type: Sub-task > Components: SQL >Affects Versions: 3.4.0 >Reporter: Yang Jie >Priority: Major
[jira] [Assigned] (SPARK-41676) Protobuf serializer for StreamingQueryData
[ https://issues.apache.org/jira/browse/SPARK-41676?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41676: Assignee: Apache Spark > Protobuf serializer for StreamingQueryData > -- > > Key: SPARK-41676 > URL: https://issues.apache.org/jira/browse/SPARK-41676 > Project: Spark > Issue Type: Sub-task > Components: SQL >Affects Versions: 3.4.0 >Reporter: Yang Jie >Assignee: Apache Spark >Priority: Major
[jira] [Commented] (SPARK-41676) Protobuf serializer for StreamingQueryData
[ https://issues.apache.org/jira/browse/SPARK-41676?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652155#comment-17652155 ] Apache Spark commented on SPARK-41676: -- User 'LuciferYang' has created a pull request for this issue: https://github.com/apache/spark/pull/39233 > Protobuf serializer for StreamingQueryData > -- > > Key: SPARK-41676 > URL: https://issues.apache.org/jira/browse/SPARK-41676 > Project: Spark > Issue Type: Sub-task > Components: SQL >Affects Versions: 3.4.0 >Reporter: Yang Jie >Priority: Major
[jira] [Resolved] (SPARK-41706) pyspark_types_to_proto_types should supports MapType
[ https://issues.apache.org/jira/browse/SPARK-41706?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng resolved SPARK-41706. --- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39213 [https://github.com/apache/spark/pull/39213] > pyspark_types_to_proto_types should supports MapType > > > Key: SPARK-41706 > URL: https://issues.apache.org/jira/browse/SPARK-41706 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: jiaan.geng >Assignee: jiaan.geng >Priority: Major > Fix For: 3.4.0 > > > pyspark_types_to_proto_types doesn't support MapType now.
[jira] [Assigned] (SPARK-41706) pyspark_types_to_proto_types should supports MapType
[ https://issues.apache.org/jira/browse/SPARK-41706?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng reassigned SPARK-41706: - Assignee: jiaan.geng > pyspark_types_to_proto_types should supports MapType > > > Key: SPARK-41706 > URL: https://issues.apache.org/jira/browse/SPARK-41706 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: jiaan.geng >Assignee: jiaan.geng >Priority: Major > > pyspark_types_to_proto_types doesn't support MapType now.
[jira] [Resolved] (SPARK-41724) Implement `call_udf` function
[ https://issues.apache.org/jira/browse/SPARK-41724?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng resolved SPARK-41724. --- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39230 [https://github.com/apache/spark/pull/39230] > Implement `call_udf` function > - > > Key: SPARK-41724 > URL: https://issues.apache.org/jira/browse/SPARK-41724 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major > Fix For: 3.4.0
[jira] [Assigned] (SPARK-41724) Implement `call_udf` function
[ https://issues.apache.org/jira/browse/SPARK-41724?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng reassigned SPARK-41724: - Assignee: Ruifeng Zheng > Implement `call_udf` function > - > > Key: SPARK-41724 > URL: https://issues.apache.org/jira/browse/SPARK-41724 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major
[jira] [Assigned] (SPARK-41473) Implement `functions.format_number`
[ https://issues.apache.org/jira/browse/SPARK-41473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng reassigned SPARK-41473: - Assignee: Ruifeng Zheng > Implement `functions.format_number` > --- > > Key: SPARK-41473 > URL: https://issues.apache.org/jira/browse/SPARK-41473 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Xinrong Meng >Assignee: Ruifeng Zheng >Priority: Major > Fix For: 3.4.0 > > > Currently, `functions.format_number` has data type mismatch issue as shown > below: > {code:python} > >>> from pyspark.sql.connect.session import SparkSession as RemoteSparkSession > >>> cspark = RemoteSparkSession.builder.remote().getOrCreate() > >>> from pyspark.sql.connect import functions as CF > >>> cspark.createDataFrame([(5,)], ['a']).select(CF.format_number('a', 4)) > 22/12/09 14:46:52 ERROR SparkConnectService: Error analyzing plan. > org.apache.spark.sql.AnalysisException: > [DATATYPE_MISMATCH.UNEXPECTED_INPUT_TYPE] Cannot resolve "format_number(a, > 4)" due to data type mismatch: Parameter 2 requires the ("INT" or "STRING") > type, however "4" has the type "BIGINT".; > 'Project [unresolvedalias(format_number(a#2L, 4), None)] > +- Project [0#0L AS a#2L] >+- LocalRelation [0#0L] > ... > {code} > The jira aims to fix the data type mismatch issue and implement > `functions.format_number`.
[jira] [Resolved] (SPARK-41473) Implement `functions.format_number`
[ https://issues.apache.org/jira/browse/SPARK-41473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng resolved SPARK-41473. --- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39229 [https://github.com/apache/spark/pull/39229] > Implement `functions.format_number` > --- > > Key: SPARK-41473 > URL: https://issues.apache.org/jira/browse/SPARK-41473 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Xinrong Meng >Priority: Major > Fix For: 3.4.0 > > > Currently, `functions.format_number` has data type mismatch issue as shown > below: > {code:python} > >>> from pyspark.sql.connect.session import SparkSession as RemoteSparkSession > >>> cspark = RemoteSparkSession.builder.remote().getOrCreate() > >>> from pyspark.sql.connect import functions as CF > >>> cspark.createDataFrame([(5,)], ['a']).select(CF.format_number('a', 4)) > 22/12/09 14:46:52 ERROR SparkConnectService: Error analyzing plan. > org.apache.spark.sql.AnalysisException: > [DATATYPE_MISMATCH.UNEXPECTED_INPUT_TYPE] Cannot resolve "format_number(a, > 4)" due to data type mismatch: Parameter 2 requires the ("INT" or "STRING") > type, however "4" has the type "BIGINT".; > 'Project [unresolvedalias(format_number(a#2L, 4), None)] > +- Project [0#0L AS a#2L] >+- LocalRelation [0#0L] > ... > {code} > The jira aims to fix the data type mismatch issue and implement > `functions.format_number`.
[jira] [Assigned] (SPARK-41723) Implement `sequence` function
[ https://issues.apache.org/jira/browse/SPARK-41723?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng reassigned SPARK-41723: - Assignee: Ruifeng Zheng > Implement `sequence` function > - > > Key: SPARK-41723 > URL: https://issues.apache.org/jira/browse/SPARK-41723 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major
[jira] [Resolved] (SPARK-41723) Implement `sequence` function
[ https://issues.apache.org/jira/browse/SPARK-41723?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng resolved SPARK-41723. --- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39228 [https://github.com/apache/spark/pull/39228] > Implement `sequence` function > - > > Key: SPARK-41723 > URL: https://issues.apache.org/jira/browse/SPARK-41723 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major > Fix For: 3.4.0
[jira] [Commented] (SPARK-41378) Support Column Stats in DS V2
[ https://issues.apache.org/jira/browse/SPARK-41378?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652137#comment-17652137 ] Apache Spark commented on SPARK-41378: -- User 'LuciferYang' has created a pull request for this issue: https://github.com/apache/spark/pull/39232 > Support Column Stats in DS V2 > - > > Key: SPARK-41378 > URL: https://issues.apache.org/jira/browse/SPARK-41378 > Project: Spark > Issue Type: New Feature > Components: SQL >Affects Versions: 3.4.0 >Reporter: Huaxin Gao >Assignee: Huaxin Gao >Priority: Major > Fix For: 3.4.0
[jira] [Updated] (SPARK-41727) ClassCastException when config spark.sql.hive.metastore* properties under jdk17
[ https://issues.apache.org/jira/browse/SPARK-41727?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] kevinshin updated SPARK-41727: -- Attachment: hms-init-error.txt > ClassCastException when config spark.sql.hive.metastore* properties under > jdk17 > --- > > Key: SPARK-41727 > URL: https://issues.apache.org/jira/browse/SPARK-41727 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 3.3.1 > Environment: Apache spark3.3.1 \ HDP3.1.5 with hive 3.1.0 >Reporter: kevinshin >Priority: Critical > Attachments: hms-init-error.txt > > > Apache spark3.3.1 \ HDP3.1.5 with hive 3.1.0 > when config properties about spark.sql.hive.metastore* to use > hive.metastore.version 3.1.2: > *spark.sql.hive.metastore.jars /data/soft/spark3/standalone-metastore/** > *spark.sql.hive.metastore.version 3.1.2* > then start spark-shell with master = local[*] under jdk17 > try to select a hive table, will got error: > 13:44:52.428 [main] ERROR > org.apache.hadoop.hive.metastore.utils.MetaStoreUtils - Got exception: > java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to > class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module > java.base of loader 'bootstrap') > java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to > class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module > java.base of loader 'bootstrap') > at > org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:262) > ~[hive-standalone-metastore-3.1.2.jar:3.1.2]
[jira] [Updated] (SPARK-41727) ClassCastException when config spark.sql.hive.metastore* properties under jdk17
[ https://issues.apache.org/jira/browse/SPARK-41727?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] kevinshin updated SPARK-41727: -- Description: Apache Spark 3.3.1 / HDP 3.1.5 with Hive 3.1.0. When the spark.sql.hive.metastore* properties are configured to use Hive metastore version 3.1.2: *spark.sql.hive.metastore.jars /data/soft/spark3/standalone-metastore/** *spark.sql.hive.metastore.version 3.1.2* and spark-shell is then started with master = local[*] under JDK 17, selecting a Hive table fails with: 13:44:52.428 [main] ERROR org.apache.hadoop.hive.metastore.utils.MetaStoreUtils - Got exception: java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap') java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap') at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:262) ~[hive-standalone-metastore-3.1.2.jar:3.1.2] was: Apache Spark 3.3.1 / HDP 3.1.5 with Hive 3.1.0. When the spark.sql.hive.metastore* properties are configured to use Hive metastore version 3.1.2 !image-2022-12-27-13-54-18-238.png! 
spark-shell is then started with master = local[*] under JDK 17; selecting a Hive table fails with: 13:44:52.428 [main] ERROR org.apache.hadoop.hive.metastore.utils.MetaStoreUtils - Got exception: java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap') java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap') at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:262) ~[hive-standalone-metastore-3.1.2.jar:3.1.2] > ClassCastException when config spark.sql.hive.metastore* properties under > jdk17 > --- > > Key: SPARK-41727 > URL: https://issues.apache.org/jira/browse/SPARK-41727 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 3.3.1 > Environment: Apache Spark 3.3.1 / HDP 3.1.5 with Hive 3.1.0 >Reporter: kevinshin >Priority: Critical > > Apache Spark 3.3.1 / HDP 3.1.5 with Hive 3.1.0. > When the spark.sql.hive.metastore* properties are configured to use > Hive metastore version 3.1.2: > *spark.sql.hive.metastore.jars /data/soft/spark3/standalone-metastore/** > *spark.sql.hive.metastore.version 3.1.2* > and spark-shell is then started with master = local[*] under JDK 17, > selecting a Hive table fails with: > 13:44:52.428 [main] ERROR > org.apache.hadoop.hive.metastore.utils.MetaStoreUtils - Got exception: > java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to > class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module > java.base of loader 'bootstrap') > java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to > class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module > java.base of loader 'bootstrap') > at > 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:262) > ~[hive-standalone-metastore-3.1.2.jar:3.1.2] > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-41727) ClassCastException when config spark.sql.hive.metastore* properties under jdk17
kevinshin created SPARK-41727: - Summary: ClassCastException when config spark.sql.hive.metastore* properties under jdk17 Key: SPARK-41727 URL: https://issues.apache.org/jira/browse/SPARK-41727 Project: Spark Issue Type: Bug Components: SQL Affects Versions: 3.3.1 Environment: Apache Spark 3.3.1 / HDP 3.1.5 with Hive 3.1.0 Reporter: kevinshin Apache Spark 3.3.1 / HDP 3.1.5 with Hive 3.1.0. When the spark.sql.hive.metastore* properties are configured to use Hive metastore version 3.1.2 !image-2022-12-27-13-54-18-238.png! and spark-shell is then started with master = local[*] under JDK 17, selecting a Hive table fails with: 13:44:52.428 [main] ERROR org.apache.hadoop.hive.metastore.utils.MetaStoreUtils - Got exception: java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap') java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap') at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:262) ~[hive-standalone-metastore-3.1.2.jar:3.1.2] -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-41726) Remove OptimizedCreateHiveTableAsSelectCommand
XiDuo You created SPARK-41726: - Summary: Remove OptimizedCreateHiveTableAsSelectCommand Key: SPARK-41726 URL: https://issues.apache.org/jira/browse/SPARK-41726 Project: Spark Issue Type: Improvement Components: SQL Affects Versions: 3.4.0 Reporter: XiDuo You CTAS uses a nested execution to write the data, so OptimizedCreateHiveTableAsSelectCommand is unnecessary. The inner InsertIntoHiveTable will be converted to InsertIntoHadoopFsRelationCommand when possible. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41529) Implement SparkSession.stop
[ https://issues.apache.org/jira/browse/SPARK-41529?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652127#comment-17652127 ] Apache Spark commented on SPARK-41529: -- User 'HyukjinKwon' has created a pull request for this issue: https://github.com/apache/spark/pull/39231 > Implement SparkSession.stop > --- > > Key: SPARK-41529 > URL: https://issues.apache.org/jira/browse/SPARK-41529 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41529) Implement SparkSession.stop
[ https://issues.apache.org/jira/browse/SPARK-41529?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41529: Assignee: (was: Apache Spark) > Implement SparkSession.stop > --- > > Key: SPARK-41529 > URL: https://issues.apache.org/jira/browse/SPARK-41529 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41529) Implement SparkSession.stop
[ https://issues.apache.org/jira/browse/SPARK-41529?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41529: Assignee: Apache Spark > Implement SparkSession.stop > --- > > Key: SPARK-41529 > URL: https://issues.apache.org/jira/browse/SPARK-41529 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Apache Spark >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41529) Implement SparkSession.stop
[ https://issues.apache.org/jira/browse/SPARK-41529?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652126#comment-17652126 ] Apache Spark commented on SPARK-41529: -- User 'HyukjinKwon' has created a pull request for this issue: https://github.com/apache/spark/pull/39231 > Implement SparkSession.stop > --- > > Key: SPARK-41529 > URL: https://issues.apache.org/jira/browse/SPARK-41529 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-41725) Remove the workaround of sql(...).collect back in PySpark tests
Hyukjin Kwon created SPARK-41725: Summary: Remove the workaround of sql(...).collect back in PySpark tests Key: SPARK-41725 URL: https://issues.apache.org/jira/browse/SPARK-41725 Project: Spark Issue Type: Sub-task Components: Connect, PySpark, Tests Affects Versions: 3.4.0 Reporter: Hyukjin Kwon See https://github.com/apache/spark/pull/39224/files#r1057436437 We should not have to call `collect` for every `sql`, but Spark Connect currently requires it. We should remove these workarounds. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-41722) Implement time window functions
[ https://issues.apache.org/jira/browse/SPARK-41722?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng resolved SPARK-41722. --- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39227 [https://github.com/apache/spark/pull/39227] > Implement time window functions > --- > > Key: SPARK-41722 > URL: https://issues.apache.org/jira/browse/SPARK-41722 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major > Fix For: 3.4.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41722) Implement time window functions
[ https://issues.apache.org/jira/browse/SPARK-41722?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng reassigned SPARK-41722: - Assignee: Ruifeng Zheng > Implement time window functions > --- > > Key: SPARK-41722 > URL: https://issues.apache.org/jira/browse/SPARK-41722 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41653) Test parity: enable doctests in Spark Connect
[ https://issues.apache.org/jira/browse/SPARK-41653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652119#comment-17652119 ] Hyukjin Kwon commented on SPARK-41653: -- Please go ahead. Thanks for taking a look [~techaddict] > Test parity: enable doctests in Spark Connect > - > > Key: SPARK-41653 > URL: https://issues.apache.org/jira/browse/SPARK-41653 > Project: Spark > Issue Type: Umbrella > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major > > We should actually run the doctests of Spark Connect. > We should add something like > https://github.com/apache/spark/blob/master/python/pyspark/sql/column.py#L1227-L1247 > to Spark Connect modules, and add the module into > https://github.com/apache/spark/blob/master/dev/sparktestsupport/modules.py#L507 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41724) Implement `call_udf` function
[ https://issues.apache.org/jira/browse/SPARK-41724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652117#comment-17652117 ] Apache Spark commented on SPARK-41724: -- User 'zhengruifeng' has created a pull request for this issue: https://github.com/apache/spark/pull/39230 > Implement `call_udf` function > - > > Key: SPARK-41724 > URL: https://issues.apache.org/jira/browse/SPARK-41724 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41724) Implement `call_udf` function
[ https://issues.apache.org/jira/browse/SPARK-41724?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41724: Assignee: Apache Spark > Implement `call_udf` function > - > > Key: SPARK-41724 > URL: https://issues.apache.org/jira/browse/SPARK-41724 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Apache Spark >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41724) Implement `call_udf` function
[ https://issues.apache.org/jira/browse/SPARK-41724?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41724: Assignee: (was: Apache Spark) > Implement `call_udf` function > - > > Key: SPARK-41724 > URL: https://issues.apache.org/jira/browse/SPARK-41724 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-41724) Implement `call_udf` function
Ruifeng Zheng created SPARK-41724: - Summary: Implement `call_udf` function Key: SPARK-41724 URL: https://issues.apache.org/jira/browse/SPARK-41724 Project: Spark Issue Type: Sub-task Components: Connect, PySpark Affects Versions: 3.4.0 Reporter: Ruifeng Zheng -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41473) Implement `functions.format_number`
[ https://issues.apache.org/jira/browse/SPARK-41473?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652108#comment-17652108 ] Apache Spark commented on SPARK-41473: -- User 'zhengruifeng' has created a pull request for this issue: https://github.com/apache/spark/pull/39229 > Implement `functions.format_number` > --- > > Key: SPARK-41473 > URL: https://issues.apache.org/jira/browse/SPARK-41473 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Xinrong Meng >Priority: Major > > Currently, `functions.format_number` has data type mismatch issue as shown > below: > {code:python} > >>> from pyspark.sql.connect.session import SparkSession as RemoteSparkSession > >>> cspark = RemoteSparkSession.builder.remote().getOrCreate() > >>> from pyspark.sql.connect import functions as CF > >>> cspark.createDataFrame([(5,)], ['a']).select(CF.format_number('a', 4)) > 22/12/09 14:46:52 ERROR SparkConnectService: Error analyzing plan. > org.apache.spark.sql.AnalysisException: > [DATATYPE_MISMATCH.UNEXPECTED_INPUT_TYPE] Cannot resolve "format_number(a, > 4)" due to data type mismatch: Parameter 2 requires the ("INT" or "STRING") > type, however "4" has the type "BIGINT".; > 'Project [unresolvedalias(format_number(a#2L, 4), None)] > +- Project [0#0L AS a#2L] >+- LocalRelation [0#0L] > ... > {code} > The jira aims to fix the data type mismatch issue and implement > `functions.format_number`. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41473) Implement `functions.format_number`
[ https://issues.apache.org/jira/browse/SPARK-41473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41473: Assignee: (was: Apache Spark) > Implement `functions.format_number` > --- > > Key: SPARK-41473 > URL: https://issues.apache.org/jira/browse/SPARK-41473 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Xinrong Meng >Priority: Major > > Currently, `functions.format_number` has data type mismatch issue as shown > below: > {code:python} > >>> from pyspark.sql.connect.session import SparkSession as RemoteSparkSession > >>> cspark = RemoteSparkSession.builder.remote().getOrCreate() > >>> from pyspark.sql.connect import functions as CF > >>> cspark.createDataFrame([(5,)], ['a']).select(CF.format_number('a', 4)) > 22/12/09 14:46:52 ERROR SparkConnectService: Error analyzing plan. > org.apache.spark.sql.AnalysisException: > [DATATYPE_MISMATCH.UNEXPECTED_INPUT_TYPE] Cannot resolve "format_number(a, > 4)" due to data type mismatch: Parameter 2 requires the ("INT" or "STRING") > type, however "4" has the type "BIGINT".; > 'Project [unresolvedalias(format_number(a#2L, 4), None)] > +- Project [0#0L AS a#2L] >+- LocalRelation [0#0L] > ... > {code} > The jira aims to fix the data type mismatch issue and implement > `functions.format_number`. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41473) Implement `functions.format_number`
[ https://issues.apache.org/jira/browse/SPARK-41473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41473: Assignee: Apache Spark > Implement `functions.format_number` > --- > > Key: SPARK-41473 > URL: https://issues.apache.org/jira/browse/SPARK-41473 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Xinrong Meng >Assignee: Apache Spark >Priority: Major > > Currently, `functions.format_number` has data type mismatch issue as shown > below: > {code:python} > >>> from pyspark.sql.connect.session import SparkSession as RemoteSparkSession > >>> cspark = RemoteSparkSession.builder.remote().getOrCreate() > >>> from pyspark.sql.connect import functions as CF > >>> cspark.createDataFrame([(5,)], ['a']).select(CF.format_number('a', 4)) > 22/12/09 14:46:52 ERROR SparkConnectService: Error analyzing plan. > org.apache.spark.sql.AnalysisException: > [DATATYPE_MISMATCH.UNEXPECTED_INPUT_TYPE] Cannot resolve "format_number(a, > 4)" due to data type mismatch: Parameter 2 requires the ("INT" or "STRING") > type, however "4" has the type "BIGINT".; > 'Project [unresolvedalias(format_number(a#2L, 4), None)] > +- Project [0#0L AS a#2L] >+- LocalRelation [0#0L] > ... > {code} > The jira aims to fix the data type mismatch issue and implement > `functions.format_number`. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41473) Implement `functions.format_number`
[ https://issues.apache.org/jira/browse/SPARK-41473?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652107#comment-17652107 ] Apache Spark commented on SPARK-41473: -- User 'zhengruifeng' has created a pull request for this issue: https://github.com/apache/spark/pull/39229 > Implement `functions.format_number` > --- > > Key: SPARK-41473 > URL: https://issues.apache.org/jira/browse/SPARK-41473 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Xinrong Meng >Priority: Major > > Currently, `functions.format_number` has data type mismatch issue as shown > below: > {code:python} > >>> from pyspark.sql.connect.session import SparkSession as RemoteSparkSession > >>> cspark = RemoteSparkSession.builder.remote().getOrCreate() > >>> from pyspark.sql.connect import functions as CF > >>> cspark.createDataFrame([(5,)], ['a']).select(CF.format_number('a', 4)) > 22/12/09 14:46:52 ERROR SparkConnectService: Error analyzing plan. > org.apache.spark.sql.AnalysisException: > [DATATYPE_MISMATCH.UNEXPECTED_INPUT_TYPE] Cannot resolve "format_number(a, > 4)" due to data type mismatch: Parameter 2 requires the ("INT" or "STRING") > type, however "4" has the type "BIGINT".; > 'Project [unresolvedalias(format_number(a#2L, 4), None)] > +- Project [0#0L AS a#2L] >+- LocalRelation [0#0L] > ... > {code} > The jira aims to fix the data type mismatch issue and implement > `functions.format_number`. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
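For reference, the expected behavior of `format_number(col, d)` is formatting the value with thousands separators and `d` decimal places. A pure-Python sketch of that output, not Spark's implementation (Spark delegates to Java's DecimalFormat, whose tie-rounding can differ):

```python
def format_number(value, d):
    # Format `value` with comma grouping and exactly d decimal places,
    # mirroring the documented output of Spark's format_number.
    if d < 0:
        raise ValueError("d must be >= 0")
    return format(value, f",.{d}f")
```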
[jira] [Assigned] (SPARK-41235) High-order function: array_compact
[ https://issues.apache.org/jira/browse/SPARK-41235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wenchen Fan reassigned SPARK-41235: --- Assignee: Sandeep Katta > High-order function: array_compact > -- > > Key: SPARK-41235 > URL: https://issues.apache.org/jira/browse/SPARK-41235 > Project: Spark > Issue Type: Sub-task > Components: PySpark, SQL >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Sandeep Katta >Priority: Major > > refer to > https://docs.snowflake.com/en/developer-guide/snowpark/reference/python/api/snowflake.snowpark.functions.array_compact.html -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-41235) High-order function: array_compact
[ https://issues.apache.org/jira/browse/SPARK-41235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wenchen Fan resolved SPARK-41235. - Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 38874 [https://github.com/apache/spark/pull/38874] > High-order function: array_compact > -- > > Key: SPARK-41235 > URL: https://issues.apache.org/jira/browse/SPARK-41235 > Project: Spark > Issue Type: Sub-task > Components: PySpark, SQL >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Sandeep Katta >Priority: Major > Fix For: 3.4.0 > > > refer to > https://docs.snowflake.com/en/developer-guide/snowpark/reference/python/api/snowflake.snowpark.functions.array_compact.html -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
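The semantics of `array_compact` (dropping NULL elements from an array, as in the linked Snowflake function) can be sketched in plain Python; this is an illustration only, not Spark's implementation:

```python
def array_compact(arr):
    # A NULL input array stays NULL; otherwise drop NULL (None) elements.
    if arr is None:
        return None
    return [x for x in arr if x is not None]
```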
[jira] [Assigned] (SPARK-41723) Implement `sequence` function
[ https://issues.apache.org/jira/browse/SPARK-41723?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41723: Assignee: (was: Apache Spark) > Implement `sequence` function > - > > Key: SPARK-41723 > URL: https://issues.apache.org/jira/browse/SPARK-41723 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41723) Implement `sequence` function
[ https://issues.apache.org/jira/browse/SPARK-41723?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41723: Assignee: Apache Spark > Implement `sequence` function > - > > Key: SPARK-41723 > URL: https://issues.apache.org/jira/browse/SPARK-41723 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Apache Spark >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41723) Implement `sequence` function
[ https://issues.apache.org/jira/browse/SPARK-41723?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652105#comment-17652105 ] Apache Spark commented on SPARK-41723: -- User 'zhengruifeng' has created a pull request for this issue: https://github.com/apache/spark/pull/39228 > Implement `sequence` function > - > > Key: SPARK-41723 > URL: https://issues.apache.org/jira/browse/SPARK-41723 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-41723) Implement `sequence` function
Ruifeng Zheng created SPARK-41723: - Summary: Implement `sequence` function Key: SPARK-41723 URL: https://issues.apache.org/jira/browse/SPARK-41723 Project: Spark Issue Type: Sub-task Components: Connect, PySpark Affects Versions: 3.4.0 Reporter: Ruifeng Zheng -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
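For integer arguments, the SQL `sequence(start, stop, step)` function produces an arithmetic series that includes `stop`, with the step defaulting to 1 or -1 depending on direction. A hedged pure-Python sketch of those semantics (Spark's version also handles dates and timestamps, which this omits):

```python
def sequence(start, stop, step=None):
    # Default step follows the direction from start to stop.
    if step is None:
        step = 1 if stop >= start else -1
    if step == 0:
        raise ValueError("step must be non-zero")
    out = []
    x = start
    # Inclusive of `stop` when the series lands on it exactly.
    while (step > 0 and x <= stop) or (step < 0 and x >= stop):
        out.append(x)
        x += step
    return out
```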
[jira] [Resolved] (SPARK-41690) Introduce AgnosticEncoders
[ https://issues.apache.org/jira/browse/SPARK-41690?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wenchen Fan resolved SPARK-41690. - Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39186 [https://github.com/apache/spark/pull/39186] > Introduce AgnosticEncoders > -- > > Key: SPARK-41690 > URL: https://issues.apache.org/jira/browse/SPARK-41690 > Project: Spark > Issue Type: Task > Components: Connect, SQL >Affects Versions: 3.4.0 >Reporter: Herman van Hövell >Assignee: Herman van Hövell >Priority: Major > Fix For: 3.4.0 > > > For the Spark Connect Scala Client we need Encoders. We cannot use the > current ExpressionEncoder infrastructure because of the following reasons: > * Connect uses Arrow instead of internal rows > * We don't want to have a dependency on Catalyst in the client. > * We don't want to be sending the expressions that expression encoders use > over the wire. > An additional challenge is that we cannot send TypeTags over the wire. > The proposal is to create an intermediate encoder. This encoder describes > exactly how which external type maps to an internal type, but it does not > prescribe which format is used. > An upside of this approach is that we can reduce the cost of encoder creation > by quite a margin. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
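The "intermediate encoder" idea described above, a description of how an external type maps to an internal type without prescribing the wire format (InternalRow vs Arrow), can be sketched roughly as follows. The names and fields are purely illustrative, not the actual Spark API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AgnosticEncoder:
    # What the user-facing (client-side) type is...
    external_type: type
    # ...and which internal SQL type it maps to; the concrete
    # serialization format is decided elsewhere.
    internal_type: str
    nullable: bool = False

# Example instances for two primitive mappings.
LONG_ENCODER = AgnosticEncoder(int, "bigint")
STRING_ENCODER = AgnosticEncoder(str, "string", nullable=True)
```

Because such descriptors are plain data, they can be constructed cheaply and shared, which is where the reduced encoder-creation cost mentioned in the issue would come from.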
[jira] [Updated] (SPARK-41722) Implement time window functions
[ https://issues.apache.org/jira/browse/SPARK-41722?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ruifeng Zheng updated SPARK-41722: -- Summary: Implement time window functions (was: Implement 3 missing time window functions) > Implement time window functions > --- > > Key: SPARK-41722 > URL: https://issues.apache.org/jira/browse/SPARK-41722 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41722) Implement 3 missing time window functions
[ https://issues.apache.org/jira/browse/SPARK-41722?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652099#comment-17652099 ] Apache Spark commented on SPARK-41722: -- User 'zhengruifeng' has created a pull request for this issue: https://github.com/apache/spark/pull/39227 > Implement 3 missing time window functions > - > > Key: SPARK-41722 > URL: https://issues.apache.org/jira/browse/SPARK-41722 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41722) Implement 3 missing time window functions
[ https://issues.apache.org/jira/browse/SPARK-41722?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41722: Assignee: Apache Spark > Implement 3 missing time window functions > - > > Key: SPARK-41722 > URL: https://issues.apache.org/jira/browse/SPARK-41722 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Assignee: Apache Spark >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41722) Implement 3 missing time window functions
[ https://issues.apache.org/jira/browse/SPARK-41722?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652098#comment-17652098 ] Apache Spark commented on SPARK-41722: -- User 'zhengruifeng' has created a pull request for this issue: https://github.com/apache/spark/pull/39227 > Implement 3 missing time window functions > - > > Key: SPARK-41722 > URL: https://issues.apache.org/jira/browse/SPARK-41722 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41722) Implement 3 missing time window functions
[ https://issues.apache.org/jira/browse/SPARK-41722?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41722: Assignee: (was: Apache Spark) > Implement 3 missing time window functions > - > > Key: SPARK-41722 > URL: https://issues.apache.org/jira/browse/SPARK-41722 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 3.4.0 >Reporter: Ruifeng Zheng >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41694) Add new config to clean up `spark.ui.store.path` directory when SparkContext.stop()
[ https://issues.apache.org/jira/browse/SPARK-41694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41694: Assignee: Apache Spark > Add new config to clean up `spark.ui.store.path` directory when > SparkContext.stop() > --- > > Key: SPARK-41694 > URL: https://issues.apache.org/jira/browse/SPARK-41694 > Project: Spark > Issue Type: Sub-task > Components: Spark Core >Affects Versions: 3.4.0 >Reporter: Yang Jie >Assignee: Apache Spark >Priority: Major > > The {{spark.ui.store.path}} directory is not cleaned up when > {{SparkContext.stop()}} is called. As a result: > # The disk space occupied by the {{spark.ui.store.path}} directory > will continue to grow. > # When submitting a new App that reuses the {{spark.ui.store.path}} directory, > we will see content related to the previous App, which is a bit strange -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41694) Add new config to clean up `spark.ui.store.path` directory when SparkContext.stop()
[ https://issues.apache.org/jira/browse/SPARK-41694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41694: Assignee: (was: Apache Spark) > Add new config to clean up `spark.ui.store.path` directory when > SparkContext.stop() > --- > > Key: SPARK-41694 > URL: https://issues.apache.org/jira/browse/SPARK-41694 > Project: Spark > Issue Type: Sub-task > Components: Spark Core >Affects Versions: 3.4.0 >Reporter: Yang Jie >Priority: Major > > The {{spark.ui.store.path}} directory is not cleaned up when > {{SparkContext.stop()}} is called. As a result: > # The disk space occupied by the {{spark.ui.store.path}} directory > will continue to grow. > # When submitting a new App that reuses the {{spark.ui.store.path}} directory, > we will see content related to the previous App, which is a bit strange -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41694) Add new config to clean up `spark.ui.store.path` directory when SparkContext.stop()
[ https://issues.apache.org/jira/browse/SPARK-41694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652096#comment-17652096 ] Apache Spark commented on SPARK-41694: -- User 'LuciferYang' has created a pull request for this issue: https://github.com/apache/spark/pull/39226 > Add new config to clean up `spark.ui.store.path` directory when > SparkContext.stop() > --- > > Key: SPARK-41694 > URL: https://issues.apache.org/jira/browse/SPARK-41694 > Project: Spark > Issue Type: Sub-task > Components: Spark Core >Affects Versions: 3.4.0 >Reporter: Yang Jie >Priority: Major > > The {{spark.ui.store.path}} directory is not cleaned up when > {{SparkContext.stop()}} is called. As a result: > # The disk space occupied by the {{spark.ui.store.path}} directory > will continue to grow. > # When submitting a new App that reuses the {{spark.ui.store.path}} directory, > we will see content related to the previous App, which is a bit strange -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41694) Add new config to clean up `spark.ui.store.path` directory when SparkContext.stop()
[ https://issues.apache.org/jira/browse/SPARK-41694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652095#comment-17652095 ] Apache Spark commented on SPARK-41694: -- User 'LuciferYang' has created a pull request for this issue: https://github.com/apache/spark/pull/39226 > Add new config to clean up `spark.ui.store.path` directory when > SparkContext.stop() > --- > > Key: SPARK-41694 > URL: https://issues.apache.org/jira/browse/SPARK-41694 > Project: Spark > Issue Type: Sub-task > Components: Spark Core >Affects Versions: 3.4.0 >Reporter: Yang Jie >Priority: Major > > The {{spark.ui.store.path}} directory is not cleaned up when > {{SparkContext.stop()}} is called. As a result: > # The disk space occupied by the {{spark.ui.store.path}} directory > will continue to grow. > # When submitting a new App that reuses the {{spark.ui.store.path}} directory, > we will see content related to the previous App, which is a bit strange -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-41722) Implement 3 missing time window functions
Ruifeng Zheng created SPARK-41722: - Summary: Implement 3 missing time window functions Key: SPARK-41722 URL: https://issues.apache.org/jira/browse/SPARK-41722 Project: Spark Issue Type: Sub-task Components: Connect, PySpark Affects Versions: 3.4.0 Reporter: Ruifeng Zheng -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41653) Test parity: enable doctests in Spark Connect
[ https://issues.apache.org/jira/browse/SPARK-41653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652074#comment-17652074 ] Sandeep Singh commented on SPARK-41653: --- [~gurwls223] I can work on some of these > Test parity: enable doctests in Spark Connect > - > > Key: SPARK-41653 > URL: https://issues.apache.org/jira/browse/SPARK-41653 > Project: Spark > Issue Type: Umbrella > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major > > We should actually run the doctests of Spark Connect. > We should add something like > https://github.com/apache/spark/blob/master/python/pyspark/sql/column.py#L1227-L1247 > to Spark Connect modules, and add the module into > https://github.com/apache/spark/blob/master/dev/sparktestsupport/modules.py#L507 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41654) Enable doctests in pyspark.sql.connect.window
[ https://issues.apache.org/jira/browse/SPARK-41654?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41654: Assignee: (was: Apache Spark) > Enable doctests in pyspark.sql.connect.window > - > > Key: SPARK-41654 > URL: https://issues.apache.org/jira/browse/SPARK-41654 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41654) Enable doctests in pyspark.sql.connect.window
[ https://issues.apache.org/jira/browse/SPARK-41654?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41654: Assignee: Apache Spark > Enable doctests in pyspark.sql.connect.window > - > > Key: SPARK-41654 > URL: https://issues.apache.org/jira/browse/SPARK-41654 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Apache Spark >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41654) Enable doctests in pyspark.sql.connect.window
[ https://issues.apache.org/jira/browse/SPARK-41654?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652073#comment-17652073 ] Apache Spark commented on SPARK-41654: -- User 'techaddict' has created a pull request for this issue: https://github.com/apache/spark/pull/39225 > Enable doctests in pyspark.sql.connect.window > - > > Key: SPARK-41654 > URL: https://issues.apache.org/jira/browse/SPARK-41654 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41717) Implement the command logic for print and _repr_html_
[ https://issues.apache.org/jira/browse/SPARK-41717?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hyukjin Kwon reassigned SPARK-41717: Assignee: Hyukjin Kwon > Implement the command logic for print and _repr_html_ > - > > Key: SPARK-41717 > URL: https://issues.apache.org/jira/browse/SPARK-41717 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major > > We should deduplicate the implementation of print and _repr_html_ at > pyspark.sql.connect.plan.LogicalPlan -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-41717) Implement the command logic for print and _repr_html_
[ https://issues.apache.org/jira/browse/SPARK-41717?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hyukjin Kwon resolved SPARK-41717. -- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39223 [https://github.com/apache/spark/pull/39223] > Implement the command logic for print and _repr_html_ > - > > Key: SPARK-41717 > URL: https://issues.apache.org/jira/browse/SPARK-41717 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major > Fix For: 3.4.0 > > > We should deduplicate the implementation of print and _repr_html_ at > pyspark.sql.connect.plan.LogicalPlan -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
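The deduplication SPARK-41717 asks for is a common pattern: both the console renderer and the notebook renderer consume one shared source of lines. The sketch below is a toy stand-in; `LogicalPlan`, `_explain_lines`, and the method names are illustrative and do not reflect the real `pyspark.sql.connect.plan.LogicalPlan` API.

```python
import html

class LogicalPlan:
    """Toy stand-in for a plan node with two renderers sharing one core."""

    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def _explain_lines(self, indent=0):
        # Single source of truth that both renderers consume.
        yield "  " * indent + self.name
        for child in self.children:
            yield from child._explain_lines(indent + 1)

    def print(self):
        # Plain-text rendering for consoles.
        print("\n".join(self._explain_lines()))

    def _repr_html_(self):
        # HTML rendering for notebooks, built from the same lines.
        body = "<br/>".join(html.escape(l) for l in self._explain_lines())
        return f"<pre>{body}</pre>"

plan = LogicalPlan("Project", [LogicalPlan("Filter", [LogicalPlan("Read")])])
plan.print()
```

Because both `print` and `_repr_html_` delegate to `_explain_lines`, fixing the tree traversal in one place fixes both outputs, which is the point of the deduplication.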
[jira] [Assigned] (SPARK-41721) Enable doctests in pyspark.sql.connect.catalog
[ https://issues.apache.org/jira/browse/SPARK-41721?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41721: Assignee: (was: Apache Spark) > Enable doctests in pyspark.sql.connect.catalog > -- > > Key: SPARK-41721 > URL: https://issues.apache.org/jira/browse/SPARK-41721 > Project: Spark > Issue Type: Sub-task > Components: Connect, Tests >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41721) Enable doctests in pyspark.sql.connect.catalog
[ https://issues.apache.org/jira/browse/SPARK-41721?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652071#comment-17652071 ] Apache Spark commented on SPARK-41721: -- User 'HyukjinKwon' has created a pull request for this issue: https://github.com/apache/spark/pull/39224 > Enable doctests in pyspark.sql.connect.catalog > -- > > Key: SPARK-41721 > URL: https://issues.apache.org/jira/browse/SPARK-41721 > Project: Spark > Issue Type: Sub-task > Components: Connect, Tests >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41721) Enable doctests in pyspark.sql.connect.catalog
[ https://issues.apache.org/jira/browse/SPARK-41721?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41721: Assignee: Apache Spark > Enable doctests in pyspark.sql.connect.catalog > -- > > Key: SPARK-41721 > URL: https://issues.apache.org/jira/browse/SPARK-41721 > Project: Spark > Issue Type: Sub-task > Components: Connect, Tests >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Apache Spark >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-41721) Enable doctests in pyspark.sql.connect.catalog
Hyukjin Kwon created SPARK-41721: Summary: Enable doctests in pyspark.sql.connect.catalog Key: SPARK-41721 URL: https://issues.apache.org/jira/browse/SPARK-41721 Project: Spark Issue Type: Sub-task Components: Connect, Tests Affects Versions: 3.4.0 Reporter: Hyukjin Kwon -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41720) Rename UnresolvedFunc to UnresolvedFunctionName
[ https://issues.apache.org/jira/browse/SPARK-41720?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hyukjin Kwon reassigned SPARK-41720: Assignee: Wenchen Fan > Rename UnresolvedFunc to UnresolvedFunctionName > --- > > Key: SPARK-41720 > URL: https://issues.apache.org/jira/browse/SPARK-41720 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 3.4.0 >Reporter: Wenchen Fan >Assignee: Wenchen Fan >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-41720) Rename UnresolvedFunc to UnresolvedFunctionName
[ https://issues.apache.org/jira/browse/SPARK-41720?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hyukjin Kwon resolved SPARK-41720. -- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39222 [https://github.com/apache/spark/pull/39222] > Rename UnresolvedFunc to UnresolvedFunctionName > --- > > Key: SPARK-41720 > URL: https://issues.apache.org/jira/browse/SPARK-41720 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 3.4.0 >Reporter: Wenchen Fan >Assignee: Wenchen Fan >Priority: Major > Fix For: 3.4.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41711) Upgrade protobuf-java to 3.21.12
[ https://issues.apache.org/jira/browse/SPARK-41711?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hyukjin Kwon reassigned SPARK-41711: Assignee: Yang Jie > Upgrade protobuf-java to 3.21.12 > > > Key: SPARK-41711 > URL: https://issues.apache.org/jira/browse/SPARK-41711 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 3.4.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > > https://github.com/protocolbuffers/protobuf/releases/tag/v21.12 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-41711) Upgrade protobuf-java to 3.21.12
[ https://issues.apache.org/jira/browse/SPARK-41711?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hyukjin Kwon resolved SPARK-41711. -- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39217 [https://github.com/apache/spark/pull/39217] > Upgrade protobuf-java to 3.21.12 > > > Key: SPARK-41711 > URL: https://issues.apache.org/jira/browse/SPARK-41711 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 3.4.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Fix For: 3.4.0 > > > https://github.com/protocolbuffers/protobuf/releases/tag/v21.12 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-41707) Implement initial Catalog.* API
[ https://issues.apache.org/jira/browse/SPARK-41707?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hyukjin Kwon resolved SPARK-41707. -- Fix Version/s: 3.4.0 Resolution: Fixed Issue resolved by pull request 39214 [https://github.com/apache/spark/pull/39214] > Implement initial Catalog.* API > --- > > Key: SPARK-41707 > URL: https://issues.apache.org/jira/browse/SPARK-41707 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Hyukjin Kwon >Priority: Major > Fix For: 3.4.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41717) Implement the command logic for print and _repr_html_
[ https://issues.apache.org/jira/browse/SPARK-41717?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41717: Assignee: (was: Apache Spark) > Implement the command logic for print and _repr_html_ > - > > Key: SPARK-41717 > URL: https://issues.apache.org/jira/browse/SPARK-41717 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > > We should deduplicate the implementation of print and _repr_html_ at > pyspark.sql.connect.plan.LogicalPlan -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41717) Implement the command logic for print and _repr_html_
[ https://issues.apache.org/jira/browse/SPARK-41717?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652018#comment-17652018 ] Apache Spark commented on SPARK-41717: -- User 'HyukjinKwon' has created a pull request for this issue: https://github.com/apache/spark/pull/39223 > Implement the command logic for print and _repr_html_ > - > > Key: SPARK-41717 > URL: https://issues.apache.org/jira/browse/SPARK-41717 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > > We should deduplicate the implementation of print and _repr_html_ at > pyspark.sql.connect.plan.LogicalPlan -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41717) Implement the command logic for print and _repr_html_
[ https://issues.apache.org/jira/browse/SPARK-41717?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41717: Assignee: Apache Spark > Implement the command logic for print and _repr_html_ > - > > Key: SPARK-41717 > URL: https://issues.apache.org/jira/browse/SPARK-41717 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Assignee: Apache Spark >Priority: Major > > We should deduplicate the implementation of print and _repr_html_ at > pyspark.sql.connect.plan.LogicalPlan -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41717) Implement the command logic for print and _repr_html_
[ https://issues.apache.org/jira/browse/SPARK-41717?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652017#comment-17652017 ] Apache Spark commented on SPARK-41717: -- User 'HyukjinKwon' has created a pull request for this issue: https://github.com/apache/spark/pull/39223 > Implement the command logic for print and _repr_html_ > - > > Key: SPARK-41717 > URL: https://issues.apache.org/jira/browse/SPARK-41717 > Project: Spark > Issue Type: Sub-task > Components: Connect >Affects Versions: 3.4.0 >Reporter: Hyukjin Kwon >Priority: Major > > We should deduplicate the implementation of print and _repr_html_ at > pyspark.sql.connect.plan.LogicalPlan -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41720) Rename UnresolvedFunc to UnresolvedFunctionName
[ https://issues.apache.org/jira/browse/SPARK-41720?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652016#comment-17652016 ] Apache Spark commented on SPARK-41720: -- User 'cloud-fan' has created a pull request for this issue: https://github.com/apache/spark/pull/39222 > Rename UnresolvedFunc to UnresolvedFunctionName > --- > > Key: SPARK-41720 > URL: https://issues.apache.org/jira/browse/SPARK-41720 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 3.4.0 >Reporter: Wenchen Fan >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41720) Rename UnresolvedFunc to UnresolvedFunctionName
[ https://issues.apache.org/jira/browse/SPARK-41720?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652015#comment-17652015 ] Apache Spark commented on SPARK-41720: -- User 'cloud-fan' has created a pull request for this issue: https://github.com/apache/spark/pull/39222 > Rename UnresolvedFunc to UnresolvedFunctionName > --- > > Key: SPARK-41720 > URL: https://issues.apache.org/jira/browse/SPARK-41720 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 3.4.0 >Reporter: Wenchen Fan >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41720) Rename UnresolvedFunc to UnresolvedFunctionName
[ https://issues.apache.org/jira/browse/SPARK-41720?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41720: Assignee: Apache Spark > Rename UnresolvedFunc to UnresolvedFunctionName > --- > > Key: SPARK-41720 > URL: https://issues.apache.org/jira/browse/SPARK-41720 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 3.4.0 >Reporter: Wenchen Fan >Assignee: Apache Spark >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41720) Rename UnresolvedFunc to UnresolvedFunctionName
[ https://issues.apache.org/jira/browse/SPARK-41720?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41720: Assignee: (was: Apache Spark) > Rename UnresolvedFunc to UnresolvedFunctionName > --- > > Key: SPARK-41720 > URL: https://issues.apache.org/jira/browse/SPARK-41720 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 3.4.0 >Reporter: Wenchen Fan >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41719) Spark SSLOptions sub settings should be set only when ssl is enabled
[ https://issues.apache.org/jira/browse/SPARK-41719?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41719: Assignee: (was: Apache Spark) > Spark SSLOptions sub settings should be set only when ssl is enabled > > > Key: SPARK-41719 > URL: https://issues.apache.org/jira/browse/SPARK-41719 > Project: Spark > Issue Type: Improvement > Components: Spark Core >Affects Versions: 3.2.4 >Reporter: Shrikant Prasad >Priority: Major > > If ${ns}.enabled is false, there is no point in setting the rest of the ${ns}.* > settings in SSLOptions, as this performs unnecessary operations to set > properties that will be disregarded. > For example: ${ns} = spark.ssl > As per SSLOptions, > * SSLOptions is intended to provide the maximum common set of SSL settings, > which are supported > * by the protocol, which it can generate the configuration for. > * > * @param enabled enables or disables SSL; *if it is set to false, the rest > of the settings are disregarded* -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-41719) Spark SSLOptions sub settings should be set only when ssl is enabled
[ https://issues.apache.org/jira/browse/SPARK-41719?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Apache Spark reassigned SPARK-41719: Assignee: Apache Spark > Spark SSLOptions sub settings should be set only when ssl is enabled > > > Key: SPARK-41719 > URL: https://issues.apache.org/jira/browse/SPARK-41719 > Project: Spark > Issue Type: Improvement > Components: Spark Core >Affects Versions: 3.2.4 >Reporter: Shrikant Prasad >Assignee: Apache Spark >Priority: Major > > If ${ns}.enabled is false, there is no point in setting the rest of the ${ns}.* > settings in SSLOptions, as this performs unnecessary operations to set > properties that will be disregarded. > For example: ${ns} = spark.ssl > As per SSLOptions, > * SSLOptions is intended to provide the maximum common set of SSL settings, > which are supported > * by the protocol, which it can generate the configuration for. > * > * @param enabled enables or disables SSL; *if it is set to false, the rest > of the settings are disregarded* -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41719) Spark SSLOptions sub settings should be set only when ssl is enabled
[ https://issues.apache.org/jira/browse/SPARK-41719?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17652014#comment-17652014 ] Apache Spark commented on SPARK-41719: -- User 'shrprasa' has created a pull request for this issue: https://github.com/apache/spark/pull/39221 > Spark SSLOptions sub settings should be set only when ssl is enabled > > > Key: SPARK-41719 > URL: https://issues.apache.org/jira/browse/SPARK-41719 > Project: Spark > Issue Type: Improvement > Components: Spark Core >Affects Versions: 3.2.4 >Reporter: Shrikant Prasad >Priority: Major > > If ${ns}.enabled is false, there is no point in setting the rest of the ${ns}.* > settings in SSLOptions, as this performs unnecessary operations to set > properties that will be disregarded. > For example: ${ns} = spark.ssl > As per SSLOptions, > * SSLOptions is intended to provide the maximum common set of SSL settings, > which are supported > * by the protocol, which it can generate the configuration for. > * > * @param enabled enables or disables SSL; *if it is set to false, the rest > of the settings are disregarded* -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-41720) Rename UnresolvedFunc to UnresolvedFunctionName
Wenchen Fan created SPARK-41720: --- Summary: Rename UnresolvedFunc to UnresolvedFunctionName Key: SPARK-41720 URL: https://issues.apache.org/jira/browse/SPARK-41720 Project: Spark Issue Type: Improvement Components: SQL Affects Versions: 3.4.0 Reporter: Wenchen Fan -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-41719) Spark SSLOptions sub settings should be set only when ssl is enabled
[ https://issues.apache.org/jira/browse/SPARK-41719?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Shrikant Prasad updated SPARK-41719: Description: If ${ns}.enabled is false, there is no point in setting the rest of the ${ns}.* settings in SSLOptions, as this performs unnecessary operations to set properties that will be disregarded. For example: ${ns} = spark.ssl As per SSLOptions, * SSLOptions is intended to provide the maximum common set of SSL settings, which are supported * by the protocol, which it can generate the configuration for. * * @param enabled enables or disables SSL; *if it is set to false, the rest of the settings are disregarded* was: If ${ns}.enabled is false, there is no use of setting rest of ${ns}.* settings in SSLOptions as this requires unnecessary operations to be performed to set these properties. As per SSLOptions, * SSLOptions is intended to provide the maximum common set of SSL settings, which are supported * by the protocol, which it can generate the configuration for. * * @param enabled enables or disables SSL; *if it is set to false, the rest of the settings are disregarded* > Spark SSLOptions sub settings should be set only when ssl is enabled > > > Key: SPARK-41719 > URL: https://issues.apache.org/jira/browse/SPARK-41719 > Project: Spark > Issue Type: Improvement > Components: Spark Core >Affects Versions: 3.2.4 >Reporter: Shrikant Prasad >Priority: Major > > If ${ns}.enabled is false, there is no point in setting the rest of the ${ns}.* > settings in SSLOptions, as this performs unnecessary operations to set > properties that will be disregarded. > For example: ${ns} = spark.ssl > As per SSLOptions, > * SSLOptions is intended to provide the maximum common set of SSL settings, > which are supported > * by the protocol, which it can generate the configuration for. > * > * @param enabled enables or disables SSL; *if it is set to false, the rest > of the settings are disregarded* -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-41719) Spark SSLOptions sub settings should be set only when ssl is enabled
[ https://issues.apache.org/jira/browse/SPARK-41719?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Shrikant Prasad updated SPARK-41719: Summary: Spark SSLOptions sub settings should be set only when ssl is enabled (was: Spark SSL Options should be set only when ssl is enabled) > Spark SSLOptions sub settings should be set only when ssl is enabled > > > Key: SPARK-41719 > URL: https://issues.apache.org/jira/browse/SPARK-41719 > Project: Spark > Issue Type: Improvement > Components: Spark Core >Affects Versions: 3.2.4 >Reporter: Shrikant Prasad >Priority: Major > > If ${ns}.enabled is false, there is no point in setting the rest of the ${ns}.* > settings in SSLOptions, as this performs unnecessary operations to set > properties that will be disregarded. > As per SSLOptions, > * SSLOptions is intended to provide the maximum common set of SSL settings, > which are supported > * by the protocol, which it can generate the configuration for. > * > * @param enabled enables or disables SSL; *if it is set to false, the rest > of the settings are disregarded* -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
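The short-circuit SPARK-41719 proposes can be sketched as follows. The real logic lives in `org.apache.spark.SSLOptions` (Scala); this Python sketch is illustrative only, though `spark.ssl.enabled`, `spark.ssl.keyStore`, `spark.ssl.keyStorePassword`, and `spark.ssl.protocol` are real Spark configuration keys.

```python
def parse_ssl_options(conf: dict, ns: str = "spark.ssl") -> dict:
    # If ${ns}.enabled is false, skip resolving every other ${ns}.* key:
    # per the SSLOptions doc, the rest of the settings are disregarded
    # anyway, so there is no point in reading them.
    enabled = conf.get(f"{ns}.enabled", "false").lower() == "true"
    if not enabled:
        return {"enabled": False}
    return {
        "enabled": True,
        "keyStore": conf.get(f"{ns}.keyStore"),
        "keyStorePassword": conf.get(f"{ns}.keyStorePassword"),
        "protocol": conf.get(f"{ns}.protocol"),
    }

# enabled is unset, so the keyStore entry is never even looked at.
conf = {"spark.ssl.keyStore": "/path/ks.jks"}
print(parse_ssl_options(conf))  # {'enabled': False}
```

The design choice is simply to test the master switch first, which avoids the unnecessary per-property work the ticket describes and guarantees disabled namespaces produce an empty options object.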