[jira] [Updated] (SPARK-40309) Introduce sql_conf context manager for pyspark.sql

2022-09-02 Thread Xinrong Meng (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-40309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xinrong Meng updated SPARK-40309:
-
Description: 
That would simplify control of the Spark SQL configuration, as shown below,

from
{code:java}
original_value = spark.conf.get("key")
spark.conf.set("key", "value")
...
spark.conf.set("key", original_value){code}
to
{code:java}
with sql_conf({"key": "value"}):
...
{code}
[Here|https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]
 is such a context manager in Pandas API on Spark.

We should introduce one in `pyspark.sql`, and deduplicate code if possible.
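
A minimal sketch of what such a context manager could look like, assuming the SparkSession is passed explicitly (the name, signature, and module placement here are illustrative assumptions, not a decided API):
{code:python}
from contextlib import contextmanager
from typing import Any, Dict, Iterator

from pyspark.sql import SparkSession


@contextmanager
def sql_conf(pairs: Dict[str, Any], *, spark: SparkSession) -> Iterator[None]:
    """Temporarily set Spark SQL configurations, restoring them on exit."""
    keys = list(pairs.keys())
    # Remember the current values; None means the key was not set before.
    originals = [spark.conf.get(key, None) for key in keys]
    for key, value in pairs.items():
        spark.conf.set(key, value)
    try:
        yield
    finally:
        # Restore the previous values, unsetting keys that had none before.
        for key, original in zip(keys, originals):
            if original is None:
                spark.conf.unset(key)
            else:
                spark.conf.set(key, original)
{code}
Usage would then match the example above, e.g. `with sql_conf({"spark.sql.shuffle.partitions": "4"}, spark=spark): ...` (the config key is only illustrative). The pandas-on-Spark helper linked above could likely be re-expressed in terms of such a function to deduplicate code.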

 

  was:
[Here|https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]
 a context manager is introduced in Pandas API on Spark to set the Spark SQL configuration and then restore it when it exits.

That simplifies control of the Spark SQL configuration, as shown below,

from
{code:java}
original_value = spark.conf.get("key")
spark.conf.set("key", "value")
...
spark.conf.set("key", original_value){code}
to
{code:java}
with sql_conf({"key": "value"}):
...
{code}
We should introduce a similar context manager in `pyspark.sql`, and deduplicate 
code if possible.

 


> Introduce sql_conf context manager for pyspark.sql
> --
>
> Key: SPARK-40309
> URL: https://issues.apache.org/jira/browse/SPARK-40309
> Project: Spark
>  Issue Type: Sub-task
>  Components: PySpark
>Affects Versions: 3.4.0
>Reporter: Xinrong Meng
>Priority: Major
>  Labels: release-notes
>
> That would simplify control of the Spark SQL configuration, as shown below,
> from
> {code:java}
> original_value = spark.conf.get("key")
> spark.conf.set("key", "value")
> ...
> spark.conf.set("key", original_value){code}
> to
> {code:java}
> with sql_conf({"key": "value"}):
> ...
> {code}
> [Here|https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]
>  is such a context manager in Pandas API on Spark.
> We should introduce one in `pyspark.sql`, and deduplicate code if possible.
>  






[jira] [Updated] (SPARK-40309) Introduce sql_conf context manager for pyspark.sql

2022-09-02 Thread Xinrong Meng (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-40309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xinrong Meng updated SPARK-40309:
-
Description: 
[Here|https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]
 a context manager is introduced in Pandas API on Spark to set the Spark SQL configuration and then restore it when it exits.

That simplifies control of the Spark SQL configuration, as shown below,

from
{code:java}
original_value = spark.conf.get("key")
spark.conf.set("key", "value")
...
spark.conf.set("key", original_value){code}
to
{code:java}
with sql_conf({"key": "value"}):
...
{code}
We should introduce a similar context manager in `pyspark.sql`, and deduplicate 
code if possible.

 

  was:
[https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]

a context manager is introduced in Pandas API on Spark to set the Spark SQL configuration and then restore it when it exits.

That simplifies the control of Spark SQL configuration, 

from
{code:java}
original_value = spark.conf.get("key")
spark.conf.set("key", "value")
...
spark.conf.set("key", original_value){code}
to
{code:java}
with sql_conf({"key": "value"}):
...
{code}
 

 


> Introduce sql_conf context manager for pyspark.sql
> --
>
> Key: SPARK-40309
> URL: https://issues.apache.org/jira/browse/SPARK-40309
> Project: Spark
>  Issue Type: Sub-task
>  Components: PySpark
>Affects Versions: 3.4.0
>Reporter: Xinrong Meng
>Priority: Major
>  Labels: release-notes
>
> [Here|https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]
>  a context manager is introduced in Pandas API on Spark to set the Spark SQL configuration and then restore it when it exits.
> That simplifies control of the Spark SQL configuration, as shown below,
> from
> {code:java}
> original_value = spark.conf.get("key")
> spark.conf.set("key", "value")
> ...
> spark.conf.set("key", original_value){code}
> to
> {code:java}
> with sql_conf({"key": "value"}):
> ...
> {code}
> We should introduce a similar context manager in `pyspark.sql`, and 
> deduplicate code if possible.
>  






[jira] [Updated] (SPARK-40309) Introduce sql_conf context manager for pyspark.sql

2022-09-01 Thread Xiao Li (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-40309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xiao Li updated SPARK-40309:

Labels: release-notes  (was: )

> Introduce sql_conf context manager for pyspark.sql
> --
>
> Key: SPARK-40309
> URL: https://issues.apache.org/jira/browse/SPARK-40309
> Project: Spark
>  Issue Type: Sub-task
>  Components: PySpark
>Affects Versions: 3.4.0
>Reporter: Xinrong Meng
>Priority: Major
>  Labels: release-notes
>
> [https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]
> a context manager is introduced in Pandas API on Spark to set the Spark SQL configuration and then restore it when it exits.
> That simplifies the control of Spark SQL configuration, 
> from
> {code:java}
> original_value = spark.conf.get("key")
> spark.conf.set("key", "value")
> ...
> spark.conf.set("key", original_value){code}
> to
> {code:java}
> with sql_conf({"key": "value"}):
> ...
> {code}
>  
>  






[jira] [Updated] (SPARK-40309) Introduce sql_conf context manager for pyspark.sql

2022-09-01 Thread Xinrong Meng (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-40309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xinrong Meng updated SPARK-40309:
-
Description: 
[https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]

a context manager is introduced in Pandas API on Spark to set the Spark SQL configuration and then restore it when it exits.

That simplifies the control of Spark SQL configuration, 

from
{code:java}
original_value = spark.conf.get("key")
spark.conf.set("key", "value")
...
spark.conf.set("key", original_value){code}
to
{code:java}
with sql_conf({"key": "value"}):
...
{code}
 

 

  was:
[https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]

a context manager is introduced in Pandas API on Spark to set the Spark SQL configuration and then restore it when it exits.

That simplifies the control of Spark SQL configuration, 

from
{code:java}
original_value = spark.conf.get("key")
spark.conf.set("key", "value")
...
spark.conf.set("key", original_value){code}
to
{code:java}
with sql_conf({"key": "value"}):
...
{code}
 

 


> Introduce sql_conf context manager for pyspark.sql
> --
>
> Key: SPARK-40309
> URL: https://issues.apache.org/jira/browse/SPARK-40309
> Project: Spark
>  Issue Type: Sub-task
>  Components: PySpark
>Affects Versions: 3.4.0
>Reporter: Xinrong Meng
>Priority: Major
>
> [https://github.com/apache/spark/blob/master/python/pyspark/pandas/utils.py#L490]
> a context manager is introduced in Pandas API on Spark to set the Spark SQL configuration and then restore it when it exits.
> That simplifies the control of Spark SQL configuration, 
> from
> {code:java}
> original_value = spark.conf.get("key")
> spark.conf.set("key", "value")
> ...
> spark.conf.set("key", original_value){code}
> to
> {code:java}
> with sql_conf({"key": "value"}):
> ...
> {code}
>  
>  


