[ 
https://issues.apache.org/jira/browse/SPARK-46738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

BingKun Pan updated SPARK-46738:
--------------------------------
    Description: 
The following doctest fails in the pyspark-connect module's test run:
{code:java}
Example 5: Decrypt data with key.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([(
...     "83F16B2AA704794132802D248E6BFD4E380078182D1544813898AC97E709B28A94",
...     "0000111122223333",)],
...     ["input", "key"]
... )
>>> df.select(sf.try_aes_decrypt(
...     sf.unhex(df.input), df.key
... ).cast("STRING")).show(truncate=False) # doctest: +SKIP
+------------------------------------------------------------------+
|CAST(try_aes_decrypt(unhex(input), key, GCM, DEFAULT, ) AS STRING)|
+------------------------------------------------------------------+
|Spark                                                             |
+------------------------------------------------------------------+ {code}
{code:java}
df.select(sf.try_aes_decrypt(
        sf.unhex(df.input), df.key
    ).cast("STRING")).show(truncate=False)
Expected:
    +------------------------------------------------------------------+
    |CAST(try_aes_decrypt(unhex(input), key, GCM, DEFAULT, ) AS STRING)|
    +------------------------------------------------------------------+
    |Spark                                                             |
    +------------------------------------------------------------------+
Got:
    +--------------------------------------------------+
    |try_aes_decrypt(unhex(input), key, GCM, DEFAULT, )|
    +--------------------------------------------------+
    |Spark                                             |
    +--------------------------------------------------+{code}
!screenshot-1.png|width=671,height=222!
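For context, the hex input in the doctest appears to follow the GCM layout documented for Spark's aes_encrypt: a 12-byte IV prefix, the encrypted payload, and a 16-byte authentication tag suffix (this layout is an assumption taken from the function docs, not from this ticket). A quick stdlib-only sketch, independent of Spark, confirms the payload length matches the 5-byte plaintext "Spark":

```python
# Parse the doctest's hex ciphertext, assuming Spark's documented
# GCM layout: 12-byte IV || encrypted payload || 16-byte auth tag.
ciphertext_hex = (
    "83F16B2AA704794132802D248E6BFD4E"
    "380078182D1544813898AC97E709B28A94"
)
raw = bytes.fromhex(ciphertext_hex)

iv, payload, tag = raw[:12], raw[12:-16], raw[-16:]

print(f"total bytes:   {len(raw)}")      # 33
print(f"IV bytes:      {len(iv)}")       # 12
print(f"payload bytes: {len(payload)}")  # 5, matches len("Spark")
print(f"tag bytes:     {len(tag)}")      # 16
```

Only the column header differs between the two backends; the decrypted value itself ("Spark") is identical, which is consistent with the payload size above.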

  was:
The following doctest fails in the pyspark-connect module's test run:
{code:java}
Example 5: Decrypt data with key.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([(
...     "83F16B2AA704794132802D248E6BFD4E380078182D1544813898AC97E709B28A94",
...     "0000111122223333",)],
...     ["input", "key"]
... )
>>> df.select(sf.try_aes_decrypt(
...     sf.unhex(df.input), df.key
... ).cast("STRING")).show(truncate=False) # doctest: +SKIP
+------------------------------------------------------------------+
|CAST(try_aes_decrypt(unhex(input), key, GCM, DEFAULT, ) AS STRING)|
+------------------------------------------------------------------+
|Spark                                                             |
+------------------------------------------------------------------+ {code}
{code}
df.select(sf.try_aes_decrypt(
        sf.unhex(df.input), df.key
    ).cast("STRING")).show(truncate=False)
Expected:
    +------------------------------------------------------------------+
    |CAST(try_aes_decrypt(unhex(input), key, GCM, DEFAULT, ) AS STRING)|
    +------------------------------------------------------------------+
    |Spark                                                             |
    +------------------------------------------------------------------+
Got:
    +--------------------------------------------------+
    |try_aes_decrypt(unhex(input), key, GCM, DEFAULT, )|
    +--------------------------------------------------+
    |Spark                                             |
    +--------------------------------------------------+{code}


> `Cast` in PySpark displays different results between regular Spark and Spark 
> Connect
> -------------------------------------------------------------------------------------
>
>                 Key: SPARK-46738
>                 URL: https://issues.apache.org/jira/browse/SPARK-46738
>             Project: Spark
>          Issue Type: Bug
>          Components: Connect, PySpark
>    Affects Versions: 4.0.0
>            Reporter: BingKun Pan
>            Priority: Minor
>         Attachments: screenshot-1.png
>
>
> The following doctest fails in the pyspark-connect module's test run:
> {code:java}
> Example 5: Decrypt data with key.
> >>> import pyspark.sql.functions as sf
> >>> df = spark.createDataFrame([(
> ...     "83F16B2AA704794132802D248E6BFD4E380078182D1544813898AC97E709B28A94",
> ...     "0000111122223333",)],
> ...     ["input", "key"]
> ... )
> >>> df.select(sf.try_aes_decrypt(
> ...     sf.unhex(df.input), df.key
> ... ).cast("STRING")).show(truncate=False) # doctest: +SKIP
> +------------------------------------------------------------------+
> |CAST(try_aes_decrypt(unhex(input), key, GCM, DEFAULT, ) AS STRING)|
> +------------------------------------------------------------------+
> |Spark                                                             |
> +------------------------------------------------------------------+ {code}
> {code:java}
> df.select(sf.try_aes_decrypt(
>         sf.unhex(df.input), df.key
>     ).cast("STRING")).show(truncate=False)
> Expected:
>     +------------------------------------------------------------------+
>     |CAST(try_aes_decrypt(unhex(input), key, GCM, DEFAULT, ) AS STRING)|
>     +------------------------------------------------------------------+
>     |Spark                                                             |
>     +------------------------------------------------------------------+
> Got:
>     +--------------------------------------------------+
>     |try_aes_decrypt(unhex(input), key, GCM, DEFAULT, )|
>     +--------------------------------------------------+
>     |Spark                                             |
>     +--------------------------------------------------+{code}
> !screenshot-1.png|width=671,height=222!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
