[ 
https://issues.apache.org/jira/browse/SPARK-36845?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17422989#comment-17422989
 ] 

Maciej Szymkiewicz commented on SPARK-36845:
--------------------------------------------

It is an optional library and won't be present by default.

As an official PSF project without transitive dependencies, it might be 
acceptable as an installation dependency for PySpark, but that still leaves 
the {{pyspark}} shell unaddressed.
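For context, a common fallback pattern (a sketch, assuming {{typing_extensions}} is the optional library in question) is to prefer the stdlib {{Protocol}} on Python 3.8+ and only fall back to {{typing_extensions}} where it happens to be installed:

{code:python}
import sys

if sys.version_info >= (3, 8):
    # Protocol has been in the standard library since Python 3.8.
    from typing import Protocol
else:
    # Older interpreters need the optional backport package.
    from typing_extensions import Protocol
{code}

This still fails on old interpreters without the backport, which is why the {{TYPE_CHECKING}} approach below may be preferable.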

It might be possible to use {{TYPE_CHECKING}} blocks and {{from __future__ 
import annotations}} like this:

{code:python}
# _typing.py
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Protocol

    class P(Protocol): ...
{code}

{code:python}

from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from _typing import P


def f(p: P):
    pass
{code}

but it requires further testing and might not be worth it after all.
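To illustrate why the trick works: with postponed evaluation, annotations are stored as plain strings and never evaluated at runtime, so the guarded import is only ever seen by a type checker. A minimal self-contained sketch (no {{_typing}} module needs to exist at runtime):

{code:python}
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only type checkers execute this import; at runtime it is skipped.
    from _typing import P  # hypothetical module from the snippet above


def f(p: P) -> int:
    # The annotation "P" is just a string here, so calling f never
    # triggers a lookup of the (absent) Protocol class.
    return 42


f(object())           # runs fine without typing_extensions installed
f.__annotations__     # {'p': 'P', 'return': 'int'} -- plain strings
{code}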




> Inline type hint files
> ----------------------
>
>                 Key: SPARK-36845
>                 URL: https://issues.apache.org/jira/browse/SPARK-36845
>             Project: Spark
>          Issue Type: Umbrella
>          Components: PySpark, SQL
>    Affects Versions: 3.3.0
>            Reporter: Takuya Ueshin
>            Priority: Major
>
> Currently there are type hint stub files ({{*.pyi}}) to show the expected 
> types for functions, but we can also take advantage of static type checking 
> within the functions by inlining the type hints.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
