[ 
https://issues.apache.org/jira/browse/SPARK-48068?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48068:
----------------------------------
    Description: 
We assumed that `PYTHON_EXECUTABLE` is respected by `dev/lint-python`, as in the
following workflow step. That is not the case, so we need to pass the interpreter
explicitly via `mypy`'s own parameter to make sure of it.

https://github.com/apache/spark/blob/ff401dde50343c9bbc1c49a0294272f2da7d01e2/.github/workflows/build_and_test.yml#L705
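A minimal sketch of the idea: instead of relying on the `PYTHON_EXECUTABLE` environment variable being picked up, the interpreter can be pinned with mypy's own `--python-executable` flag. The exact wiring inside `dev/lint-python` is an assumption here; this only prints the command it would run.

```shell
# Sketch only: build the mypy invocation with the interpreter passed
# explicitly via mypy's --python-executable flag, rather than assuming
# dev/lint-python honors the PYTHON_EXECUTABLE environment variable.
PYTHON_EXECUTABLE="${PYTHON_EXECUTABLE:-$(command -v python3)}"
MYPY_CMD="mypy --python-executable ${PYTHON_EXECUTABLE} python/pyspark"
# Print instead of executing, since mypy and a Spark checkout may be absent.
echo "${MYPY_CMD}"
```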

  was:
{code}
$ python3 --version
Python 3.10.13

$ dev/lint-python --mypy
starting mypy annotations test...
annotations failed mypy checks:
python/pyspark/sql/pandas/conversion.py:450: error: Unused "type: ignore" 
comment  [unused-ignore]
Found 1 error in 1 file (checked 1013 source files)
1
{code}

{code}
$ python3 --version
Python 3.11.8

$ dev/lint-python --mypy
starting mypy annotations test...
annotations failed mypy checks:
python/pyspark/sql/pandas/conversion.py:450: error: Unused "type: ignore" 
comment  [unused-ignore]
Found 1 error in 1 file (checked 1013 source files)
1
{code}


> Fix `mypy` failure in Python 3.10 and 3.11
> ------------------------------------------
>
>                 Key: SPARK-48068
>                 URL: https://issues.apache.org/jira/browse/SPARK-48068
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 3.3.0, 4.0.0, 3.5.1, 3.3.4, 3.4.3
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Major
>              Labels: pull-request-available
>
> We assumed that `PYTHON_EXECUTABLE` is respected by `dev/lint-python`, as in the
> following workflow step. That is not the case, so we need to pass the interpreter
> explicitly via `mypy`'s own parameter to make sure of it.
> https://github.com/apache/spark/blob/ff401dde50343c9bbc1c49a0294272f2da7d01e2/.github/workflows/build_and_test.yml#L705



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
