[ 
https://issues.apache.org/jira/browse/SPARK-37011?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17433954#comment-17433954
 ] 

Shane Knapp commented on SPARK-37011:
-------------------------------------

from the test build 
(https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/144597/consoleFull):

 
{noformat}
starting python compilation test...
python compilation succeeded.

The python3 -m black command was not found. Skipping black checks for now.

downloading pycodestyle from 
https://raw.githubusercontent.com/PyCQA/pycodestyle/2.7.0/pycodestyle.py...
starting pycodestyle test...
pycodestyle checks passed.

starting flake8 test...
flake8 checks passed.

The mypy command was not found. Skipping for now.

all lint-python tests passed!{noformat}
 regardless of whether the overall build passes, the checks against the new flake8 version passed.
i'm going to close this and merge the PR.
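
for reference, the minimum-version gate that {{lint-python}} applies can be sketched roughly like this (a sketch only; the function name and the exact comparison logic in the real script are assumptions):

{code:python}
def meets_minimum(installed: str, minimum: str) -> bool:
    """Compare two dotted version strings numerically, e.g. "3.9.0" >= "3.8.0".

    Assumes both strings have the same number of numeric components.
    """
    to_tuple = lambda v: tuple(int(x) for x in v.split("."))
    return to_tuple(installed) >= to_tuple(minimum)

# e.g. with MINIMUM_FLAKE8 bumped from 3.8.0 to 3.9.0:
print(meets_minimum("3.9.0", "3.9.0"))  # True
print(meets_minimum("3.8.4", "3.9.0"))  # False
{code}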

> Upgrade flake8 to 3.9.0 or above in Jenkins
> -------------------------------------------
>
>                 Key: SPARK-37011
>                 URL: https://issues.apache.org/jira/browse/SPARK-37011
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.3.0
>            Reporter: Takuya Ueshin
>            Priority: Major
>
> In flake8 < 3.9.0, an F401 (unused import) error is reported for imports whose 
> identifiers are only used in the {{bound}} argument of {{TypeVar(..., bound="XXX")}}.
> For example:
> {code:python}
> if TYPE_CHECKING:
>     from pyspark.pandas.base import IndexOpsMixin
> IndexOpsLike = TypeVar("IndexOpsLike", bound="IndexOpsMixin")
> {code}
> Since this behavior is fixed in flake8 >= 3.9.0, we should upgrade the flake8 
> installed in Jenkins to 3.9.0 or above.
> We might also update the {{MINIMUM_FLAKE8}} in {{lint-python}} from 
> 3.8.0 to 3.9.0.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org