[ https://issues.apache.org/jira/browse/SPARK-34943?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17414585#comment-17414585 ]

Shane Knapp commented on SPARK-34943:
-------------------------------------

done:

 
{noformat}
parallel-ssh -h ubuntu_workers.txt -i '/home/jenkins/anaconda2/envs/py36/bin/python -c "import flake8; print(flake8.__version__)"'
[1] 13:58:53 [SUCCESS] research-jenkins-worker-03
3.8.0
[2] 13:58:53 [SUCCESS] research-jenkins-worker-02
3.8.0
[3] 13:58:53 [SUCCESS] research-jenkins-worker-06
3.8.0
[4] 13:58:53 [SUCCESS] research-jenkins-worker-07
3.8.0
[5] 13:58:53 [SUCCESS] research-jenkins-worker-05
3.8.0
[6] 13:58:53 [SUCCESS] research-jenkins-worker-04
3.8.0
[7] 13:58:53 [SUCCESS] research-jenkins-worker-01
3.8.0
[8] 13:58:54 [SUCCESS] research-jenkins-worker-08
3.8.0
{noformat}

> Upgrade flake8 to 3.8.0 or above in Jenkins
> -------------------------------------------
>
>                 Key: SPARK-34943
>                 URL: https://issues.apache.org/jira/browse/SPARK-34943
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.2.0
>            Reporter: Haejoon Lee
>            Assignee: Shane Knapp
>            Priority: Major
>
> In flake8 < 3.8.0, an F401 error is raised for imports inside *if* 
> TYPE_CHECKING blocks. However, TYPE_CHECKING is always False at runtime, so 
> there is no need to treat such imports as errors in static analysis.
> Since this behavior is fixed in flake8 >= 3.8.0, we should upgrade the flake8 
> installed in Jenkins to 3.8.0 or above. Otherwise, F401 errors are raised for 
> several lines in pandas-on-Spark that use TYPE_CHECKING.
> We should also update {{MINIMUM_FLAKE8}} in {{lint-python}} from 
> 3.5.0 to 3.8.0.
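
For reference, the pattern described above looks like this (a minimal sketch; the module and function names are illustrative, not taken from the Spark codebase):

```python
from typing import TYPE_CHECKING

# TYPE_CHECKING is always False at runtime, so this import is only
# evaluated by static type checkers. flake8 < 3.8.0 nevertheless
# flagged it with F401 ("imported but unused"); 3.8.0+ does not.
if TYPE_CHECKING:
    from collections import OrderedDict

def first_key(d: "OrderedDict[str, int]") -> str:
    # The annotation is a string, so OrderedDict need not be
    # importable at runtime for this function to work.
    return next(iter(d))

print(first_key({"a": 1, "b": 2}))
```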



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
