[ https://issues.apache.org/jira/browse/SPARK-34941?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17317846#comment-17317846 ]
Yikun Jiang edited comment on SPARK-34941 at 4/9/21, 10:12 AM:
---------------------------------------------------------------
It seems no one has taken this task yet; I can help with it. To make it easier to review and merge, I'm going to enable the mypy checks step by step:

Step 1. Enable mypy for each module one by one. For example, after we fix all {{pyspark.pandas.spark.*}} mypy problems, we enable that module:

{{TODO(SPARK-34941): Enable mypy for pandas-on-Spark}}
{{[mypy-pyspark.pandas.*]}}
{{ignore_errors = True}}
*{{[mypy-pyspark.pandas.spark.*]}}*
*{{ignore_errors = False}}*

That means we skip the {{pyspark.pandas.*}} mypy check except for {{pyspark.pandas.spark.*}}.

Step 2. After mypy is enabled in every module, we can simply remove the {{[mypy-pyspark.pandas.*]}} section.

> Enable mypy for pandas-on-Spark
> -------------------------------
>
>                 Key: SPARK-34941
>                 URL: https://issues.apache.org/jira/browse/SPARK-34941
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 3.2.0
>            Reporter: Haejoon Lee
>            Priority: Major
>
> This JIRA aims to enable the mypy test for the {{pyspark.pandas}} package.
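The step-by-step plan above can be sketched as a config fragment. This is a minimal illustration of mypy's per-module override sections, not the actual Spark config (in the Spark repo the file is {{python/mypy.ini}}; the comments here are illustrative):

```ini
; Step 1: ignore errors everywhere under pyspark.pandas, then
; re-enable checking module by module as each one is fixed.
[mypy-pyspark.pandas.*]
ignore_errors = True

; pyspark.pandas.spark.* has already been fixed, so check it again.
; mypy applies the most specific matching section, so this override
; wins over the wildcard section above for this subpackage.
[mypy-pyspark.pandas.spark.*]
ignore_errors = False

; Step 2 (end state): once every module passes, the blanket
; [mypy-pyspark.pandas.*] section is removed entirely, and the
; whole package is type-checked with no per-module exceptions.
```

Because mypy resolves per-module options by pattern specificity, each newly fixed subpackage only needs its own {{ignore_errors = False}} section until step 2 removes the wildcard.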
-- This message was sent by Atlassian Jira (v8.3.4#803005) --------------------------------------------------------------------- To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org