[ https://issues.apache.org/jira/browse/SPARK-37683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17461886#comment-17461886 ]

Apache Spark commented on SPARK-37683:
--------------------------------------

User 'zero323' has created a pull request for this issue:
https://github.com/apache/spark/pull/34946

> Nondeterministic type checker error in data tests
> -------------------------------------------------
>
>                 Key: SPARK-37683
>                 URL: https://issues.apache.org/jira/browse/SPARK-37683
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.3.0
>            Reporter: Maciej Szymkiewicz
>            Priority: Major
>
> Occasionally, I see different errors around {{_jvm}}, {{_wrapped}} and 
> {{_jsparkSession}} popping up in data tests
> {code}
> E     ../../spark/python/pyspark/sql/catalog:71: error: Cannot determine type 
> of "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/catalog:72: error: Cannot determine type 
> of "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/catalog:349: error: Cannot determine 
> type of "_wrapped"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/catalog:360: error: Cannot determine 
> type of "_wrapped"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/column:920: error: Cannot determine type 
> of "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/pandas/conversion:400: error: Cannot 
> determine type of "_wrapped"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/pandas/conversion:407: error: Cannot 
> determine type of "_wrapped"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/pandas/conversion:413: error: Cannot 
> determine type of "_wrapped"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/pandas/conversion:602: error: Cannot 
> determine type of "_wrapped"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/pandas/conversion:604: error: Cannot 
> determine type of "_wrapped"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/pandas/conversion:620: error: Cannot 
> determine type of "_jvm"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/pandas/conversion:621: error: Cannot 
> determine type of "_jvm"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/pandas/conversion:622: error: Cannot 
> determine type of "_wrapped"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/readwriter:115: error: Cannot determine 
> type of "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/readwriter:307: error: Cannot determine 
> type of "_jvm"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/readwriter:308: error: Cannot determine 
> type of "_jvm"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/session:283: error: Cannot determine 
> type of "_jvm"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/session:284: error: Cannot determine 
> type of "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/streaming:377: error: Cannot determine 
> type of "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/udf:225: error: Cannot determine type of 
> "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/udf:456: error: Cannot determine type of 
> "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/udf:509: error: Cannot determine type of 
> "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/udf:510: error: Cannot determine type of 
> "_jsparkSession"  [has-type] (diff)
> E     ../../spark/python/pyspark/sql/udf:533: error: Cannot determine type of 
> "_jsparkSession"  [has-type] (diff)
> {code}
> The set of errors is not always the same, which points to an issue with the 
> type checker itself rather than with the annotated code.
> I am going to add {{cast}} calls and / or annotations to these lines, so we 
> don't get flaky tests.
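> A minimal sketch of the kind of change intended, assuming the fix is an 
> explicit attribute annotation and / or a {{typing.cast}} at the use site so 
> mypy never has to infer the attribute types. The {{Example}} class below is 
> hypothetical and only mirrors the attribute names from the errors above; 
> the actual edits are in the linked pull request.
> {code}
> from typing import Optional, cast
> 
> from py4j.java_gateway import JavaObject
> 
> 
> class Example:
>     def __init__(self, jsparkSession: JavaObject) -> None:
>         # An explicit annotation means the type checker never has to
>         # infer the attribute's type, which is what the nondeterministic
>         # [has-type] errors are about.
>         self._jsparkSession: JavaObject = jsparkSession
>         self._jvm: Optional[JavaObject] = None
> 
>     def call_into_jvm(self) -> JavaObject:
>         # Alternatively, pin the type down with a cast at the use site.
>         return cast(JavaObject, self._jvm)
> {code}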


