[ 
https://issues.apache.org/jira/browse/SPARK-37077?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maciej Szymkiewicz reassigned SPARK-37077:
------------------------------------------

    Assignee: Maciej Szymkiewicz

> Annotations for pyspark.sql.context.SQLContext.createDataFrame are broken
> -------------------------------------------------------------------------
>
>                 Key: SPARK-37077
>                 URL: https://issues.apache.org/jira/browse/SPARK-37077
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.3.0
>            Reporter: Maciej Szymkiewicz
>            Assignee: Maciej Szymkiewicz
>            Priority: Major
>
> During the migration from stubs to inline annotations, the variants taking {{RDD}} 
> were incorrectly removed. As a result, the following,
>  
> {code:python}
> from pyspark.sql import SQLContext, SparkSession
> from pyspark import SparkContext
> sc = SparkContext.getOrCreate()
> sqlContext = SQLContext(sc)
> sqlContext.createDataFrame(sc.parallelize([(1, 2)]))
> {code}
> although valid at runtime, no longer type checks:
> {code}
> main.py:7: error: No overload variant of "createDataFrame" of "SQLContext" 
> matches argument type "RDD[Tuple[int, int]]"
> main.py:7: note: Possible overload variants:
> main.py:7: note:     def [RowLike in (List[Any], Tuple[Any, ...], Row)] 
> createDataFrame(self, data: Iterable[RowLike], samplingRatio: Optional[float] 
> = ...) -> DataFrame
> main.py:7: note:     def [RowLike in (List[Any], Tuple[Any, ...], Row)] 
> createDataFrame(self, data: Iterable[RowLike], schema: Union[List[str], 
> Tuple[str, ...]] = ..., verifySchema: bool = ...) -> DataFrame
> main.py:7: note:     def createDataFrame(self, data: DataFrameLike, 
> samplingRatio: Optional[float] = ...) -> DataFrame
> main.py:7: note:     <3 more non-matching overloads not shown>
> Found 1 error in 1 file (checked 1 source file)
> {code}
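For reference, the fix amounts to restoring the {{RDD}} variants to the overload list of {{createDataFrame}}. Below is a minimal, self-contained sketch of what the restored overload set looks like; the {{RDD}}, {{DataFrame}}, and {{SQLContextSketch}} classes here are hypothetical stand-ins for illustration, not PySpark's actual implementation or signatures:

```python
from typing import Generic, Iterable, Optional, Tuple, TypeVar, overload

T = TypeVar("T")


# Hypothetical stand-ins for pyspark.RDD and pyspark.sql.DataFrame,
# used only to illustrate the overload set, not the real PySpark classes.
class RDD(Generic[T]):
    def __init__(self, data: list) -> None:
        self.data = data


class DataFrame:
    pass


class SQLContextSketch:
    # The Iterable[RowLike] variant survived the migration ...
    @overload
    def createDataFrame(
        self, data: Iterable[Tuple], samplingRatio: Optional[float] = ...
    ) -> DataFrame: ...

    # ... and this is the dropped RDD[RowLike] variant that needs restoring,
    # so that createDataFrame(sc.parallelize([(1, 2)])) type checks again.
    @overload
    def createDataFrame(
        self, data: "RDD[Tuple]", samplingRatio: Optional[float] = ...
    ) -> DataFrame: ...

    def createDataFrame(self, data, samplingRatio=None):
        # Runtime behaviour is unchanged; only the declared overloads
        # visible to the type checker differ.
        return DataFrame()
```

With the {{RDD}} overload present, mypy matches a call whose argument is {{RDD[Tuple[int, int]]}} instead of reporting "No overload variant ... matches argument type".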



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
