[ https://issues.apache.org/jira/browse/SPARK-27612?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16831623#comment-16831623 ]
Hyukjin Kwon commented on SPARK-27612:
--------------------------------------

Argh, this happens after we upgraded cloudpickle to 0.6.2: https://github.com/apache/spark/commit/75ea89ad94ca76646e4697cf98c78d14c6e2695f#diff-19fd865e0dd0d7e6b04b3b1e047dcda7

Upgrading cloudpickle to 0.8.1 still doesn't solve the problem ... I think we should fix it in cloudpickle, get a cloudpickle release made, and then port that change into Spark.

> Creating a DataFrame in PySpark with ArrayType produces some Rows with Arrays of None
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-27612
>                 URL: https://issues.apache.org/jira/browse/SPARK-27612
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 3.0.0
>            Reporter: Bryan Cutler
>            Priority: Critical
>              Labels: correctness
>
> This seems to only affect Python 3.
> When creating a DataFrame with type {{ArrayType(IntegerType(), True)}}, some rows end up filled entirely with None.
>
> {code:java}
> In [1]: from pyspark.sql.types import ArrayType, IntegerType
>
> In [2]: df = spark.createDataFrame([[1, 2, 3, 4]] * 100, ArrayType(IntegerType(), True))
>
> In [3]: df.distinct().collect()
> Out[3]: [Row(value=[None, None, None, None]), Row(value=[1, 2, 3, 4])]
> {code}
>
> From this example, it is consistently at elements 97 and 98:
> {code}
> In [5]: df.collect()[-5:]
> Out[5]:
> [Row(value=[1, 2, 3, 4]),
>  Row(value=[1, 2, 3, 4]),
>  Row(value=[None, None, None, None]),
>  Row(value=[None, None, None, None]),
>  Row(value=[1, 2, 3, 4])]
> {code}
>
> This also happens with a type of {{ArrayType(ArrayType(IntegerType(), True))}}
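
For anyone trying to reproduce this outside a shell, a minimal standalone sketch of the reporter's repro is below. It assumes an affected build (i.e. master with the cloudpickle 0.6.2 upgrade) and Python 3; the {{local[*]}} master and app name are just illustrative choices, not part of the original report.

{code:python}
# Minimal standalone reproduction sketch for SPARK-27612.
# Assumes a PySpark build that includes the cloudpickle 0.6.2 upgrade
# (commit 75ea89a); the issue reportedly only affects Python 3.
from pyspark.sql import SparkSession
from pyspark.sql.types import ArrayType, IntegerType

spark = (SparkSession.builder
         .master("local[*]")
         .appName("SPARK-27612-repro")
         .getOrCreate())

# 100 identical rows of [1, 2, 3, 4], typed as array<int>.
df = spark.createDataFrame([[1, 2, 3, 4]] * 100, ArrayType(IntegerType(), True))

# Expected: a single distinct value [1, 2, 3, 4].
# Observed on affected builds: an extra Row(value=[None, None, None, None]).
print(df.distinct().collect())

# The corruption is reported to land consistently at elements 97 and 98.
print(df.collect()[-5:])

spark.stop()
{code}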