[ https://issues.apache.org/jira/browse/SPARK-27612?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16830945#comment-16830945 ]
Marco Gaido commented on SPARK-27612:
-------------------------------------

I am not able to reproduce...
{code}
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__ / .__/\_,_/_/ /_/\_\   version 3.0.0-SNAPSHOT
      /_/

Using Python version 2.7.10 (default, Oct 6 2017 22:29:07)
SparkSession available as 'spark'.
>>> from pyspark.sql.types import ArrayType, IntegerType
>>> df = spark.createDataFrame([[1, 2, 3, 4]] * 100, ArrayType(IntegerType(), True))
>>> df.distinct().collect()
[Row(value=[1, 2, 3, 4])]
>>>
{code}

> Creating a DataFrame in PySpark with ArrayType produces some Rows with Arrays of None
> -------------------------------------------------------------------------------------
>
>                 Key: SPARK-27612
>                 URL: https://issues.apache.org/jira/browse/SPARK-27612
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 3.0.0
>            Reporter: Bryan Cutler
>            Priority: Major
>
> When creating a DataFrame with type {{ArrayType(IntegerType(), True)}} there ends up being rows that are filled with None.
>
> {code:java}
> In [1]: from pyspark.sql.types import ArrayType, IntegerType
>
> In [2]: df = spark.createDataFrame([[1, 2, 3, 4]] * 100, ArrayType(IntegerType(), True))
>
> In [3]: df.distinct().collect()
> Out[3]: [Row(value=[None, None, None, None]), Row(value=[1, 2, 3, 4])]
> {code}
>
> From this example, it is consistently at elements 97, 98:
> {code:python}
> In [5]: df.collect()[-5:]
> Out[5]:
> [Row(value=[1, 2, 3, 4]),
>  Row(value=[1, 2, 3, 4]),
>  Row(value=[None, None, None, None]),
>  Row(value=[None, None, None, None]),
>  Row(value=[1, 2, 3, 4])]
> {code}
>
> This also happens with a type of {{ArrayType(ArrayType(IntegerType(), True))}}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
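Since the reporter sees the corruption consistently at elements 97 and 98 while the commenter cannot reproduce it, a small helper that scans collected rows for all-None entries can make the comparison between environments concrete. This is a hypothetical sketch, not code from the issue: it uses plain Python lists standing in for the `Row` values returned by `df.collect()`, and the sample data simply simulates the failure pattern described in the report.

```python
def all_none_indices(rows):
    """Return the indices of rows whose elements are all None.

    `rows` is any sequence of sequences, e.g. the list-of-values view
    of a PySpark collect() result.
    """
    return [i for i, row in enumerate(rows) if all(v is None for v in row)]


# Simulated collect() output matching the reporter's observation:
# 100 identical rows, with elements 97 and 98 corrupted to all-None.
rows = [[1, 2, 3, 4]] * 100
rows[97] = [None, None, None, None]
rows[98] = [None, None, None, None]

print(all_none_indices(rows))  # → [97, 98]
```

On a real session one would run something like `all_none_indices([r.value for r in df.collect()])`: an empty result would match the commenter's environment, while `[97, 98]` would match the reporter's.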