[ https://issues.apache.org/jira/browse/SPARK-37187?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-37187:
---------------------------------
    Parent: SPARK-37197
    Issue Type: Sub-task  (was: Bug)

> pyspark.pandas fails to create a histogram of one column from a large DataFrame
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-37187
>                 URL: https://issues.apache.org/jira/browse/SPARK-37187
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 3.2.0
>            Reporter: Chuck Connell
>            Priority: Major
>
> When trying to create a histogram from one column of a large DataFrame, pyspark.pandas fails. So this line
> {quote}DF.plot.hist(column="FullVaxPer100", bins=20)  # there are many other columns
> {quote}
> yields this error
> {quote}cannot resolve 'least(min(EndDate), min(EndDeaths), min(`STATE-COUNTY`), min(StartDate), min(StartDeaths), min(POPESTIMATE2020), min(ST_ABBR), min(VaxStartDate), min(Series_Complete_Yes_Start), min(Administered_Dose1_Recip_Start), min(VaxEndDate), min(Series_Complete_Yes_End), min(Administered_Dose1_Recip_End), min(Deaths), min(Series_Complete_Yes_Mid), min(Administered_Dose1_Recip_Mid), min(FullVaxPer100), min(OnePlusVaxPer100), min(DeathsPer100k))' due to data type mismatch: The expressions should all have the same type, got LEAST(timestamp, bigint, string, timestamp, bigint, bigint, string, timestamp, bigint, bigint, timestamp, bigint, bigint, bigint, double, double, double, double, double).;
> {quote}
> The odd thing is that pyspark.pandas seems to be operating on all the columns when only one is needed.
> As a workaround, you can first create a one-column DataFrame that selects just the field you want, then make a histogram of that. But the command above should work as well.
> I can supply the complete program and datasets that demonstrate the error.
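A minimal sketch of the workaround described above, written in plain pandas since pyspark.pandas mirrors the pandas API. The column names are taken from the error message in the report; the data values and use of numpy.histogram for the bin computation are illustrative assumptions, not the reporter's actual program:

```python
import numpy as np
import pandas as pd

# Toy stand-in for the reporter's DataFrame: a string column alongside the
# numeric column of interest (names taken from the error message above).
df = pd.DataFrame({
    "ST_ABBR": ["TX", "CA", "NY", "FL"],            # string column
    "FullVaxPer100": [54.2, 71.8, 68.3, 59.1],      # numeric column to plot
})

# Histogram backends first compute a global min/max to fix the bin edges;
# scanning *all* columns for that is what produces the least(min(...))
# type-mismatch error over timestamp/string/bigint columns.  Selecting the
# one field first keeps the scan purely numeric:
col = df[["FullVaxPer100"]]
counts, edges = np.histogram(col["FullVaxPer100"], bins=20)
```

With the one-column DataFrame in hand, `col.plot.hist(bins=20)` renders the same histogram without ever touching the non-numeric columns.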
--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org