I'm a fairly new user of Spark and I've run into this issue with the pyspark docs:
The docs for pyspark.sql.functions.to_date and pyspark.sql.functions.to_timestamp describe the same behaviour: both say they convert a Column of pyspark.sql.types.StringType or pyspark.sql.types.TimestampType into pyspark.sql.types.DateType. Shouldn't `to_timestamp` return pyspark.sql.types.TimestampType instead?

Also, the to_timestamp docs say: "By default, it follows casting rules to pyspark.sql.types.TimestampType if the format is omitted (equivalent to col.cast("timestamp"))." That doesn't seem right either, i.e. to_timestamp(current_timestamp()) <> current_timestamp().cast("timestamp").

Is the documentation wrong, or am I missing something? (Is this due to the underlying JVM data types?)

Cheers,
alan