Hi,

I suspect I may have come across a bug in the handling of timestamp data in
PySpark Structured Streaming when using the foreach option. I'm "just" a
PySpark user, not a member of the Spark community, so I don't know how to
properly report the issue. I have posted a question about it on StackOverflow
(https://stackoverflow.com/questions/74113270/how-to-handle-timestamp-data-in-pyspark-streaming-by-row),
but it hasn't received any attention yet. Could someone please have a look and
check whether it really is a bug? If a Jira ticket is created, could you please
send me the link?
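
For context, here is a minimal sketch of the kind of query I mean (the rate
source, the column name, and the print in the sink are only placeholders; the
actual code and output are in the StackOverflow question):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("timestamp-foreach-sketch").getOrCreate()

    # Any streaming source with a timestamp column will do; the built-in rate
    # source provides one in its 'timestamp' column.
    stream_df = (spark.readStream
                 .format("rate")
                 .option("rowsPerSecond", 1)
                 .load())

    def process_row(row):
        # The foreach sink calls this function once per row; the timestamp
        # value is where I see the unexpected behaviour.
        print(row["timestamp"], type(row["timestamp"]))

    query = stream_df.writeStream.foreach(process_row).start()
    query.awaitTermination()
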
Thanks and best regards
Kai Roesner.
Dr. Kai-Michael Roesner
Development Architect
Technology & Innovation, Common Data Services
SAP SE
Robert-Bosch-Strasse 30/34
69190 Walldorf, Germany
T +49 6227 7-64216
F +49 6227 78-28459
E kai-michael.roes...@sap.com
www.sap.com


Pflichtangaben/Mandatory Disclosure Statements:
www.sap.com/corporate-en/impressum


This e-mail may contain trade secrets or privileged, undisclosed, or otherwise 
confidential information. If you have received this e-mail in error, you are 
hereby notified that any review, copying, or distribution of it is strictly 
prohibited. Please inform us immediately and destroy the original transmittal. 
Thank you for your cooperation.
