HyukjinKwon commented on code in PR #46306:
URL: https://github.com/apache/spark/pull/46306#discussion_r1585758081


##########
python/pyspark/sql/tests/test_python_streaming_datasource.py:
##########
```diff
@@ -150,6 +178,20 @@ def check_batch(df, batch_id):
         q.awaitTermination
         self.assertIsNone(q.exception(), "No exception has to be propagated.")

+    def test_simple_stream_reader(self):
+        self.spark.dataSource.register(self._get_simple_data_source())
+        df = self.spark.readStream.format("SimpleDataSource").load()
+
+        def check_batch(df, batch_id):
+            assertDataFrameEqual(df, [Row(batch_id * 2), Row(batch_id * 2 + 1)])
+
+        q = df.writeStream.foreachBatch(check_batch).start()
+        while len(q.recentProgress) < 10:
+            time.sleep(0.2)
+        q.stop()
+        q.awaitTermination
+        self.assertIsNone(q.exception(), "No exception has to be propagated.")
+
     def test_stream_writer(self):
         input_dir = tempfile.TemporaryDirectory(prefix="test_data_stream_write_input")
```

Review Comment:
Let's add cleanup:
```python
input_dir.cleanup()
output_dir.cleanup()
checkpoint_dir.cleanup()
```
in a `finally` block. Feel free to do it in a separate PR.

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
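A minimal sketch of the `try`/`finally` cleanup pattern the reviewer suggests. The function name and the placeholder body are illustrative, not the actual test from the PR; only the three `TemporaryDirectory` names mirror the test under review. Returning the paths is done purely so the example is easy to verify.

```python
import tempfile

def run_stream_writer_test():
    # Same three temp directories the test under review creates.
    input_dir = tempfile.TemporaryDirectory(prefix="test_data_stream_write_input")
    output_dir = tempfile.TemporaryDirectory(prefix="test_data_stream_write_output")
    checkpoint_dir = tempfile.TemporaryDirectory(prefix="test_data_stream_write_checkpoint")
    try:
        # ... start the streaming query, wait for progress, run assertions ...
        pass
    finally:
        # Cleanup runs even if an assertion above fails, so the temp
        # directories never leak between test runs.
        input_dir.cleanup()
        output_dir.cleanup()
        checkpoint_dir.cleanup()
    # Returned only to make the sketch verifiable; the real test returns nothing.
    return [input_dir.name, output_dir.name, checkpoint_dir.name]
```

Putting the `cleanup()` calls in `finally` (rather than at the end of the test body) is what guarantees they execute on the failure path as well as on success.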