rangadi commented on code in PR #41026:
URL: https://github.com/apache/spark/pull/41026#discussion_r1194147496


##########
sql/core/src/main/scala/org/apache/spark/sql/streaming/DataStreamWriter.scala:
##########
@@ -455,6 +465,16 @@ final class DataStreamWriter[T] private[sql](ds: Dataset[T]) {
     this
   }
 
+  private[sql] def foreachConnect(writer: PythonForeachWriter): DataStreamWriter[T] = {

Review Comment:
   Will this work for Scala connect too? Otherwise the earlier name `foreachPython` sounds better.



##########
sql/core/src/main/scala/org/apache/spark/sql/streaming/DataStreamWriter.scala:
##########
@@ -534,6 +552,8 @@ final class DataStreamWriter[T] private[sql](ds: Dataset[T]) {
 
   private var foreachWriter: ForeachWriter[T] = null
 
+  private var pythonForeachWriter: PythonForeachWriter = null
+

Review Comment:
   If we can't do that, we can go with this solution.
   In that case, I think we can simplify the implementation at line 360. I will comment there.



##########
sql/core/src/main/scala/org/apache/spark/sql/streaming/DataStreamWriter.scala:
##########
@@ -534,6 +554,8 @@ final class DataStreamWriter[T] private[sql](ds: Dataset[T]) {
 
   private var foreachWriter: ForeachWriter[T] = null
 
+  private var connectForeachWriter: PythonForeachWriter = null

Review Comment:
   What if we declare `foreachWriter` as `ForeachWriter[_]`?
   I think we can remove `connectForeachWriter`.
   At line 360, we could have `val sink = ForeachWriterTable[_](...)`.
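
   A minimal sketch of this idea, using simplified stand-in types rather than Spark's actual `ForeachWriter`, `PythonForeachWriter`, and `ForeachWriterTable` implementations: widening the single field to the existential `ForeachWriter[_]` lets it hold both the user's typed writer and the connect/Python writer, so the second field becomes unnecessary.

```scala
// Stand-in for Spark's ForeachWriter[T] (simplified to one method).
abstract class ForeachWriter[T] {
  def process(value: T): Unit
}

// Stand-in for PythonForeachWriter; in Spark it is a ForeachWriter over rows.
class PythonForeachWriter extends ForeachWriter[Array[Byte]] {
  override def process(value: Array[Byte]): Unit = ()
}

class DataStreamWriterSketch[T] {
  // One existential field instead of two parallel fields.
  private var foreachWriter: ForeachWriter[_] = null

  def foreach(writer: ForeachWriter[T]): this.type = {
    foreachWriter = writer
    this
  }

  // The connect/Python path can store its writer in the same field.
  def foreachConnect(writer: PythonForeachWriter): this.type = {
    foreachWriter = writer
    this
  }

  def start(): Unit = {
    // At the sink-construction site (the "line 360" mentioned above), the
    // table would be built from the untyped writer, e.g.
    //   val sink = ForeachWriterTable(foreachWriter, ...)
    // in the real code, with no branching on a second field.
    require(foreachWriter != null, "foreach writer not set")
  }
}
```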



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

