WweiL commented on code in PR #41026:
URL: https://github.com/apache/spark/pull/41026#discussion_r1185316680


##########
sql/core/src/main/scala/org/apache/spark/sql/streaming/DataStreamWriter.scala:
##########
@@ -354,7 +357,12 @@ final class DataStreamWriter[T] private[sql](ds: Dataset[T]) {
       query
     } else if (source == SOURCE_NAME_FOREACH) {
       assertNotPartitioned(SOURCE_NAME_FOREACH)
-      val sink = ForeachWriterTable[T](foreachWriter, ds.exprEnc)
+      val sink = if (foreachWriter != null) {
+        ForeachWriterTable[T](foreachWriter, ds.exprEnc)
+      } else {
+        new ForeachWriterTable[UnsafeRow](
+          pythonForeachWriter, Right((x: InternalRow) => x.asInstanceOf[UnsafeRow]))
+      }

Review Comment:
   @rangadi Doing this would result in the following error:
   ```
   [error] /__w/oss-spark/oss-spark/sql/core/src/main/scala/org/apache/spark/sql/streaming/DataStreamWriter.scala:360:18: inferred existential type org.apache.spark.sql.execution.streaming.sources.ForeachWriterTable[_1]( forSome { type _1 >: _$1 with org.apache.spark.sql.catalyst.expressions.UnsafeRow; type _$1 }), which cannot be expressed by wildcards, should be enabled
   [error] by making the implicit value scala.language.existentials visible.
   [error] ----
   [error] This can be achieved by adding the import clause 'import scala.language.existentials'
   [error] or by setting the compiler option -language:existentials.
   [error] See the Scaladoc for value scala.language.existentials for a discussion
   [error] why the feature should be explicitly enabled.
   [error]       val sink = if (foreachWriter != null) {
   [error]                  ^
   [error] one error found
   ```
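   
   For reference, here is a minimal, self-contained sketch (using a hypothetical `Table` class rather than the actual Spark types) of how this kind of branch typing produces the error above: when one branch's type argument is an existential skolem (the `_$1` in the message), the least upper bound of the two branches has to quantify over two types at once, and that cannot be written with plain wildcards.
   ```scala
   // Hypothetical reproduction sketch, not the Spark code.
   class Table[T]

   object Repro {
     // This branch's type argument is already an existential skolem (_$1).
     def wildcardTable: Table[_] = new Table[Int]

     def pick(flag: Boolean): Unit = {
       // scalac infers something like
       //   Table[_1] forSome { type _1 >: _$1 with String; type _$1 }
       // for `sink`: the bound of _1 mentions the other quantified type _$1,
       // so the type cannot be expressed with wildcards and trips the
       // scala.language.existentials feature check (reported as an error
       // in this build).
       val sink = if (flag) wildcardTable else new Table[String]
       println(sink)
     }
   }
   ```
   Annotating the val with an explicit type (e.g. `val sink: Table[_] = ...`) sidesteps the inference entirely; `import scala.language.existentials`, as the message suggests, only enables the feature rather than tightening the type.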


