GitHub user dongjoon-hyun opened a pull request:

    https://github.com/apache/spark/pull/22649

    [SPARK-25644][SS][FOLLOWUP][BUILD] Fix Scala 2.12 build error due to foreachBatch

    ## What changes were proposed in this pull request?
    
    This PR fixes the Scala 2.12 build error caused by ambiguous references to the overloaded `foreachBatch` method in the test cases.
    - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/428/console
    ```scala
    [error] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/sources/ForeachBatchSinkSuite.scala:102: ambiguous reference to overloaded definition,
    [error] both method foreachBatch in class DataStreamWriter of type (function: org.apache.spark.api.java.function.VoidFunction2[org.apache.spark.sql.Dataset[Int],Long])org.apache.spark.sql.streaming.DataStreamWriter[Int]
    [error] and  method foreachBatch in class DataStreamWriter of type (function: (org.apache.spark.sql.Dataset[Int], Long) => Unit)org.apache.spark.sql.streaming.DataStreamWriter[Int]
    [error] match argument types ((org.apache.spark.sql.Dataset[Int], Any) => Unit)
    [error]       ds.writeStream.foreachBatch((_, _) => {}).trigger(Trigger.Continuous("1 second")).start()
    [error]                      ^
    [error] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/sources/ForeachBatchSinkSuite.scala:106: ambiguous reference to overloaded definition,
    [error] both method foreachBatch in class DataStreamWriter of type (function: org.apache.spark.api.java.function.VoidFunction2[org.apache.spark.sql.Dataset[Int],Long])org.apache.spark.sql.streaming.DataStreamWriter[Int]
    [error] and  method foreachBatch in class DataStreamWriter of type (function: (org.apache.spark.sql.Dataset[Int], Long) => Unit)org.apache.spark.sql.streaming.DataStreamWriter[Int]
    [error] match argument types ((org.apache.spark.sql.Dataset[Int], Any) => Unit)
    [error]       ds.writeStream.foreachBatch((_, _) => {}).partitionBy("value").start()
    [error]                      ^
    ```
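    
    For context, the ambiguity comes from `DataStreamWriter` having two `foreachBatch` overloads, one taking a Scala `(Dataset[T], Long) => Unit` function and one taking a Java `VoidFunction2[Dataset[T], java.lang.Long]`; under Scala 2.12 a lambda with unannotated parameters can be adapted to either SAM type, so the compiler cannot pick one. Below is a minimal sketch of the kind of disambiguation involved, annotating the lambda's parameter types so that only the Scala overload applies. The rate-source setup is illustrative only, not the actual `ForeachBatchSinkSuite` code.
    
    ```scala
    import org.apache.spark.sql.{Dataset, SparkSession}
    
    object ForeachBatchDisambiguation {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[2]")
          .appName("foreachBatch-disambiguation")
          .getOrCreate()
        import spark.implicits._
    
        // Any streaming Dataset[Int] works; the built-in rate source is used here
        // purely for illustration (the real suite uses MemoryStream).
        val ds: Dataset[Int] = spark.readStream
          .format("rate")
          .load()
          .select($"value".cast("int").alias("value"))
          .as[Int]
    
        // Ambiguous under Scala 2.12, because (_, _) => {} matches both the Scala
        // overload and the Java VoidFunction2 overload of foreachBatch:
        //   ds.writeStream.foreachBatch((_, _) => {}).start()
        //
        // Annotating the parameter types pins the call to the Scala overload, since
        // scala.Long does not match the java.lang.Long parameter of VoidFunction2.
        val query = ds.writeStream
          .foreachBatch((_: Dataset[Int], _: Long) => {})
          .start()
    
        query.awaitTermination(5000)  // let a couple of micro-batches run
        query.stop()
        spark.stop()
      }
    }
    ```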
    
    ## How was this patch tested?
    
    Manual.
    
    Since this failure occurs only in the Scala 2.12 profile's test cases, Jenkins will not catch it; the change has to be verified by building with Scala 2.12 and running the tests manually.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/dongjoon-hyun/spark SPARK-SCALA212

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/22649.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #22649
    
----
commit 5e0f6fc14cd468ae1d06ab40e53189fb292375c0
Author: Dongjoon Hyun <dongjoon@...>
Date:   2018-10-06T04:06:23Z

    [SPARK-25644][SS][FOLLOWUP][BUILD] Fix Scala 2.12 build error due to foreachBatch

----


---
