This interface is actually unstable. The v2 of the DataSource API is being
designed right now and will become public and stable in a release or two. So,
unfortunately, there is no stable interface right now that I can officially
recommend.

That said, you could always use the ForeachWriter interface (see
DataStreamWriter.foreach).
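
For example, a bare-bones ForeachWriter looks roughly like this (a minimal
sketch; the println calls just stand in for whatever KairosDB client calls
you would actually make):

  import org.apache.spark.sql.{ForeachWriter, Row}

  // open/process/close are the full ForeachWriter contract; one writer
  // instance handles one partition of one epoch.
  class KairosDBForeachWriter extends ForeachWriter[Row] {

    override def open(partitionId: Long, epochId: Long): Boolean = {
      // Open your KairosDB connection here.
      println(s"open partition=$partitionId epoch=$epochId")
      true  // return false to skip this partition for this epoch
    }

    override def process(row: Row): Unit = {
      // Push a single row to KairosDB here.
      println(row)
    }

    override def close(errorOrNull: Throwable): Unit = {
      // Release the connection; errorOrNull is non-null if processing failed.
      println(s"close error=$errorOrNull")
    }
  }

  // Usage:
  // df.writeStream.foreach(new KairosDBForeachWriter).start()
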
Also, in the next release, you will have a foreachBatch interface that
allows you to run custom operations on the output of each micro-batch,
represented as a DataFrame (exactly the same as Sink.addBatch).
Both of these should be useful for you until the interfaces are stabilized.
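
Once foreachBatch is available, usage will look roughly like this (again a
sketch, not something tested against a released build; the rate source and
the println are placeholders for your real input and your KairosDB write):

  import org.apache.spark.sql.{DataFrame, SparkSession}

  object ForeachBatchSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder().appName("foreachBatch-sketch").getOrCreate()

      // Placeholder streaming source; substitute your real input.
      val streamDF = spark.readStream.format("rate").load()

      val query = streamDF.writeStream
        .foreachBatch { (batchDF: DataFrame, batchId: Long) =>
          // Each micro-batch arrives as an ordinary DataFrame, so any batch
          // API works here, e.g. writing it out with a KairosDB client.
          println(s"batch $batchId contains ${batchDF.count()} rows")
        }
        .start()

      query.awaitTermination()
    }
  }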

On Mon, Jun 25, 2018 at 9:55 AM, subramgr <subramanian.gir...@gmail.com>
wrote:

> We are using Spark 2.3 and would like to know whether it is recommended to
> create a custom KairoDBSink by implementing the StreamSinkProvider?
>
> The interface is marked experimental and unstable?
