Interesting, does anyone know if we'll be seeing a JDBC sink in an upcoming
release?

Thanks!

Gary Lucas

On 9 April 2017 at 13:52, Silvio Fiorito <silvio.fior...@granturing.com>
wrote:

> JDBC sink is not in 2.1. You can see here for an example implementation
> using the ForeachWriter sink instead: https://databricks.com/blog/2017/04/04/real-time-end-to-end-integration-with-apache-kafka-in-apache-sparks-structured-streaming.html
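
A minimal sketch of the ForeachWriter approach described above might look
like this in Scala; the JdbcSink class name, target table, columns, and
connection handling are illustrative assumptions, not taken from the
linked post:

```scala
import java.sql.{Connection, DriverManager, PreparedStatement}
import org.apache.spark.sql.{ForeachWriter, Row}

// Hypothetical per-partition JDBC writer: opens a connection per
// partition/epoch, inserts each row, and closes the connection.
class JdbcSink(url: String, user: String, pass: String)
    extends ForeachWriter[Row] {

  private var conn: Connection = _
  private var stmt: PreparedStatement = _

  override def open(partitionId: Long, version: Long): Boolean = {
    conn = DriverManager.getConnection(url, user, pass)
    // "word_counts(word, cnt)" is an assumed target table.
    stmt = conn.prepareStatement(
      "INSERT INTO word_counts (word, cnt) VALUES (?, ?)")
    true
  }

  override def process(row: Row): Unit = {
    stmt.setString(1, row.getAs[String]("word"))
    stmt.setLong(2, row.getAs[Long]("count"))
    stmt.executeUpdate()
  }

  override def close(errorOrNull: Throwable): Unit = {
    if (stmt != null) stmt.close()
    if (conn != null) conn.close()
  }
}
```

It would then be attached with something like
`aggDF.writeStream.foreach(new JdbcSink(url, user, pass)).outputMode("complete").start()`.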
>
>
>
>
>
> *From: *Hemanth Gudela <hemanth.gud...@qvantel.com>
> *Date: *Sunday, April 9, 2017 at 4:30 PM
> *To: *"user@spark.apache.org" <user@spark.apache.org>
> *Subject: *Does spark 2.1.0 structured streaming support jdbc sink?
>
>
>
> Hello Everyone,
>
> I am new to Spark, especially Spark streaming.
>
>
>
> I am trying to read an input stream from Kafka, perform windowed
> aggregations in spark using structured streaming, and finally write
> aggregates to a sink.
>
> - MySQL as an output sink doesn’t seem to be an option, because this
> block of code throws an error:
>
> streamingDF.writeStream.format("jdbc").start("jdbc:mysql…")
>
> *java.lang.UnsupportedOperationException*: Data source jdbc does not
> support streamed writing
>
> This is strange because this
> <http://rxin.github.io/talks/2016-02-18_spark_summit_streaming.pdf>
> document shows that jdbc is supported as an output sink!
>
>
>
> - Parquet doesn’t seem to be an option, because it supports only the
> “append” output mode, not “complete”. Since I’m performing windowed
> aggregations in spark streaming, the output mode has to be “complete”
> and cannot be “append”.
>
>
>
> - Memory and console sinks are good for debugging, but are not
> suitable for production jobs.
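>
> For context, the windowed aggregation in question would look roughly
> like the following sketch; the event-time and key column names are
> assumptions:

```scala
import org.apache.spark.sql.functions.{col, window}

// Assumed schema: the Kafka stream has already been parsed into
// (eventTime: Timestamp, word: String) columns.
val aggDF = streamingDF
  .groupBy(window(col("eventTime"), "10 minutes"), col("word"))
  .count()

// A streaming aggregation like this must run in "complete" output mode
// (earlier windows keep being updated), while the parquet/file sink
// accepts only "append"; that is exactly the conflict described above.
```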
>
>
>
> So, please correct me if I’m missing something in my code to enable the
> jdbc output sink.
>
> If the jdbc output sink is not an option, please suggest an alternative
> output sink that suits my needs better.
>
>
>
> Or, since structured streaming is still ‘alpha’, should I resort to Spark
> DStreams to achieve the use case described above?
>
> Please suggest.
>
>
>
> Thanks in advance,
>
> Hemanth
>
