[ https://issues.apache.org/jira/browse/FLINK-6281?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16119806#comment-16119806 ]

ASF GitHub Bot commented on FLINK-6281:
---------------------------------------

Github user fhueske commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3712#discussion_r132160738
  
    --- Diff: docs/dev/table/sourceSinks.md ---
    @@ -202,7 +202,38 @@ val csvTableSource = CsvTableSource
     Provided TableSinks
     -------------------
     
    -**TODO**
    +### JDBCAppendSink
    +
    +<code>JDBCAppendSink</code> allows you to bridge a data stream to a JDBC driver. The sink supports only append-only data; it does not support retractions or upserts from Flink's perspective. However, you can customize the query using <code>REPLACE</code> or <code>INSERT OVERWRITE</code> to implement an upsert inside the database.
    +
    +To use the JDBC sink, you have to add the JDBC connector dependency (<code>flink-jdbc</code>) to your project. Then you can create the sink using <code>JDBCAppendTableSink.builder()</code>:
    +
    +<div class="codetabs" markdown="1">
    +<div data-lang="java" markdown="1">
    +{% highlight java %}
    +
    +JDBCAppendTableSink sink = JDBCAppendTableSink.builder()
    +  .setDrivername("org.apache.derby.jdbc.EmbeddedDriver")
    +  .setDBUrl("jdbc:derby:memory:ebookshop")
    +  .setQuery("INSERT INTO books (id) VALUES (?)")
    +  .setFieldTypes(new TypeInformation<?>[] {INT_TYPE_INFO})
    +  .build();
    +{% endhighlight %}
    +</div>
    +
    +<div data-lang="scala" markdown="1">
    +{% highlight scala %}
    +val sink = JDBCAppendTableSink.builder()
    +  .setDrivername("org.apache.derby.jdbc.EmbeddedDriver")
    +  .setDBUrl("jdbc:derby:memory:ebookshop")
    +  .setQuery("INSERT INTO books (id) VALUES (?)")
    +  .setFieldTypes(Array(INT_TYPE_INFO))
    --- End diff ---
    
    use varargs?
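
    The suggestion above is to accept the field types as varargs rather than an explicit array, so callers can skip the `Array(...)` / `new TypeInformation<?>[] {...}` wrapping. A minimal, self-contained sketch of that builder style (the class and method names here are hypothetical illustrations, not Flink's actual API):

    ```java
    // Hypothetical builder demonstrating a varargs setter: the compiler packs
    // the arguments into an array, so .setFieldTypes(A, B) replaces
    // .setFieldTypes(new Type[] {A, B}).
    public class SinkBuilder {
        private String[] fieldTypes = new String[0];

        // Varargs setter; returns this for fluent chaining.
        public SinkBuilder setFieldTypes(String... types) {
            this.fieldTypes = types;
            return this;
        }

        public String[] getFieldTypes() {
            return fieldTypes;
        }

        public static void main(String[] args) {
            SinkBuilder b = new SinkBuilder().setFieldTypes("INT", "VARCHAR");
            System.out.println(b.getFieldTypes().length); // prints 2
        }
    }
    ```

    With this shape, both the Java and Scala snippets in the diff could drop the explicit array construction.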


> Create TableSink for JDBC
> -------------------------
>
>                 Key: FLINK-6281
>                 URL: https://issues.apache.org/jira/browse/FLINK-6281
>             Project: Flink
>          Issue Type: Improvement
>          Components: Table API & SQL
>            Reporter: Haohui Mai
>            Assignee: Haohui Mai
>
> It would be nice to integrate the table APIs with the JDBC connectors so that 
> the rows in the tables can be directly pushed into JDBC.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
