[ 
https://issues.apache.org/jira/browse/SPARK-10633?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen resolved SPARK-10633.
-------------------------------
    Resolution: Not A Problem

> Persisting Spark stream to MySQL - Spark tries to create the table for every 
> stream even if it exists already.
> -------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10633
>                 URL: https://issues.apache.org/jira/browse/SPARK-10633
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL, Streaming
>    Affects Versions: 1.4.0, 1.5.0
>         Environment: Ubuntu 14.04
> IntelliJ IDEA 14.1.4
> sbt
> mysql-connector-java 5.1.35 (Tested and working with Spark 1.3.1)
>            Reporter: Lunen
>
> Persisting a Spark Kafka stream to MySQL:
> Spark 1.4+ tries to create the table automatically every time the stream is 
> written to the specified table.
> Please note that the same code works on Spark 1.3.1.
> Code sample:
> import java.util.Properties
> import javax.sql.rowset.{CachedRowSet, RowSetProvider}
>
>     val url = "jdbc:mysql://host:port/db?user=user&password=password"
>     val crp = RowSetProvider.newFactory()
>     val crsSql: CachedRowSet = crp.createCachedRowSet()
>     val crsTrg: CachedRowSet = crp.createCachedRowSet()
>     crsSql.beforeFirst()
>     crsTrg.beforeFirst()
>     //Read Stream from Kafka
>     //Produce SQL INSERT STRING
>     
>     streamT.foreachRDD { rdd =>
>       if (rdd.toLocalIterator.nonEmpty) {
>         sqlContext.read.json(rdd).registerTempTable(serverEvents + "_events")
>         while (crsSql.next) {
>           sqlContext.sql("SQL INSERT STRING").write.jdbc(url, "SCHEMA_NAME", new Properties)
>           println("Persisted Data: " + "SQL INSERT STRING")
>         }
>         crsSql.beforeFirst()
>       }
>       // stmt and conn come from JDBC setup code elided above.
>       stmt.close()
>       conn.close()
>     }
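
Since the resolution is Not A Problem, the behaviour described appears to be the JDBC writer's default save mode rather than a bug: DataFrameWriter.jdbc checks whether the target table exists and, under the default SaveMode.ErrorIfExists, fails (and otherwise creates the table). A minimal sketch of appending to an already existing MySQL table on Spark 1.4+, assuming a hypothetical DataFrame built from the temp table registered in the reporter's code ("server_events" and "target_table" are placeholder names):

    import java.util.Properties
    import org.apache.spark.sql.SaveMode

    // Placeholder JDBC URL and connection properties.
    val url = "jdbc:mysql://host:port/db?user=user&password=password"
    val props = new Properties()

    // Hypothetical DataFrame derived from the registered temp table.
    val df = sqlContext.sql("SELECT * FROM server_events")

    // SaveMode.Append inserts into the existing table instead of failing
    // (or trying to create it) as the default SaveMode.ErrorIfExists does.
    df.write.mode(SaveMode.Append).jdbc(url, "target_table", props)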



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
