Hi Ruslan,

Here is some sample code which writes a DataFrame to a table in a Derby
database:

import org.apache.spark.sql._
import org.apache.spark.sql.types._

val binaryVal = Array[Byte](1, 2, 3, 4)
val timestampVal = java.sql.Timestamp.valueOf("1996-01-01 03:30:36")
val dateVal = java.sql.Date.valueOf("1996-01-01")

val allTypes = sc.parallelize(
    Array(
      (1,
      1.toLong,
      1.toDouble,
      1.toFloat,
      1.toShort,
      1.toByte,
      "true".toBoolean,
      "one ring to rule them all",
      binaryVal,
      timestampVal,
      dateVal,
      BigDecimal.valueOf(42549.12)
      )
    )).toDF(
      "int_col",
      "long_col",
      "double_col",
      "float_col",
      "short_col",
      "byte_col",
      "boolean_col",
      "string_col",
      "binary_col",
      "timestamp_col",
      "date_col",
      "decimal_col"
      )

val properties = new java.util.Properties()

allTypes.write.jdbc("jdbc:derby:/Users/rhillegas/derby/databases/derby1",
"all_spark_types", properties)
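To your broader question: the same DataFrameWriter.jdbc() call works against any database with a JDBC driver on the classpath, including Oracle. Here is a sketch of what that might look like; note that the Oracle connection URL, the credentials, and the table name below are placeholders I made up, not something I have tested:

```scala
import java.util.Properties
import org.apache.spark.sql.SaveMode

val oracleProps = new Properties()
oracleProps.setProperty("user", "scott")      // placeholder credentials
oracleProps.setProperty("password", "tiger")

// Placeholder Oracle thin-driver URL; adjust host, port, and service name.
val oracleUrl = "jdbc:oracle:thin:@//dbhost:1521/ORCL"

// SaveMode.Append adds rows to an existing table instead of
// failing when the table already exists (the default behavior).
allTypes.write
  .mode(SaveMode.Append)
  .jdbc(oracleUrl, "ALL_SPARK_TYPES", oracleProps)

// Read the table back into a DataFrame to verify the round trip:
val roundTrip = sqlContext.read.jdbc(oracleUrl, "ALL_SPARK_TYPES", oracleProps)
roundTrip.show()
```

You will need the Oracle JDBC driver jar on the driver and executor classpaths, e.g. via --jars when launching spark-shell.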

Hope this helps,

Rick Hillegas
STSM, IBM Analytics, Platform - IBM USA


Ruslan Dautkhanov <dautkha...@gmail.com> wrote on 10/05/2015 02:44:20 PM:

> From: Ruslan Dautkhanov <dautkha...@gmail.com>
> To: user <user@spark.apache.org>
> Date: 10/05/2015 02:45 PM
> Subject: save DF to JDBC
>
> http://spark.apache.org/docs/latest/sql-programming-guide.html#jdbc-
> to-other-databases
>
> Spark JDBC can read data from JDBC, but can it save back to JDBC?
> Like to an Oracle database through its jdbc driver.
>
> Also looked at SQL Context documentation
> https://spark.apache.org/docs/1.4.0/api/java/org/apache/spark/sql/
> SQLContext.html
> and can't find anything relevant.
>
> Thanks!
>
>
> --
> Ruslan Dautkhanov
