Spark Streaming to JDBC

2021-09-03 Thread igyu
val lines = spark.readStream
  .format("socket")
  // .schema(StructType(schemas))
  .option("host", "10.3.87.23")
  .option("port", )
  .load()
  .selectExpr("CAST(value AS STRING)").as[String]

val DF = lines.map(x => {
  val obj = JSON.parseObject(x)
  val ls = new util.ArrayList()
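The message above is cut off mid-statement. For context, a minimal sketch of writing a socket stream to a JDBC sink with Structured Streaming's foreachBatch follows; the port, JDBC URL, table name, and credentials are placeholders assumed for illustration, not taken from the thread:

```scala
// Sketch: write each micro-batch of a socket stream to a JDBC table.
// Host/port, URL, table, and credentials below are placeholders.
import org.apache.spark.sql.{Dataset, SparkSession}

val spark = SparkSession.builder().appName("stream-to-jdbc").getOrCreate()
import spark.implicits._

val lines = spark.readStream
  .format("socket")
  .option("host", "10.3.87.23")
  .option("port", 9999) // placeholder port
  .load()
  .selectExpr("CAST(value AS STRING)").as[String]

val query = lines.writeStream
  .foreachBatch { (batch: Dataset[String], batchId: Long) =>
    // Each micro-batch is a static Dataset, so the batch JDBC writer applies.
    batch.toDF("value").write
      .format("jdbc")
      .option("url", "jdbc:mysql://db-host:3306/mydb") // placeholder
      .option("dbtable", "events")                     // placeholder
      .option("user", "user")
      .option("password", "pass")
      .mode("append")
      .save()
  }
  .start()

query.awaitTermination()
```

foreachBatch (Spark 2.4+) is the usual route here because the streaming writer has no built-in "jdbc" format.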

Spark Streaming Reusing JDBC Connections

2014-12-05 Thread Asim Jalis
Is there a way I can keep a JDBC connection open throughout a streaming job? I have a foreach which runs once per batch. However, I don't want to open a new connection for each batch; I'd rather have a persistent connection that I can reuse. How can I do this? Thanks. Asim

RE: Spark Streaming Reusing JDBC Connections

2014-12-05 Thread Ashic Mahtab
I've done this:
1. foreachPartition
2. Open connection.
3. foreach inside the partition.
4. Close the connection.
Slightly crufty, but works. Would love to see a better approach. Regards, Ashic.
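Ashic's steps can be sketched in Scala against the DStream API; the JDBC URL, table, and credentials here are placeholders assumed for illustration:

```scala
// Sketch of the per-partition connection pattern described above.
// Connection string and SQL are placeholders, not from the thread.
import java.sql.DriverManager

dstream.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    // One connection per partition, reused for every record in it,
    // and created on the executor (so it is never serialized).
    val conn = DriverManager.getConnection(
      "jdbc:postgresql://db-host/mydb", "user", "pass") // placeholder
    val stmt = conn.prepareStatement("INSERT INTO events(value) VALUES (?)")
    try {
      records.foreach { r =>
        stmt.setString(1, r.toString)
        stmt.executeUpdate()
      }
    } finally {
      stmt.close()
      conn.close()
    }
  }
}
```

The commonly cited refinement is a lazily initialized, per-executor connection pool (a singleton object holding, e.g., a pooled DataSource), so partitions borrow and return connections instead of opening and closing one each time.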