Re: How to branch a Stream / have multiple Sinks / do multiple Queries on one Stream

2018-07-06 Thread Amiya Mishra
Hi Tathagata, Is there any limitation in the code below when writing to multiple files? val inputdf: DataFrame = sparkSession.readStream.schema(schema).format("csv").option("delimiter",",").csv("src/main/streamingInput") query1 =
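For reference, a minimal sketch of how the truncated snippet presumably continues; the schema fields, sink formats, output paths, and checkpoint locations are assumptions for illustration, not details from the original message:

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.types._

val sparkSession = SparkSession.builder().appName("multi-sink-sketch").getOrCreate()

// Assumed schema for the CSV input; the real one is not shown in the message.
val schema = new StructType()
  .add("id", IntegerType)
  .add("value", StringType)

val inputdf: DataFrame = sparkSession.readStream
  .schema(schema)
  .option("delimiter", ",")
  .csv("src/main/streamingInput")

// Each sink becomes its own streaming query with its own checkpoint location.
val query1 = inputdf.writeStream
  .format("parquet")
  .option("path", "out/sink1")
  .option("checkpointLocation", "chk/sink1")
  .start()

val query2 = inputdf.writeStream
  .format("json")
  .option("path", "out/sink2")
  .option("checkpointLocation", "chk/sink2")
  .start()

// Block until the queries finish; see the awaitAnyTermination() note in the 2018-07-05 message.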

Re: How to branch a Stream / have multiple Sinks / do multiple Queries on one Stream

2018-07-05 Thread Amiya Mishra
Hi Chandan/Jürgen, I have tried this with native code, using a single input DataFrame with multiple sinks: Spark provides a method called awaitAnyTermination() in StreamingQueryManager.scala which provides all the details required to handle the queries processed by Spark. By observing
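A rough sketch of that pattern, with console sinks and query names chosen purely for illustration: each start() registers a StreamingQuery with the StreamingQueryManager available as sparkSession.streams, and awaitAnyTermination() blocks until any one of the registered queries terminates or fails.

// inputdf is the single streaming DataFrame from the earlier snippet.
val q1 = inputdf.writeStream.format("console").queryName("sink-1").start()
val q2 = inputdf.writeStream.format("console").queryName("sink-2").start()

// StreamingQueryManager: returns (or rethrows the error) as soon as any registered query terminates.
sparkSession.streams.awaitAnyTermination()

// Alternative: wait for each query individually.
// q1.awaitTermination()
// q2.awaitTermination()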

Re: How to branch a Stream / have multiple Sinks / do multiple Queries on one Stream

2018-06-13 Thread Amiya Mishra
Hi Jürgen, Have you found any solution or workaround for writing to multiple sinks from a single source, since we cannot process multiple sinks at a time? I also have an ETL scenario where we use a clone component with multiple sinks on a single input streaming DataFrame. Can you keep posting once you

Reason behind mapping of StringType with CLOB nullType

2017-01-30 Thread Amiya Mishra
Hi, I am new to Spark SQL. I see the following mapping in JdbcUtils.scala, at line 125: *case StringType => Option(JdbcType("TEXT", java.sql.Types.CLOB))* This says StringType maps to the JDBC database type "TEXT" with JDBC null type CLOB, which internally takes 2005
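For context, 2005 is simply the integer value of the java.sql.Types.CLOB constant. If the TEXT/CLOB default is not appropriate for a given database, one common workaround (sketched here with an assumed dialect name and JDBC URL prefix) is to register a custom JdbcDialect whose getJDBCType overrides the StringType mapping:

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
import org.apache.spark.sql.types._

// Hypothetical dialect: map StringType to VARCHAR instead of TEXT/CLOB.
object VarcharStringDialect extends JdbcDialect {
  // Only apply this dialect to the (assumed) target database.
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:postgresql")

  override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
    case StringType => Some(JdbcType("VARCHAR(4000)", java.sql.Types.VARCHAR))
    case _          => None // defer to Spark's default mappings in JdbcUtils
  }
}

JdbcDialects.registerDialect(VarcharStringDialect)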