How do you do this with Structured Streaming? I see no mention of writing
to Kafka.
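For what it's worth, here is a minimal sketch of writing a stream out to Kafka. This assumes a Spark version with the built-in "kafka" sink (Spark 2.2+, via the spark-sql-kafka-0-10 connector); on Spark 2.1 you would need a ForeachWriter instead. The broker address and topic names below are placeholders.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("stream-to-kafka").getOrCreate()

// Read a stream from one Kafka topic (placeholder broker/topic).
val input = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "source-topic")
  .load()

// The Kafka sink expects a "value" column (and optionally "key"),
// as string or binary; a checkpoint location is required.
val query = input
  .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
  .writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("topic", "sink-topic")
  .option("checkpointLocation", "/tmp/kafka-sink-ckpt")
  .start()
```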

On Fri, Jan 13, 2017 at 10:30 AM, Peyman Mohajerian <mohaj...@gmail.com>
wrote:

> Yes, it is called Structured Streaming:
> https://docs.databricks.com/_static/notebooks/structured-streaming-kafka.html
> http://spark.apache.org/docs/latest/structured-streaming-programming-guide.html
>
> On Fri, Jan 13, 2017 at 3:32 AM, Senthil Kumar <senthilec...@gmail.com>
> wrote:
>
>> Hi Team ,
>>
>>      Sorry if this question has already been asked in this forum.
>>
>> Can we ingest data into an Apache Kafka topic from a Spark SQL DataFrame?
>>
>> Here is my code, which reads a Parquet file:
>>
>> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
>>
>> val df = sqlContext.read.parquet("..../temp/*.parquet")
>>
>> df.registerTempTable("beacons")
>>
>>
>> I want to ingest the df DataFrame directly into Kafka. Is there any way
>> to achieve this?
>>
>>
>> Cheers,
>>
>> Senthil
>>
>
>
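For the original batch question: a non-streaming DataFrame can also be written straight to Kafka with the same connector. This is a sketch assuming Spark 2.2+ with spark-sql-kafka-0-10 on the classpath; the broker address and the "beacons" topic name are placeholders, and the Parquet path is kept from the original post.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{to_json, struct}

val spark = SparkSession.builder.appName("parquet-to-kafka").getOrCreate()
import spark.implicits._

val df = spark.read.parquet("..../temp/*.parquet")

// The Kafka writer requires a "value" column (string or binary);
// here each row is serialized to a JSON string.
df.select(to_json(struct($"*")).as("value"))
  .write
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("topic", "beacons")
  .save()
```

On versions before the Kafka sink existed, the usual workaround was df.foreachPartition with a plain KafkaProducer created inside each partition.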
