Subject: Re: How to properly execute `foreachPartition` in Spark 2.2
Spark Dataset / DataFrame has foreachPartition() as well. Its implementation is
much more efficient than the RDD one.
There are tons of code snippets, e.g.
https://github.com/hdinsight/spark-streaming-data-persi
Stream to Kafka?
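A minimal sketch of `foreachPartition` on a Dataset in Spark 2.2 (assuming a local `SparkSession`; the connection and its `send` call are hypothetical stand-ins for a real sink client):

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: Dataset.foreachPartition in Spark 2.2.
val spark = SparkSession.builder().appName("ds-foreachPartition").getOrCreate()
import spark.implicits._

val ds = Seq("a", "b", "c").toDS()

ds.foreachPartition { partition: Iterator[String] =>
  // One connection per partition, reused for every record in it.
  partition.foreach { record =>
    // connection.send(record)  // hypothetical sink call
  }
}
```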
> From: Liana Napalkova <liana.napalk...@eurecat.org>
> Date: Monday, December 18, 2017 at 10:07 AM
> To: Silvio Fiorito <silvio.fior...@granturing.com>, "user@spark.apache.org" <user@spark.apache.org>
> Subject: Re: How to properly execute `foreachPartition` in Spark 2.2
If there is no other way, then I will follow this recommendation.
From: Silvio Fiorito <silvio.fior...@granturing.com>
Sent: 18 December 2017 16:20:03
To: Liana Napalkova; user@spark.apache.org
Subject: Re: How to properly execute `foreachPartition` in Spark 2.2
To: Silvio Fiorito <silvio.fior...@granturing.com>, "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: How to properly execute `foreachPartition` in Spark 2.2
I first need to read from a Kafka queue into a DataFrame. Then I should perform
some transformations with the data. Finally, f
> *To:* Liana Napalkova; user@spark.apache.org
> *Subject:* Re: How to properly execute `foreachPartition` in Spark 2.2
>
> Why don’t you just use the Kafka sink for Spark 2.2?
>
> https://spark.apache.org/docs/2.2.0/structured-streaming-kafka-integration.html
From: Silvio Fiorito <silvio.fior...@granturing.com>
Sent: 18 December 2017 16:00:39
To: Liana Napalkova; user@spark.apache.org
Subject: Re: How to properly execute `foreachPartition` in Spark 2.2
Why don’t you just use the Kafka sink for Spark 2.2?
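The built-in Kafka sink in Spark 2.2 is just a `writeStream` format. A minimal sketch, assuming the spark-sql-kafka-0-10 package is on the classpath; the broker address, topic names, and checkpoint path are placeholders:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: Structured Streaming Kafka source and sink in Spark 2.2.
val spark = SparkSession.builder()
  .appName("kafka-sink-sketch")   // hypothetical app name
  .getOrCreate()

// Read a stream from one Kafka topic...
val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
  .option("subscribe", "input-topic")                  // placeholder topic
  .load()

// ...transform, then write it straight back out with the Kafka sink.
df.selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")
  .writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("topic", "output-topic")                      // placeholder topic
  .option("checkpointLocation", "/tmp/kafka-sink-ckpt") // placeholder path
  .start()
  .awaitTermination()
```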
To: "user@spark.apache.org" <user@spark.apache.org>
Subject: How to properly execute `foreachPartition` in Spark 2.2
Hi,
I wonder how to properly execute `foreachPartition` in Spark 2.2. Below I
explain the problem in detail. I appreciate any help.
In Spark 1.6 I was doing something similar to this:
DstreamFromKafka.foreachRDD(session => {
  session.foreachPartition { partitionOfRecords =>
    // e.g. open a connection, send each record, close the connection
  }
})
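In Spark 2.2 Structured Streaming, the closest replacement for this pattern is the `foreach` sink: a `ForeachWriter` receives `open`/`process`/`close` calls, so a connection can be opened once per partition, reused for every record, and then closed. A minimal sketch; the connection calls are hypothetical stand-ins, and `streamingDf` is assumed to be a streaming DataFrame already in scope:

```scala
import org.apache.spark.sql.{ForeachWriter, Row}

// Sketch only: Spark 2.2's foreach sink as the structured-streaming
// counterpart of foreachPartition-style per-partition connections.
val writer = new ForeachWriter[Row] {
  override def open(partitionId: Long, version: Long): Boolean = {
    // open one connection per partition here (hypothetical)
    true // returning true means "process this partition"
  }
  override def process(record: Row): Unit = {
    // connection.send(record)  // hypothetical sink call
  }
  override def close(errorOrNull: Throwable): Unit = {
    // close the per-partition connection here
  }
}

// streamingDf.writeStream.foreach(writer).start()
```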