Hi Anna,
Another classic problem source is the classpath. Check it and look for any
Spark artifact whose version is not 2.2.2...
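One way to check this (a sketch, assuming a Maven build like the one used later in this thread) is to print the resolved dependency tree and scan for mismatched Spark versions:

```shell
# List every resolved org.apache.spark artifact in the project;
# all listed versions should read 2.2.2
mvn dependency:tree -Dincludes=org.apache.spark
```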
BR,
G
On Wed, Mar 20, 2019 at 12:56 AM anna stax wrote:
Hi Gabor,
I am just trying out. Desperately trying to make writing to kafka work
using spark-sql-kafka.
In my deployment project I do use the provided scope.
On Tue, Mar 19, 2019 at 8:50 AM Gabor Somogyi wrote:
Hi Anna,
Looks like some sort of version mismatch.
I presume the Scala version has been double-checked...
Not sure why the mentioned artifacts are not in provided scope.
It will end up in a significantly smaller jar + these artifacts should be
available on the cluster (either by default, for example core, or due
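A sketch of what that scope change looks like in the pom (coordinates taken from the dependencies quoted elsewhere in this thread; the scope line is the suggested addition):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.2.2</version>
  <!-- provided: available on the cluster at runtime, excluded from the fat jar -->
  <scope>provided</scope>
</dependency>
```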
Hi Gabor,
Thank you for the response.
I do have those dependencies added.
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.2.2</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.2.2</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
  <version>2.2.2</version>
</dependency>
and my kafka version is kafka_2.11-1.1.0
On Tue, Mar 19, 2019 at
Hi Anna,
Have you added spark-sql-kafka-0-10_2.11:2.2.0 package as well?
Further info can be found here:
https://spark.apache.org/docs/2.2.0/structured-streaming-kafka-integration.html#deploying
The same --packages option can be used with spark-shell as well...
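For example (a sketch; the coordinates follow the 2.2.0 docs linked above, and `my-app.jar` is a placeholder for your own application jar -- adjust the version to match your Spark build):

```shell
# Pull the Kafka sink and its transitive dependencies at launch time
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.2.0

# The same flag works for spark-submit
spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.2.0 my-app.jar
```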
BR,
G
On Mon, Mar 18, 2019 at
Hi all,
I am unable to write the contents of a Spark DataFrame to Kafka.
I am using Spark 2.2.
This is my code
val df = Seq(("1","One"),("2","two")).toDF("key","value")
df.printSchema()
df.show(false)
df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
.write
.format("kafka")