Hi Gabor,
I am just trying it out, and desperately trying to make writing to Kafka work
using spark-sql-kafka.
In my deployment project I do use the provided scope.
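The write path in question, per the Structured Streaming Kafka integration guide, looks roughly like this; the source, bootstrap server, topic, and checkpoint path below are placeholders, not values from this thread:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("kafka-sink").getOrCreate()

// Placeholder source for illustration; "rate" emits (timestamp, value) rows.
val df = spark.readStream
  .format("rate")
  .load()
  .selectExpr("CAST(value AS STRING) AS key", "CAST(value AS STRING) AS value")

// Kafka sink from spark-sql-kafka-0-10; broker and topic are placeholders.
df.writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092")
  .option("topic", "output-topic")
  .option("checkpointLocation", "/tmp/kafka-sink-ckpt")
  .start()
```

The sink requires `key`/`value` string or binary columns and a checkpoint location; a missing checkpoint is a common cause of startup failures here.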
On Tue, Mar 19, 2019 at 8:50 AM Gabor Somogyi
wrote:
> Hi Anna,
>
> Looks like some sort of version mismatch.
>
> Presume scala version double checked...
Hi,
is the structured streaming kinesis connector available as an open source
package?
Thanks and Regards,
Gourav Sengupta
On Mon, Nov 13, 2017 at 11:30 PM Benjamin Kim wrote:
> To add, we have a CDH 5.12 cluster with Spark 2.2 in our data center.
>
> On Mon, Nov 13, 2017 at 3:15 PM
Hi,
We have a use case where specific Kafka partition data needs to be assigned
to a specific executor node.
In Spark Streaming, this can be achieved using
LocationStrategies.PreferFixed. How do we achieve the same in Structured
Streaming?
*Spark Streaming*
Map partitionMapToHost = new
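For reference, the Spark Streaming approach mentioned above looks roughly like this in Scala; topic and host names are placeholders:

```scala
import java.util.{HashMap => JHashMap}
import org.apache.kafka.common.TopicPartition
import org.apache.spark.streaming.kafka010.{LocationStrategies, LocationStrategy}

// Pin each Kafka partition to a preferred executor host.
// Topic name and hosts below are placeholders.
val partitionMapToHost = new JHashMap[TopicPartition, String]()
partitionMapToHost.put(new TopicPartition("events", 0), "host-1")
partitionMapToHost.put(new TopicPartition("events", 1), "host-2")

// Pass this strategy to KafkaUtils.createDirectStream.
val strategy: LocationStrategy = LocationStrategies.PreferFixed(partitionMapToHost)
```

Structured Streaming's Kafka source does not expose an equivalent placement option, which is presumably why the question is being asked.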
Hi Anna,
Looks like some sort of version mismatch.
Presume scala version double checked...
Not sure why the mentioned artifacts are not in provided scope.
It will end up in a significantly smaller jar + these artifacts should be
available on the cluster (either by default, for example core, or due
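Marking the Spark artifacts as provided, as suggested, would look like this in the pom (version taken from the thread; this is a sketch, not the poster's actual pom):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.2.2</version>
  <scope>provided</scope>
</dependency>
```

With `provided` scope the artifact is available at compile time but excluded from the packaged jar, since the cluster supplies it at runtime.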
Hi Gabor,
Thank you for the response.
I do have those dependencies added.
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.2.2</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.2.2</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
  <version>2.2.2</version>
</dependency>
and my kafka version is kafka_2.11-1.1.0
On Tue, Mar 19, 2019 at
Hi All ,
I am using Spark 2.2 in an EMR cluster. I have a Hive table in ORC format and
I need to create a persistent view on top of this Hive table. I am using
Spark SQL to create the view.
By default Spark SQL creates the view with LazySerde. How can I change the
input format to use ORC?
PFA
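One way to confirm which SerDe and input format the view actually recorded is to inspect its metadata; the view name below is a placeholder:

```scala
// Shows the storage section of the view's metadata, including SerDe
// and InputFormat; "my_view" is a placeholder name.
spark.sql("DESCRIBE FORMATTED my_view").show(100, truncate = false)
```

This at least makes the LazySerde behaviour visible before attempting to change it.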
Hi Anna,
Have you added the spark-sql-kafka-0-10_2.11:2.2.0 package as well?
Further info can be found here:
https://spark.apache.org/docs/2.2.0/structured-streaming-kafka-integration.html#deploying
The same --packages option can be used with spark-shell as well...
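Concretely, the deployment section of that page launches the shell like this (coordinates as given on the linked 2.2.0 page):

```
# Resolves the Kafka source/sink and its transitive deps at launch time
./bin/spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.2.0
```

The same `--packages` flag works for `spark-submit` as well.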
BR,
G
On Mon, Mar 18, 2019 at