Hi,
We have a use case where data from specific Kafka partitions needs to be
assigned to specific executor nodes.
In Spark Streaming (the DStream API), this can be achieved with
LocationStrategies.PreferFixed. How do we achieve the same in Structured
Streaming?
*Spark Streaming*
Map<TopicPartition, String> partitionMapToHost = new HashMap<>();
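For context, the DStream-side setup we use looks roughly like the sketch below (topic and host names are made up for illustration; the real map is built from our cluster topology):

```scala
import org.apache.kafka.common.TopicPartition
import org.apache.spark.streaming.kafka010.{LocationStrategies, LocationStrategy}

// Hypothetical topic and host names -- substitute your own topology.
val partitionMapToHost: collection.Map[TopicPartition, String] = Map(
  new TopicPartition("events", 0) -> "host-1.example.com",
  new TopicPartition("events", 1) -> "host-2.example.com"
)

// Pin each Kafka partition's consumer to a preferred executor host.
val locationStrategy: LocationStrategy =
  LocationStrategies.PreferFixed(partitionMapToHost)

// locationStrategy is then passed to KafkaUtils.createDirectStream(...)
```

We could not find an equivalent hook in the Structured Streaming Kafka source, which is what prompts the question.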
Hi,
When we execute a drop partition command on a Hive external table from
spark-shell, we get the error below. The same command works fine from the
Hive shell.
It is a table with just two records.
Spark version: 1.5.2
scala> hiveCtx.sql("select * from spark_2_test").collect().foreach(println);