Hi all,
I'm curious how foreachBatch works in Spark Structured Streaming. Since
it takes a micro-batch DataFrame, does that mean the code inside
foreachBatch executes on the Spark driver? If so, for large batches,
could you hit OOM issues from collecting each partition onto the
driver?
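For what it's worth, the foreachBatch function itself is invoked on the driver, but any Spark actions you call on the batch DataFrame (write, count, etc.) run as normal distributed jobs on the executors; only explicit calls like collect() pull all rows into driver memory. A minimal sketch of a batch handler (the function name and output path are illustrative, not from the thread):

```python
# Sketch of a foreachBatch handler, assuming PySpark.
# The handler body runs on the driver, but batch_df.write below is a
# distributed action: each executor writes its own partitions, and the
# rows never pass through the driver. Only something like
# batch_df.collect() would materialize the whole batch in driver memory.

def process_batch(batch_df, batch_id):
    # Distributed write; no driver-side collection involved.
    batch_df.write.mode("append").parquet(f"/tmp/out/batch={batch_id}")

# Wiring (requires a streaming DataFrame `stream_df` and a SparkSession):
#   stream_df.writeStream.foreachBatch(process_batch).start()
```

So the OOM risk comes from what you do inside the handler, not from foreachBatch itself.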
Hi All,
I'm trying to submit my application using spark-submit in YARN mode,
but it's failing with "unknown queue default", even though we set the
queue name in spark-defaults.conf as spark.yarn.queue SecondaryQueue.
It fails for one application but works for another, and I don't know
why.
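Not from the thread, but a hedged suggestion: "unknown queue default" means that particular submission fell back to the "default" queue, so spark.yarn.queue from spark-defaults.conf is apparently not being picked up for that app (e.g. it runs with a different SPARK_CONF_DIR, or a conflicting --conf). Passing the queue explicitly on the command line can rule that out:

```shell
# In spark-defaults.conf (queue name as given in the question):
#   spark.yarn.queue  SecondaryQueue

# Explicit override at submit time; --queue is a standard
# spark-submit option for YARN and takes precedence over the conf file.
spark-submit \
  --master yarn \
  --queue SecondaryQueue \
  --class com.example.MyApp \  # placeholder class/jar, not from the thread
  myapp.jar
```

If the explicit --queue works, compare the effective configuration of the two applications (the Environment tab in the Spark UI shows which spark.yarn.queue each one actually used).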
Hi Venkata,
Thanks for your reply. I am using HDP 2.6, and I don't think the above
will work for me. Any other suggestions? Thanks
On Thu, Mar 5, 2020 at 8:24 AM venkata naidu udamala <
vudamala.gyan...@gmail.com> wrote:
> You can try using the Hive Warehouse Connector
>