Hi Karthick,
If you have confirmed that the incompatibility between Delta and Spark
versions is not the cause, then I would say the same as what Jacek said
earlier: there's not enough "data" here.
To comment further, we would need to know more about how you are
structuring your multi-threaded PySpark code.
Hello,
In Spark, does a call to Window().partitionBy() cause a shuffle to
take place?
If so, what is the performance impact, if any, when the result set is large?
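For context on why partitionBy() generally implies a shuffle: Spark must co-locate all rows that share a partition key before it can evaluate the window, which means redistributing rows across executors by hashing the key. A minimal pure-Python sketch of that redistribution step (the sample rows, key, and partition count are made up for illustration; real Spark moves data between executors):

```python
# Sketch: how hash-partitioning routes rows by key -- the step behind
# the shuffle that Window().partitionBy("dept") triggers.

NUM_PARTITIONS = 4  # assumed partition count

rows = [
    ("sales", 100), ("hr", 90), ("sales", 120),
    ("eng", 200), ("hr", 95), ("eng", 180),
]

# Every row with the same key must land in the same partition,
# so rows are routed by hash(key) % NUM_PARTITIONS.
partitions = [[] for _ in range(NUM_PARTITIONS)]
for dept, salary in rows:
    partitions[hash(dept) % NUM_PARTITIONS].append((dept, salary))

# After the "shuffle", each window can be computed locally within
# its partition, e.g. a max() over each dept's rows.
for part in partitions:
    for dept in sorted({d for d, _ in part}):
        group = [s for d, s in part if d == dept]
        print(dept, max(group))
```

If the data set is large, this redistribution tends to be the dominant cost: every row is serialized and moved over the network, and a low-cardinality or skewed partition key can concentrate most rows in a few partitions.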
Thanks
Hi Farhan,
Thank you for your response. I am using Databricks with 11.3.x-scala2.12.
Here I am overwriting all the tables in the same database in concurrent
threads, but when I do it in an iterative manner it works fine.
For example, I have 200 tables in the same database; I am overwriting the
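A minimal sketch of the concurrent-overwrite pattern described above, using a thread pool. The overwrite_table body is a hypothetical stand-in for the actual Spark write (e.g. df.write.mode("overwrite").saveAsTable(...)), and the table names are made up:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def overwrite_table(table: str) -> str:
    # Hypothetical stand-in for the real Spark write, e.g.
    #   df.write.mode("overwrite").saveAsTable(f"mydb.{table}")
    return table

tables = [f"table_{i}" for i in range(200)]  # assumed 200 tables, as above

results = []
# Overwrite the tables concurrently instead of one at a time.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = {pool.submit(overwrite_table, t): t for t in tables}
    for fut in as_completed(futures):
        results.append(fut.result())  # re-raises here if a write failed

print(len(results))  # 200
```

If this pattern fails only when run concurrently but works iteratively, the cause is often shared state between the threads (for example, session-level configuration or metastore contention) rather than the write itself, so the exact error message would help narrow it down.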