aimtsou commented on PR #37817:
URL: https://github.com/apache/spark/pull/37817#issuecomment-1445189813
Thank you @srowen, really appreciated
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
aimtsou commented on PR #37817:
URL: https://github.com/apache/spark/pull/37817#issuecomment-1444586606
Yes, we agree that users can limit their system numpy installation to < 1.20.0 if they use Spark 3.3.
I will have to check and test the different versions, but I believe according
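The version boundary being discussed can be sketched as a small check. This is a minimal sketch, not code from the PR: `numpy_needs_pin` is a hypothetical helper name, and it assumes plain `major.minor.patch` version strings (pre-release suffixes like `1.20.0rc1` would need extra parsing).

```python
def numpy_needs_pin(version: str) -> bool:
    """Return True if this numpy version is >= 1.20.0, i.e. too new for
    Spark 3.3's use of the deprecated builtin aliases (np.bool etc.)."""
    # Compare only major.minor; assumes a plain "major.minor.patch" string.
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (1, 20)

print(numpy_needs_pin("1.19.5"))  # old enough for Spark 3.3
print(numpy_needs_pin("1.24.2"))  # needs the < 1.20.0 pin
```

With this gate, `1.19.x` passes while `1.20.0` and anything newer would be flagged for pinning.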
aimtsou commented on PR #37817:
URL: https://github.com/apache/spark/pull/37817#issuecomment-1444549904
Hi @srowen,
Thank you for your very prompt reply.
You are not correct about the error: after 1.20.0 it raises an attribute
error
```
if attr in
```
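The truncated `if attr in` branch above is the shape of NumPy's expired-deprecation path: once a builtin alias like `np.bool` is gone, a module-level `__getattr__` (PEP 562) looks the name up in a table of former attributes and raises `AttributeError` with a migration hint. Below is a simplified, self-contained sketch of that mechanism, not NumPy's actual source; the alias table and messages are illustrative.

```python
import types

# Illustrative table of removed aliases, mirroring the quoted `if attr in` check.
_former_attrs = {
    "bool": "`np.bool` was removed; use the builtin `bool` or `np.bool_` instead.",
}

fake_np = types.ModuleType("fake_np")

def _module_getattr(attr):
    # Called only when normal attribute lookup on the module fails (PEP 562).
    if attr in _former_attrs:
        raise AttributeError(_former_attrs[attr])
    raise AttributeError(f"module 'fake_np' has no attribute {attr!r}")

# Placing __getattr__ in the module's dict enables the PEP 562 fallback.
fake_np.__getattr__ = _module_getattr

try:
    fake_np.bool  # mimics `np.bool` on a NumPy where the alias is removed
except AttributeError as exc:
    print(exc)
```

Accessing `fake_np.bool` prints the migration message instead of returning a value, which is the behavior the comment above describes hitting.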
aimtsou commented on PR #37817:
URL: https://github.com/apache/spark/pull/37817#issuecomment-1444515612
@srowen: This is causing an issue, though: if you try to build your own Docker image of Spark including pyspark while
trying to stay compliant with Databricks, you will observe that
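For that image-building scenario, a pin like the one discussed above could look as follows. This is a config sketch, not taken from the PR: the pyspark version specifier is an assumption.

```shell
# Illustrative install step for a custom Spark 3.3 image: keep numpy below
# 1.20.0 so the deprecated builtin aliases (np.bool etc.) still exist.
# The pyspark version specifier here is an assumption.
pip install "pyspark==3.3.*" "numpy<1.20.0"
```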