> Which Python version will run that stored procedure?
>
> All Python versions supported in PySpark
>
Where in the stored procedure is the exact Python version that will run
the code defined? That was the question.
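To make the question concrete: in PySpark the worker interpreter is chosen by cluster/session configuration (`spark.pyspark.python` or the `PYSPARK_PYTHON` environment variable, which are real Spark settings), not by anything written in the procedure body. A rough illustrative sketch of that lookup order (`resolve_pyspark_python` is a hypothetical helper, not Spark's actual code):

```python
import os

def resolve_pyspark_python(conf):
    """Approximate the order in which PySpark picks the worker Python:
    explicit SparkConf setting, then environment variable, then a
    plain fallback on the worker machine."""
    return (conf.get("spark.pyspark.python")      # SparkConf setting
            or os.environ.get("PYSPARK_PYTHON")   # environment variable
            or "python3")                         # worker-side fallback

# Nothing in the stored procedure text itself determines this:
print(resolve_pyspark_python({"spark.pyspark.python": "/opt/py310/bin/python"}))
```

The point of the sketch is that the answer lives entirely outside the procedure definition, which is exactly the gap the question is pointing at.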
> How to manage external dependencies?
>
> Existing way we have
> https://spark.apache.
-1
Great idea to ignore the experience of others and copy bad practices back
for nothing.
If you are familiar with the Python ecosystem, then you should answer
these questions:
1. Which Python version will run that stored procedure?
2. How to manage external dependencies?
3. How to test it via a common
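On question 2, the existing documented PySpark answer is to pack a virtual environment and ship it with the job. A sketch, assuming `venv-pack` is installed and the application entry point is a hypothetical `app.py` (this follows the Spark "Python Package Management" guide):

```shell
# Build and pack a virtualenv with the job's dependencies
python -m venv pyspark_env
source pyspark_env/bin/activate
pip install pandas pyarrow venv-pack
venv-pack -o pyspark_env.tar.gz

# Ship the archive to executors; "#environment" is the unpack alias
export PYSPARK_DRIVER_PYTHON=python
export PYSPARK_PYTHON=./environment/bin/python
spark-submit --archives pyspark_env.tar.gz#environment app.py
```

Whether a stored-procedure feature would reuse this mechanism, or need its own, is exactly what the questions above are asking.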
> work a little harder. Doesn't mean either position is right-er
> even, we don't need to decide that.
>
> On Tue, Aug 4, 2020 at 9:33 AM Alexander Shorin wrote:
> >
> >
Just no changes? The name causes no issues and is pretty clear about its
intentions. The racist associations being linked are quite overblown.
--
,,^..^,,
On Tue, Aug 4, 2020 at 5:19 PM Tom Graves
wrote:
> Hey Folks,
>
> We have jira https://issues.apache.org/jira/browse/SPARK-32037 to rename
> the blacklisting feature.
What's the due date for the Apache Spark 3.0 release? Will it be tomorrow or
somewhere in the middle of 2019?
I think we shouldn't care much about Python 2.x today, since quite
soon its support turns into a pumpkin. For today's projects I hope nobody
takes into account support of 2.7 unless there is some