Re: [VOTE] Release Spark 3.3.4 (RC1)

2023-12-15 Thread Dongjoon Hyun
Thank you all. This vote passed.

Let me conclude.

Dongjoon

On 2023/12/11 23:58:28 Malcolm Decuire wrote:
> +1
> 
> On Mon, Dec 11, 2023 at 6:21 PM Yang Jie  wrote:
> 
> > +1
> >
> > On 2023/12/11 03:03:39 "L. C. Hsieh" wrote:
> > > +1
> > >
> > > On Sun, Dec 10, 2023 at 6:15 PM Kent Yao  wrote:
> > > >
> > > > > +1 (non-binding)
> > > >
> > > > Kent Yao
> > > >
> > > > > Yuming Wang  wrote on Mon, Dec 11, 2023 at 09:33:
> > > > >
> > > > > +1
> > > > >
> > > > >> On Mon, Dec 11, 2023 at 5:55 AM Dongjoon Hyun  wrote:
> > > > >>
> > > > >> +1
> > > > >>
> > > > >> Dongjoon
> > > > >>
> > > > >> On 2023/12/08 21:41:00 Dongjoon Hyun wrote:
> > > > >> > Please vote on releasing the following candidate as Apache Spark
> > > > >> > version 3.3.4.
> > > > >> >
> > > > >> > The vote is open until December 15th 1 AM (PST) and passes if a
> > > > >> > majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
> > > > >> >
> > > > >> > [ ] +1 Release this package as Apache Spark 3.3.4
> > > > >> > [ ] -1 Do not release this package because ...
> > > > >> >
> > > > >> > To learn more about Apache Spark, please see
> > > > >> > https://spark.apache.org/
> > > > >> >
> > > > >> > The tag to be voted on is v3.3.4-rc1 (commit
> > > > >> > 18db204995b32e87a650f2f09f9bcf047ddafa90)
> > > > >> > https://github.com/apache/spark/tree/v3.3.4-rc1
> > > > >> >
> > > > >> > The release files, including signatures, digests, etc., can be
> > > > >> > found at:
> > > > >> >
> > > > >> > https://dist.apache.org/repos/dist/dev/spark/v3.3.4-rc1-bin/
> > > > >> >
> > > > >> >
> > > > >> > Signatures used for Spark RCs can be found in this file:
> > > > >> >
> > > > >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
> > > > >> >
> > > > >> >
> > > > >> > The staging repository for this release can be found at:
> > > > >> >
> > > > >> > https://repository.apache.org/content/repositories/orgapachespark-1451/
> > > > >> >
> > > > >> >
> > > > >> > The documentation corresponding to this release can be found at:
> > > > >> >
> > > > >> > https://dist.apache.org/repos/dist/dev/spark/v3.3.4-rc1-docs/
> > > > >> >
> > > > >> >
> > > > >> > The list of bug fixes going into 3.3.4 can be found at the
> > > > >> > following URL:
> > > > >> >
> > > > >> > https://issues.apache.org/jira/projects/SPARK/versions/12353505
> > > > >> >
> > > > >> >
> > > > >> > This release uses the release script from the tag v3.3.4-rc1.
> > > > >> >
> > > > >> >
> > > > >> > FAQ
> > > > >> >
> > > > >> >
> > > > >> > =
> > > > >> >
> > > > >> > How can I help test this release?
> > > > >> >
> > > > >> > =
> > > > >> >
> > > > >> >
> > > > >> >
> > > > >> > If you are a Spark user, you can help us test this release by
> > > > >> > taking an existing Spark workload and running it on this release
> > > > >> > candidate, then reporting any regressions.
> > > > >> >
> > > > >> >
> > > > >> >
> > > > >> > If you're working in PySpark, you can set up a virtual env and
> > > > >> > install the current RC to see if anything important breaks. In
> > > > >> > Java/Scala, you can add the staging repository to your project's
> > > > >> > resolvers and test with the RC (make sure to clean up the
> > > > >> > artifact cache before/after so you don't end up building with an
> > > > >> > out-of-date RC going forward).
> > > > >> >
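The artifact-cache cleanup advised above is easier to do safely if you first list what is actually cached. A small sketch (the `~/.ivy2/cache` default is an assumption for sbt/ivy users; Maven caches under `~/.m2/repository` with a different layout):

```python
from pathlib import Path

def find_cached_spark_artifacts(cache_root=None):
    """List cached org.apache.spark jars so stale RC artifacts can be
    removed by hand before/after testing against the staging repository.

    cache_root defaults to ivy's ~/.ivy2/cache (an assumption for sbt
    users); pass a different cache directory to inspect it instead.
    """
    root = Path(cache_root) if cache_root else Path.home() / ".ivy2" / "cache"
    spark_dir = root / "org.apache.spark"
    if not spark_dir.exists():
        return []
    # Recursively collect every cached Spark jar under the cache root.
    return sorted(str(p) for p in spark_dir.rglob("*.jar"))
```

Reviewing the returned list before deleting anything avoids wiping unrelated cached dependencies.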
> > > > >> >
> > > > >> >
> > > > >> > ===
> > > > >> >
> > > > >> > What should happen to JIRA tickets still targeting 3.3.4?
> > > > >> >
> > > > >> > ===
> > > > >> >
> > > > >> >
> > > > >> >
> > > > >> > The current list of open tickets targeted at 3.3.4 can be found
> > > > >> > at https://issues.apache.org/jira/projects/SPARK by searching
> > > > >> > for "Target Version/s" = 3.3.4.
> > > > >> >
> > > > >> >
> > > > >> > Committers should look at those and triage. Extremely important
> > > > >> > bug fixes, documentation, and API tweaks that impact
> > > > >> > compatibility should be worked on immediately. Everything else
> > > > >> > should be retargeted to an appropriate release.
> > > > >> >
> > > > >> >
> > > > >> >
> > > > >> > ==
> > > > >> >
> > > > >> > But my bug isn't fixed?
> > > > >> >
> > > > >> > ==
> > > > >> >
> > > > >> >
> > > > >> >
> > > > >> > In order to make timely releases, we will typically not hold the
> > > > >> > release unless the bug in question is a regression from the
> > > > >> > previous release. That said, if there is a regression that has
> > > > >> > not been correctly targeted, please ping me or a committer to
> > > > >> > help target the issue.
> > > > >> >
> > > > >>
> > > > >>
> > 
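The "signatures, digests, etc." check from the vote email can be scripted. A minimal SHA-512 sidecar check, assuming the common `HEX  FILENAME` sidecar layout published next to Apache release artifacts (file names here are illustrative):

```python
import hashlib

def sha512_hex(path, chunk_size=1 << 20):
    """Stream a file through SHA-512 in chunks and return the hex digest."""
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def matches_sidecar(artifact_path, sha512_file_path):
    """Compare an artifact against its .sha512 file.

    Assumes the sidecar's first whitespace-separated token is the hex
    digest (the 'HEX  FILENAME' layout); adjust if the file differs.
    """
    with open(sha512_file_path) as f:
        expected = f.read().split()[0].lower()
    return sha512_hex(artifact_path) == expected
```

GPG signature verification (`.asc` files against the KEYS file) still needs `gpg` itself; this only covers the digest half of the check.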

Re: [VOTE] Release Spark 3.3.4 (RC1)

2023-12-11 Thread Malcolm Decuire
+1

On Mon, Dec 11, 2023 at 6:21 PM Yang Jie  wrote:

> +1

Re: [VOTE] Release Spark 3.3.4 (RC1)

2023-12-11 Thread Yang Jie
+1

On 2023/12/11 03:03:39 "L. C. Hsieh" wrote:
> +1

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Spark 3.3.4 (RC1)

2023-12-11 Thread Dongjoon Hyun
Hi, Mridul.

> I am currently on Python 3.11.6, java 8.

For the above, I added `Python 3.11 support` in Apache Spark 3.4.0. That's
exactly one of the reasons why I wanted to do the EOL release of Apache
Spark 3.3.4.

https://issues.apache.org/jira/browse/SPARK-41454 (Support Python 3.11)

Thanks,
Dongjoon.
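For context on the failures: the IndexError in Mridul's traceback comes from the bundled cloudpickle doing `names[oparg]` on a `LOAD_GLOBAL` argument, while CPython 3.11 packs a flag bit into that oparg (the real name index is `oparg >> 1`), so direct indexing can run past the end of `co_names`. A minimal, illustrative sketch (not Spark's code) of a 3.11-safe way to collect a function's global names via `dis`:

```python
import dis

def extract_code_globals(func):
    """Return the names a function loads via LOAD_GLOBAL.

    Older cloudpickle indexed co_names directly with the raw oparg,
    which breaks on CPython 3.11 where the oparg carries a flag bit.
    dis decodes the argument for us via `argval`, so this works on
    both old and new bytecode layouts.
    """
    return {
        ins.argval
        for ins in dis.get_instructions(func.__code__)
        if ins.opname == "LOAD_GLOBAL"
    }

def sample():
    # `len` and `range` are global (builtin) loads in this bytecode
    return len(range(3))
```

This is why the same suites pass on the Python versions Spark 3.3 was released against but fail on 3.11.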




On Mon, Dec 11, 2023 at 12:22 PM Mridul Muralidharan 
wrote:

>
> I am seeing a bunch of python related (43) failures in the sql module (for
> example [1]) ... I am currently on Python 3.11.6, java 8.
> Not sure if ubuntu modified anything from under me, thoughts ?
>
> I am currently testing this against an older branch to make sure it is not
> an issue with my desktop.
>
> Regards,
> Mridul
>

Re: [VOTE] Release Spark 3.3.4 (RC1)

2023-12-11 Thread Mridul Muralidharan
I am seeing a bunch of Python-related failures (43) in the sql module (for
example [1]) ... I am currently on Python 3.11.6, Java 8.
Not sure if Ubuntu modified anything from under me; thoughts?

I am currently testing this against an older branch to make sure it is not
an issue with my desktop.

Regards,
Mridul


[1]


org.apache.spark.sql.IntegratedUDFTestUtils.shouldTestGroupedAggPandasUDFs
was false (QueryCompilationErrorsSuite.scala:112)
Traceback (most recent call last):
  File "/home/mridul/work/apache/vote/spark/python/pyspark/serializers.py",
line 458, in dumps
return cloudpickle.dumps(obj, pickle_protocol)
   ^^^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 73, in dumps
cp.dump(obj)
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 602, in dump
return Pickler.dump(self, obj)
   ^^^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 692, in reducer_override
return self._function_reduce(obj)
   ^^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 565, in _function_reduce
return self._dynamic_function_reduce(obj)
   ^^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 546, in _dynamic_function_reduce
state = _function_getstate(func)

  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 157, in _function_getstate
f_globals_ref = _extract_code_globals(func.__code__)

  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle.py",
line 334, in _extract_code_globals
out_names = {names[oparg]: None for _, oparg in _walk_global_ops(co)}
^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle.py",
line 334, in 
out_names = {names[oparg]: None for _, oparg in _walk_global_ops(co)}
 ~^^^
IndexError: tuple index out of range
Traceback (most recent call last):
  File "/home/mridul/work/apache/vote/spark/python/pyspark/serializers.py",
line 458, in dumps
return cloudpickle.dumps(obj, pickle_protocol)
   ^^^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 73, in dumps
cp.dump(obj)
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 602, in dump
return Pickler.dump(self, obj)
   ^^^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 692, in reducer_override
return self._function_reduce(obj)
   ^^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 565, in _function_reduce
return self._dynamic_function_reduce(obj)
   ^^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 546, in _dynamic_function_reduce
state = _function_getstate(func)

  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle_fast.py",
line 157, in _function_getstate
f_globals_ref = _extract_code_globals(func.__code__)

  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle.py",
line 334, in _extract_code_globals
out_names = {names[oparg]: None for _, oparg in _walk_global_ops(co)}
^
  File
"/home/mridul/work/apache/vote/spark/python/pyspark/cloudpickle/cloudpickle.py",
line 334, in 
out_names = {names[oparg]: None for _, oparg in _walk_global_ops(co)}
 ~^^^
IndexError: tuple index out of range

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "", line 1, in 
  File "/home/mridul/work/apache/vote/spark/python/pyspark/serializers.py",
line 468, in dumps
raise pickle.PicklingError(msg)
_pickle.PicklingError: Could not serialize object: IndexError: tuple index
out of range
- UNSUPPORTED_FEATURE: Using Python UDF with unsupported join condition ***
FAILED ***



On Sun, Dec 10, 2023 at 9:05 PM L. C. Hsieh  wrote:

> +1

Re: [VOTE] Release Spark 3.3.4 (RC1)

2023-12-10 Thread L. C. Hsieh
+1

On Sun, Dec 10, 2023 at 6:15 PM Kent Yao  wrote:
>
> +1 (non-binding)
>
> Kent Yao

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Spark 3.3.4 (RC1)

2023-12-10 Thread Kent Yao
+1 (non-binding)

Kent Yao

Yuming Wang  wrote on Mon, Dec 11, 2023 at 09:33:
>
> +1

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Spark 3.3.4 (RC1)

2023-12-10 Thread Yuming Wang
+1

On Mon, Dec 11, 2023 at 5:55 AM Dongjoon Hyun  wrote:

> +1
>
> Dongjoon
>


Re: [VOTE] Release Spark 3.3.4 (RC1)

2023-12-10 Thread Dongjoon Hyun
+1

Dongjoon

On 2023/12/08 21:41:00 Dongjoon Hyun wrote:
> [...]

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



[VOTE] Release Spark 3.3.4 (RC1)

2023-12-08 Thread Dongjoon Hyun
Please vote on releasing the following candidate as Apache Spark version
3.3.4.

The vote is open until December 15th 1AM (PST) and passes if a majority +1
PMC votes are cast, with a minimum of 3 +1 votes.
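The pass rule above can be sketched as a small check (a hypothetical
helper for illustration only, not part of any Apache tooling; it just
mirrors the "majority +1, minimum of 3 +1 votes" wording):

```python
def vote_passes(plus_ones: int, minus_ones: int, min_plus_ones: int = 3) -> bool:
    """True if the vote passes under the rule stated above:
    a majority of +1 votes, with at least min_plus_ones +1 votes."""
    return plus_ones >= min_plus_ones and plus_ones > minus_ones

print(vote_passes(5, 1))  # enough +1 votes and a clear majority -> True
print(vote_passes(2, 0))  # fewer than 3 +1 votes -> False
```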

[ ] +1 Release this package as Apache Spark 3.3.4
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v3.3.4-rc1 (commit
18db204995b32e87a650f2f09f9bcf047ddafa90)
https://github.com/apache/spark/tree/v3.3.4-rc1

The release files, including signatures, digests, etc. can be found at:

https://dist.apache.org/repos/dist/dev/spark/v3.3.4-rc1-bin/


Signatures used for Spark RCs can be found in this file:

https://dist.apache.org/repos/dist/dev/spark/KEYS
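Alongside checking the gpg signature against that KEYS file, the
sha512 digests published next to the release files can be verified
with a short script. This is a sketch, not official release tooling;
the artifact names in the comment are assumed examples from the usual
dist layout:

```python
import hashlib

def sha512_hex(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the hex SHA-512 digest of a file, streamed in chunks
    so large release tarballs don't need to fit in memory."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Example against a locally downloaded artifact (assumed names):
#   expected = open("spark-3.3.4-bin-hadoop3.tgz.sha512").read().split()[0]
#   assert sha512_hex("spark-3.3.4-bin-hadoop3.tgz") == expected.lower()
```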


The staging repository for this release can be found at:

https://repository.apache.org/content/repositories/orgapachespark-1451/


The documentation corresponding to this release can be found at:

https://dist.apache.org/repos/dist/dev/spark/v3.3.4-rc1-docs/


The list of bug fixes going into 3.3.4 can be found at the following URL:

https://issues.apache.org/jira/projects/SPARK/versions/12353505


This release uses the release script from the tag v3.3.4-rc1.


FAQ


=

How can I help test this release?

=



If you are a Spark user, you can help us test this release by taking
an existing Spark workload, running it on this release candidate, and
reporting any regressions.



If you're working in PySpark, you can set up a virtual env and install
the current RC to see if anything important breaks. In Java/Scala, you
can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).
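For the PySpark path, the venv-and-install step above can be scripted;
the sketch below is a hypothetical helper (the pyspark tarball name is
assumed from the usual dist layout, and actually running `install_rc`
needs network access):

```python
import subprocess
import venv

# Assumed artifact URL, built from the RC binary directory above.
RC_PYSPARK = ("https://dist.apache.org/repos/dist/dev/spark/"
              "v3.3.4-rc1-bin/pyspark-3.3.4.tar.gz")

def install_rc(env_dir: str = "spark-334-rc1") -> None:
    """Create a fresh virtual env and install the RC's pyspark build,
    then print the installed version as a smoke test."""
    venv.create(env_dir, with_pip=True)
    subprocess.run([f"{env_dir}/bin/pip", "install", RC_PYSPARK], check=True)
    subprocess.run([f"{env_dir}/bin/python", "-c",
                    "import pyspark; print(pyspark.__version__)"], check=True)
```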



===

What should happen to JIRA tickets still targeting 3.3.4?

===



The current list of open tickets targeted at 3.3.4 can be found at
https://issues.apache.org/jira/projects/SPARK by searching for "Target
Version/s" = 3.3.4.


Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else should be retargeted to an
appropriate release.



==

But my bug isn't fixed?

==



In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That said, if there is a regression that has not been
correctly targeted, please ping me or a committer to help target the
issue.