I noticed that `TimestampNTZ` is intentionally hidden from the doc.
https://github.com/apache/spark/pull/35313#issuecomment-1185194701

So, it's better to remove the notes about TimestampNTZ from the doc.
But I don't think this issue is a blocker, so +1 on this RC.

Kousuke

Hi Bruce,

FYI we had further discussions on
https://github.com/apache/spark/pull/35313#issuecomment-1185195455.
Thanks for pointing that out, but this documentation issue should not be
a blocker for the release.

+1 on the RC.

Gengliang

On Thu, Jul 14, 2022 at 10:22 PM sarutak <saru...@oss.nttdata.com> wrote:

Hi Dongjoon and Bruce,

SPARK-36724 is about SessionWindow, while SPARK-38017 and PR #35313 are
about TimeWindow, and TimeWindow already supports TimestampNTZ in v3.2.1:

https://github.com/apache/spark/blob/4f25b3f71238a00508a356591553f2dfa89f8290/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/TimeWindow.scala#L99

So, I think that change is still valid.

Kousuke

Thank you so much, Bruce.

After SPARK-36724 landed in Spark 3.3.0, SPARK-38017 seems to have landed
on branch-3.2 mistakenly, here:

https://github.com/apache/spark/pull/35313

I believe I can remove those four places after uploading the docs to
our website.

Dongjoon.

On Thu, Jul 14, 2022 at 2:16 PM Bruce Robbins <bersprock...@gmail.com> wrote:

A small thing. The function API doc (here [1]) claims that the window
function accepts a timeColumn of TimestampType or TimestampNTZType. That
update to the API doc was made after v3.2.1.

As far as I can tell, 3.2.2 doesn't support TimestampNTZType.
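
For reference, a rough sketch of the call shape the doc describes (assuming
a local SparkSession and a toy DataFrame; the TimestampNTZ variant is the
part the doc claims but 3.2.2 does not appear to support):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, window}

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // A TimestampType timeColumn, which works on the 3.2 line.
    val df = Seq(("2022-07-01 10:01:00", 1), ("2022-07-01 10:07:00", 2))
      .toDF("ts", "v")
      .withColumn("ts", col("ts").cast("timestamp"))

    // The 3.2.2-rc1 API doc also lists TimestampNTZType for timeColumn;
    // passing a TimestampNTZ column here is the part that 3.2.2 does not
    // appear to support.
    df.groupBy(window(col("ts"), "10 minutes")).count().show(false)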

On Mon, Jul 11, 2022 at 2:58 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:

Please vote on releasing the following candidate as Apache Spark
version 3.2.2.

The vote is open until July 15th 1AM (PST) and passes if a majority of
+1 PMC votes are cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 3.2.2
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see
https://spark.apache.org/

The tag to be voted on is v3.2.2-rc1 (commit
78a5825fe266c0884d2dd18cbca9625fa258d7f7):
https://github.com/apache/spark/tree/v3.2.2-rc1

The release files, including signatures, digests, etc. can be
found at:
https://dist.apache.org/repos/dist/dev/spark/v3.2.2-rc1-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1409/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v3.2.2-rc1-docs/

The list of bug fixes going into 3.2.2 can be found at the
following URL:
https://issues.apache.org/jira/projects/SPARK/versions/12351232

This release is using the release script of the tag v3.2.2-rc1.

FAQ

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking
an existing Spark workload and running it on this release candidate,
then reporting any regressions.

If you're working in PySpark you can set up a virtual env and install
the current RC and see if anything important breaks. In the Java/Scala
case, you can add the staging repository to your project's resolvers and
test with the RC (make sure to clean up the artifact cache before/after
so you don't end up building with an out-of-date RC going forward).
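
For the Java/Scala path, a minimal build.sbt sketch along these lines
should do (the Scala version and artifact coordinates below are the usual
ones for the 3.2 line and are assumptions, not taken from the RC itself):

    // build.sbt -- rough sketch for resolving the RC from the staging repository.
    ThisBuild / scalaVersion := "2.12.15" // assumed default Scala version for 3.2.x

    resolvers += "Apache Spark 3.2.2 RC1 staging" at
      "https://repository.apache.org/content/repositories/orgapachespark-1409/"

    libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.2"

Running an existing test suite or a small job against those artifacts, and
clearing the local Ivy/Coursier cache afterwards, covers the clean-up note
above.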

===========================================
What should happen to JIRA tickets still targeting 3.2.2?
===========================================

The current list of open tickets targeted at 3.2.2 can be found at:
https://issues.apache.org/jira/projects/SPARK and search for
"Target Version/s" = 3.2.2

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else, please retarget to an
appropriate release.

==================
But my bug isn't fixed?
==================

In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That being said, if there is something which is a regression
that has not been correctly targeted, please ping me or a committer
to help target the issue.

Dongjoon


Links:
------
[1] https://dist.apache.org/repos/dist/dev/spark/v3.2.2-rc1-docs/_site/api/scala/org/apache/spark/sql/functions$.html


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
