+1

Thanks Dongjoon!

On Wed, Nov 29, 2023 at 7:53 PM Mridul Muralidharan <mri...@gmail.com> wrote:
>
> +1
>
> Signatures, digests, etc. check out fine.
> Checked out the tag and built/tested with -Phive -Pyarn -Pmesos -Pkubernetes.
>
> Regards,
> Mridul
>
> On Wed, Nov 29, 2023 at 5:08 AM Yang Jie <yangji...@apache.org> wrote:
>>
>> +1 (non-binding)
>>
>> Jie Yang
>>
>> On 2023/11/29 02:08:04 Kent Yao wrote:
>> > +1 (non-binding)
>> >
>> > Kent Yao
>> >
>> > On 2023/11/27 01:12:53 Dongjoon Hyun wrote:
>> > > Hi, Marc.
>> > >
>> > > Given that it exists in 3.4.0 and 3.4.1, I don't think it's a release
>> > > blocker for Apache Spark 3.4.2.
>> > >
>> > > When the patch is ready, we can consider it for 3.4.3.
>> > >
>> > > In addition, note that we categorize release-blocker-level issues by
>> > > marking them with 'Blocker' priority and a `Target Version` before the vote.
>> > >
>> > > Best,
>> > > Dongjoon.
>> > >
>> > >
>> > > On Sat, Nov 25, 2023 at 12:01 PM Marc Le Bihan <mlebiha...@gmail.com> 
>> > > wrote:
>> > >
>> > > > -1, if you can wait until the last remaining problem with generics (?) is
>> > > > entirely solved; it causes this exception to be thrown:
>> > > >
>> > > > java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to class [Ljava.lang.reflect.TypeVariable; ([Ljava.lang.Object; and [Ljava.lang.reflect.TypeVariable; are in module java.base of loader 'bootstrap')
>> > > >     at org.apache.spark.sql.catalyst.JavaTypeInference$.encoderFor(JavaTypeInference.scala:116)
>> > > >     at org.apache.spark.sql.catalyst.JavaTypeInference$.$anonfun$encoderFor$1(JavaTypeInference.scala:140)
>> > > >     at scala.collection.ArrayOps$.map$extension(ArrayOps.scala:929)
>> > > >     at org.apache.spark.sql.catalyst.JavaTypeInference$.encoderFor(JavaTypeInference.scala:138)
>> > > >     at org.apache.spark.sql.catalyst.JavaTypeInference$.encoderFor(JavaTypeInference.scala:60)
>> > > >     at org.apache.spark.sql.catalyst.JavaTypeInference$.encoderFor(JavaTypeInference.scala:53)
>> > > >     at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.javaBean(ExpressionEncoder.scala:62)
>> > > >     at org.apache.spark.sql.Encoders$.bean(Encoders.scala:179)
>> > > >     at org.apache.spark.sql.Encoders.bean(Encoders.scala)
>> > > >
>> > > >
>> > > > https://issues.apache.org/jira/browse/SPARK-45311
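>> > > >
>> > > > For context, here is a minimal sketch (class names are hypothetical, and it is
>> > > > only meant to illustrate the code path, not the exact reproducer attached to
>> > > > the ticket above) of the kind of generic bean hierarchy that Encoders.bean has
>> > > > to resolve; deriving the encoder goes through the JavaTypeInference.encoderFor
>> > > > frames shown in the stack trace:
>> > > >
>> > > > import java.io.Serializable;
>> > > > import org.apache.spark.sql.Encoder;
>> > > > import org.apache.spark.sql.Encoders;
>> > > >
>> > > > public class BeanEncoderExample {
>> > > >     // A generic base bean whose type parameter is only fixed by the subclass.
>> > > >     public static class ValueHolder<T extends Serializable> implements Serializable {
>> > > >         private T value;
>> > > >         public T getValue() { return value; }
>> > > >         public void setValue(T value) { this.value = value; }
>> > > >     }
>> > > >
>> > > >     public static class StringHolder extends ValueHolder<String> {}
>> > > >
>> > > >     public static void main(String[] args) {
>> > > >         // Encoders.bean derives the schema reflectively via
>> > > >         // JavaTypeInference.encoderFor, the frames reported in the trace above.
>> > > >         Encoder<StringHolder> encoder = Encoders.bean(StringHolder.class);
>> > > >         System.out.println(encoder.schema());
>> > > >     }
>> > > > }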
>> > > >
>> > > > Thanks!
>> > > >
>> > > > Marc Le Bihan
>> > > >
>> > > >
>> > > > On 25/11/2023 11:48, Dongjoon Hyun wrote:
>> > > >
>> > > > Please vote on releasing the following candidate as Apache Spark version 3.4.2.
>> > > >
>> > > > The vote is open until November 30th at 1 AM (PST) and passes if a majority
>> > > > of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>> > > >
>> > > > [ ] +1 Release this package as Apache Spark 3.4.2
>> > > > [ ] -1 Do not release this package because ...
>> > > >
>> > > > To learn more about Apache Spark, please see https://spark.apache.org/
>> > > >
>> > > > The tag to be voted on is v3.4.2-rc1 (commit
>> > > > 0c0e7d4087c64efca259b4fb656b8be643be5686)
>> > > > https://github.com/apache/spark/tree/v3.4.2-rc1
>> > > >
>> > > > The release files, including signatures, digests, etc. can be found at:
>> > > > https://dist.apache.org/repos/dist/dev/spark/v3.4.2-rc1-bin/
>> > > >
>> > > > Signatures used for Spark RCs can be found in this file:
>> > > > https://dist.apache.org/repos/dist/dev/spark/KEYS
>> > > >
>> > > > The staging repository for this release can be found at:
>> > > > https://repository.apache.org/content/repositories/orgapachespark-1450/
>> > > >
>> > > > The documentation corresponding to this release can be found at:
>> > > > https://dist.apache.org/repos/dist/dev/spark/v3.4.2-rc1-docs/
>> > > >
>> > > > The list of bug fixes going into 3.4.2 can be found at the following URL:
>> > > > https://issues.apache.org/jira/projects/SPARK/versions/12353368
>> > > >
>> > > > This release uses the release script from the v3.4.2-rc1 tag.
>> > > >
>> > > > FAQ
>> > > >
>> > > > =========================
>> > > > How can I help test this release?
>> > > > =========================
>> > > >
>> > > > If you are a Spark user, you can help us test this release by taking
>> > > > an existing Spark workload, running it on this release candidate, and
>> > > > reporting any regressions.
>> > > >
>> > > > If you're working in PySpark, you can set up a virtual env, install the
>> > > > current RC, and see if anything important breaks. In Java/Scala, you can
>> > > > add the staging repository to your project's resolvers and test with the
>> > > > RC (make sure to clean up the artifact cache before/after so you don't
>> > > > end up building with an out-of-date RC going forward).
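>> > > >
>> > > > For Java/Scala testing, a smoke test can be as small as the sketch below
>> > > > (class and app names are hypothetical); it assumes your build already
>> > > > resolves spark-sql 3.4.2 from the staging repository listed above:
>> > > >
>> > > > import org.apache.spark.sql.Dataset;
>> > > > import org.apache.spark.sql.Row;
>> > > > import org.apache.spark.sql.SparkSession;
>> > > >
>> > > > public class Spark342Rc1SmokeTest {
>> > > >     public static void main(String[] args) {
>> > > >         // Start a local session and run a trivial query against the RC artifacts.
>> > > >         SparkSession spark = SparkSession.builder()
>> > > >             .appName("spark-3.4.2-rc1-smoke-test")
>> > > >             .master("local[*]")
>> > > >             .getOrCreate();
>> > > >
>> > > >         Dataset<Row> df = spark.range(1, 11).toDF("id");
>> > > >         df.selectExpr("count(*)", "sum(id)").show();
>> > > >
>> > > >         spark.stop();
>> > > >     }
>> > > > }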
>> > > >
>> > > > ===========================================
>> > > > What should happen to JIRA tickets still targeting 3.4.2?
>> > > > ===========================================
>> > > >
>> > > > The current list of open tickets targeted at 3.4.2 can be found at
>> > > > https://issues.apache.org/jira/projects/SPARK by searching for "Target
>> > > > Version/s" = 3.4.2.
>> > > >
>> > > > Committers should look at those and triage. Extremely important bug
>> > > > fixes, documentation, and API tweaks that impact compatibility should
>> > > > be worked on immediately. Everything else should be retargeted to an
>> > > > appropriate release.
>> > > >
>> > > > ==================
>> > > > But my bug isn't fixed?
>> > > > ==================
>> > > >
>> > > > In order to make timely releases, we will typically not hold the
>> > > > release unless the bug in question is a regression from the previous
>> > > > release. That being said, if there is a regression that has not been
>> > > > correctly targeted, please ping me or a committer to help target the
>> > > > issue.
>> > > >
>> > > >
>> > > >
>> > >
>> >

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
