Signatures, digests, etc. check out fine - thanks for updating them!
Checked out the tag and built/tested with -Phive -Pyarn -Pmesos -Pkubernetes


The test ClientE2ETestSuite."simple udf" failed [1] in the "Connect Client"
module ... the "Spark Protobuf" module is yet to be tested due to the failure.


Regards,
Mridul

[1]

- simple udf *** FAILED ***
  io.grpc.StatusRuntimeException: INTERNAL: org.apache.spark.sql.ClientE2ETestSuite
  at io.grpc.Status.asRuntimeException(Status.java:535)
  at io.grpc.stub.ClientCalls$BlockingResponseStream.hasNext(ClientCalls.java:660)
  at org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:50)
  at org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:95)
  at org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:112)
  at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2037)
  at org.apache.spark.sql.Dataset.withResult(Dataset.scala:2267)
  at org.apache.spark.sql.Dataset.collect(Dataset.scala:2036)
  at org.apache.spark.sql.ClientE2ETestSuite.$anonfun$new$5(ClientE2ETestSuite.scala:65)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  ...





On Wed, Feb 22, 2023 at 2:07 AM Mridul Muralidharan <mri...@gmail.com>
wrote:

>
> Thanks Xinrong !
> The signature verifications are fine now ... will continue with testing
> the release.
>
>
> Regards,
> Mridul
>
>
> On Wed, Feb 22, 2023 at 1:27 AM Xinrong Meng <xinrong.apa...@gmail.com>
> wrote:
>
>> Hi Mridul,
>>
>> Would you please try that again? It should work now.
>>
>> On Wed, Feb 22, 2023 at 2:04 PM Mridul Muralidharan <mri...@gmail.com>
>> wrote:
>>
>>>
>>> Hi Xinrong,
>>>
>>>   Was it signed with the same key as present in KEYS [1] ?
>>> I am seeing errors with gpg when validating. For example:
>>>
>>>
>>> $ gpg --verify pyspark-3.4.0.tar.gz.asc
>>>
>>> gpg: assuming signed data in 'pyspark-3.4.0.tar.gz'
>>>
>>> gpg: Signature made Tue 21 Feb 2023 05:56:05 AM CST
>>>
>>> gpg:                using RSA key CC68B3D16FE33A766705160BA7E57908C7A4E1B1
>>>
>>> gpg:                issuer "xinr...@apache.org"
>>>
>>> gpg: Can't check signature: No public key
>>>
>>>
>>>
>>> Regards,
>>> Mridul
>>>
>>> [1] https://dist.apache.org/repos/dist/dev/spark/KEYS
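>>> (A minimal sketch of the usual fix, assuming the missing public key has
>>> since been added to KEYS; the commands require network access:)

```shell
# Import all release-signing keys from the Spark KEYS file,
# then re-run the signature check on the artifact from this thread.
curl -fsSL https://dist.apache.org/repos/dist/dev/spark/KEYS | gpg --import
gpg --verify pyspark-3.4.0.tar.gz.asc pyspark-3.4.0.tar.gz
```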
>>>
>>>
>>> On Tue, Feb 21, 2023 at 10:36 PM Xinrong Meng <xinrong.apa...@gmail.com>
>>> wrote:
>>>
>>>> Please vote on releasing the following candidate as Apache Spark
>>>> version 3.4.0.
>>>>
>>>> The vote is open until 11:59pm Pacific time *February 27th* and passes
>>>> if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>>
>>>> [ ] +1 Release this package as Apache Spark 3.4.0
>>>> [ ] -1 Do not release this package because ...
>>>>
>>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>>
>>>> The tag to be voted on is *v3.4.0-rc1* (commit
>>>> e2484f626bb338274665a49078b528365ea18c3b):
>>>> https://github.com/apache/spark/tree/v3.4.0-rc1
>>>>
>>>> The release files, including signatures, digests, etc. can be found at:
>>>> https://dist.apache.org/repos/dist/dev/spark/v3.4.0-rc1-bin/
>>>>
>>>> Signatures used for Spark RCs can be found in this file:
>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>
>>>> The staging repository for this release can be found at:
>>>> https://repository.apache.org/content/repositories/orgapachespark-1435
>>>>
>>>> The documentation corresponding to this release can be found at:
>>>> https://dist.apache.org/repos/dist/dev/spark/v3.4.0-rc1-docs/
>>>>
>>>> The list of bug fixes going into 3.4.0 can be found at the following
>>>> URL:
>>>> https://issues.apache.org/jira/projects/SPARK/versions/12351465
>>>>
>>>> This release is using the release script of the tag v3.4.0-rc1.
>>>>
>>>>
>>>> FAQ
>>>>
>>>> =========================
>>>> How can I help test this release?
>>>> =========================
>>>> If you are a Spark user, you can help us test this release by taking
>>>> an existing Spark workload and running on this release candidate, then
>>>> reporting any regressions.
>>>>
>>>> If you're working in PySpark, you can set up a virtual env, install
>>>> the current RC, and see if anything important breaks; in Java/Scala,
>>>> you can add the staging repository to your project's resolvers and test
>>>> with the RC (make sure to clean up the artifact cache before/after so
>>>> you don't end up building with an out-of-date RC going forward).
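>>>> For example, a minimal sbt sketch of adding the staging repository
>>>> (the resolver name is arbitrary; URL from this thread):

```scala
// Add the RC staging repository to an sbt build, then depend on the
// RC artifacts to compile/test an existing workload against them.
resolvers += "Apache Spark RC Staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1435"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.4.0" % Provided
```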
>>>>
>>>> ===========================================
>>>> What should happen to JIRA tickets still targeting 3.4.0?
>>>> ===========================================
>>>> The current list of open tickets targeted at 3.4.0 can be found at:
>>>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>>>> Version/s" = 3.4.0
>>>>
>>>> Committers should look at those and triage. Extremely important bug
>>>> fixes, documentation, and API tweaks that impact compatibility should
>>>> be worked on immediately. Everything else please retarget to an
>>>> appropriate release.
>>>>
>>>> ==================
>>>> But my bug isn't fixed?
>>>> ==================
>>>> In order to make timely releases, we will typically not hold the
>>>> release unless the bug in question is a regression from the previous
>>>> release. That being said, if there is something which is a regression
>>>> that has not been correctly targeted please ping me or a committer to
>>>> help target the issue.
>>>>
>>>> Thanks,
>>>> Xinrong Meng
>>>>
>>>
