Reverse proxy for Spark UI on Kubernetes

2022-05-16 Thread bo yang
Hi Spark folks, I built a web reverse proxy to access the Spark UI on Kubernetes (working together with https://github.com/GoogleCloudPlatform/spark-on-k8s-operator). I want to share it here in case other people have a similar need. The reverse proxy code is here:
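For anyone curious what such a proxy roughly looks like, here is a minimal sketch (not bo yang's implementation) of forwarding browser requests to a Spark driver UI service inside the cluster. The service name my-spark-app-ui-svc and port 4040 are assumptions about how the driver UI is exposed; error handling is omitted.

    # Minimal reverse-proxy sketch for a Spark driver UI (illustrative only).
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen, Request

    # Hypothetical in-cluster service name for the driver UI.
    SPARK_UI_BASE = "http://my-spark-app-ui-svc:4040"

    class SparkUIProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            # Forward the incoming path to the Spark UI and relay the response.
            upstream = urlopen(Request(SPARK_UI_BASE + self.path))
            self.send_response(upstream.status)
            for name, value in upstream.getheaders():
                if name.lower() not in ("transfer-encoding", "connection"):
                    self.send_header(name, value)
            self.end_headers()
            self.wfile.write(upstream.read())

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), SparkUIProxy).serve_forever()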

Re: Introducing "Pandas API on Spark" component in JIRA, and use "PS" PR title component

2022-05-16 Thread Yikun Jiang
It's a pretty good idea, +1. To be clear, in GitHub: - For each PR title: [SPARK-XXX][PYTHON][PS] The pandas-on-Spark PR title ([PYTHON] is still kept, [PS] is newly added) - For PR labels: newly added: `PANDAS API ON Spark`; still kept: `PYTHON`, `CORE` (still keep `PYTHON`, `CORE` and `PANDAS API

Re: Introducing "Pandas API on Spark" component in JIRA, and use "PS" PR title component

2022-05-16 Thread Hyukjin Kwon
Thanks Ruifeng. I added the "Pandas API on Spark" component in JIRA (and archived the "jenkins" component since we don't have the legacy Jenkins anymore). Let me know if you guys have other opinions. On Tue, 17 May 2022 at 12:59, Ruifeng Zheng wrote: > +1, I think it is a good idea

Re: Introducing "Pandas API on Spark" component in JIRA, and use "PS" PR title component

2022-05-16 Thread Ruifeng Zheng
+1, I think it is a good idea -- Original Message -- From: "Hyukjin Kwon"

Introducing "Pandas API on Spark" component in JIRA, and use "PS" PR title component

2022-05-16 Thread Hyukjin Kwon
Hi all, how about we introduce a "Pandas API on Spark" component in JIRA, and use "PS" (pandas-on-Spark) in PR titles? We already use "ps" in many places, e.g. when we import pyspark.pandas as ps. This is similar to "Structured Streaming" in JIRA and "SS" in PR titles. I think it'd be easier to
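As an illustration of the "ps" alias the proposal refers to (a small self-contained snippet, not taken from the thread; it assumes a local PySpark installation):

    import pyspark.pandas as ps

    # A pandas-like DataFrame backed by Spark; the "ps" alias mirrors
    # the common "import pandas as pd" convention.
    psdf = ps.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
    print(psdf.sum())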

Re: [VOTE] Release Spark 3.3.0 (RC2)

2022-05-16 Thread Sean Owen
I'm still seeing failures related to the function registry, like: ExpressionsSchemaSuite: - Check schemas for expression examples *** FAILED *** 396 did not equal 398 Expected 396 blocks in result file but got 398. Try regenerating the result files. (ExpressionsSchemaSuite.scala:161) -

[VOTE] Release Spark 3.3.0 (RC2)

2022-05-16 Thread Maxim Gekk
Please vote on releasing the following candidate as Apache Spark version 3.3.0. The vote is open until 11:59pm Pacific time May 19th and passes if a majority +1 PMC votes are cast, with a minimum of 3 +1 votes. [ ] +1 Release this package as Apache Spark 3.3.0 [ ] -1 Do not release this package