[VOTE][RESULT] Add new `Versions` in Apache Spark JIRA for Versioning of Spark Operator
Hi all,

The vote passes with 7 +1s (5 binding +1s). (* = binding)

+1:
Dongjoon Hyun (*)
Liang-Chi Hsieh (*)
Huaxin Gao (*)
Bo Yang
Xiao Li (*)
Chao Sun (*)
Hussein Awala

+0: None
-1: None

Thanks.

- To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
Re: [VOTE] Add new `Versions` in Apache Spark JIRA for Versioning of Spark Operator
+1 (non-binding) to using an independent version for the Spark Kubernetes Operator with a compatibility matrix with Spark versions.

On Fri, Apr 12, 2024 at 5:31 AM L. C. Hsieh wrote:
Re: [VOTE] Add new `Versions` in Apache Spark JIRA for Versioning of Spark Operator
+1

On Fri, Apr 12, 2024 at 4:23 PM Xiao Li wrote:
Re: [VOTE] Add new `Versions` in Apache Spark JIRA for Versioning of Spark Operator
+1

On Fri, Apr 12, 2024 at 14:30 bo yang wrote:
Re: [VOTE] Add new `Versions` in Apache Spark JIRA for Versioning of Spark Operator
+1

On Fri, Apr 12, 2024 at 12:34 PM huaxin gao wrote:
Re: [VOTE] Add new `Versions` in Apache Spark JIRA for Versioning of Spark Operator
+1

Thank you, Dongjoon. Yea, we may need to customize the merge script for a particular repository.

On Fri, Apr 12, 2024 at 9:07 AM Dongjoon Hyun wrote:
Re: [VOTE] Add new `Versions` in Apache Spark JIRA for Versioning of Spark Operator
+1

On Fri, Apr 12, 2024 at 9:07 AM Dongjoon Hyun wrote:
Re: [VOTE] Add new `Versions` in Apache Spark JIRA for Versioning of Spark Operator
+1

Thank you!

I hope we can customize `dev/merge_spark_pr.py` script per repository after this PR.

Dongjoon.

On 2024/04/12 03:28:36 "L. C. Hsieh" wrote:
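As a hedged illustration of the per-repository customization mentioned above (the helper and the filtering rule are assumptions for this sketch, not the actual behavior of `dev/merge_spark_pr.py`), a merge script could filter JIRA versions by repository so that operator PRs only ever see `kubernetes-operator-*` fix versions:

```python
def candidate_fix_versions(repo: str, all_versions: list[str]) -> list[str]:
    """Hypothetical rule: the operator repo only sees `kubernetes-operator-*`
    versions, while the main Spark repo only sees plain Spark versions."""
    prefix = "kubernetes-operator-"
    if repo == "spark-kubernetes-operator":
        return [v for v in all_versions if v.startswith(prefix)]
    return [v for v in all_versions if not v.startswith(prefix)]

versions = ["4.0.0", "3.5.2", "kubernetes-operator-0.1.0"]
print(candidate_fix_versions("spark-kubernetes-operator", versions))
# ['kubernetes-operator-0.1.0']
print(candidate_fix_versions("spark", versions))
# ['4.0.0', '3.5.2']
```

The repository name `spark-kubernetes-operator` is illustrative; the real script would take whatever identifier the per-repo configuration uses.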
[VOTE] Add new `Versions` in Apache Spark JIRA for Versioning of Spark Operator
Hi all,

Thanks for all the discussion in the "Versioning of Spark Operator" thread: https://lists.apache.org/thread/zhc7nb2sxm8jjxdppq8qjcmlf4rcsthh

I would like to open this vote to reach consensus on the versioning of the Spark Kubernetes Operator.

The proposal is to use independent versioning for the Spark Kubernetes Operator.

Please vote on adding new `Versions` in the Apache Spark JIRA, which can be used in places like "Fix Version/s" in the operator's JIRA tickets.

The new `Versions` will use the `kubernetes-operator-` prefix, for example `kubernetes-operator-0.1.0`.

The vote is open until April 15th 1 AM (PST) and passes if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.

[ ] +1 Add the new `Versions` for the Spark Kubernetes Operator in the Apache Spark JIRA
[ ] -1 Do not add the new `Versions` because ...

Thank you.

Note that this is neither a SPIP vote nor a release vote, and I did not find similar votes in previous threads. It is modeled on a SPIP or release vote, so I think it should be okay. Please correct me if this vote format does not work for you.
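The proposed prefix scheme can be checked mechanically. A minimal sketch (the helper name is hypothetical, not part of any Spark tooling) that parses JIRA `Fix Version/s` strings such as `kubernetes-operator-0.1.0`:

```python
import re

# Matches the proposed JIRA version scheme, e.g. "kubernetes-operator-0.1.0".
OPERATOR_VERSION = re.compile(r"^kubernetes-operator-(\d+)\.(\d+)\.(\d+)$")

def parse_operator_version(fix_version: str):
    """Return (major, minor, patch) for an operator version string,
    or None if the string does not follow the `kubernetes-operator-` scheme."""
    m = OPERATOR_VERSION.match(fix_version)
    return tuple(int(g) for g in m.groups()) if m else None

print(parse_operator_version("kubernetes-operator-0.1.0"))  # (0, 1, 0)
print(parse_operator_version("4.0.0"))  # None: a plain Spark version
```

Versions parsed this way sort numerically rather than lexically, which matters once the operator reaches double-digit minor releases.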
Re: [External] Re: Versioning of Spark Operator
A related question: what is the expected release cadence, at least for the next 12-18 months? Since this is a new subproject, I am personally hoping it would have a faster cadence at first, maybe once a month or once every couple of months... If so, that would affect versioning. Also, if it uses semantic versioning, since the subproject is early it might have a few releases with breaking changes until its own API, defaults, and behavior become stable, so again, having its own versioning might help.

Just my two cents,
Ofir

From: L. C. Hsieh
Sent: Wednesday, April 10, 2024 6:14 PM
To: Dongjoon Hyun
Cc: dev@spark.apache.org
Subject: [External] Re: Versioning of Spark Operator
Re: Versioning of Spark Operator
This approach makes sense to me. If the Spark K8s operator were aligned with Spark versions, it would use 4.0.0 now. Because these JIRA tickets are not actually targeting Spark 4.0.0, that would cause confusion and more questions, such as whether we should include Spark operator JIRAs in the release notes when we cut a Spark release. So I think an independent version number for the Spark K8s operator would be the better option.

If there are no more options or comments, I will create a vote later to add new "Versions" in the Apache Spark JIRA.

Thank you all.

On Wed, Apr 10, 2024 at 12:20 AM Dongjoon Hyun wrote:
Re: Versioning of Spark Operator
Cool, looks like we have two options here.

Option 1: Spark Operator and Connect Go Client version independently of Spark, e.g. starting with 0.1.0.
- Pros: they can evolve their versions independently.
- Cons: people will need an extra step to decide which version to use with Spark Operator and Connect Go Client.

Option 2: Spark Operator and Connect Go Client versioning loosely tied to Spark, e.g. starting with the supported Spark version.
- Pros: might make it easy for beginning users to choose a version when using Spark Operator and Connect Go Client.
- Cons: there is uncertainty about how compatibility with Spark will evolve for Spark Operator and Connect Go Client, which may impact this version naming.

Right now, Connect Go Client uses Option 2, but it can change to Option 1 if needed.

On Wed, Apr 10, 2024 at 6:19 AM Dongjoon Hyun wrote:
Re: Versioning of Spark Operator
Ya, that would work.

Inevitably, I looked at the Apache Flink K8s Operator's JIRA and GitHub repo. It looks reasonable to me. Although they share the same JIRA, they choose different patterns per place.

1. In the POM file and Maven artifact, an independent version number: 1.8.0

2. Tags are also based on the independent version number: https://github.com/apache/flink-kubernetes-operator/tags
- release-1.8.0
- release-1.7.0

3. JIRA Fix Version uses the `kubernetes-operator-` prefix: https://issues.apache.org/jira/browse/FLINK-34957
> Fix Version/s: kubernetes-operator-1.9.0

Maybe we can borrow this pattern.

I guess we need a vote for any further decision because we need to create new `Versions` in Apache Spark JIRA.

Dongjoon.
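In the Flink pattern above, all three identifiers derive from a single base version. A hypothetical sketch of that mapping (the helper name and dict keys are illustrative, not part of any release tooling):

```python
def flink_style_identifiers(base: str) -> dict:
    """Derive the three per-place identifiers used in the Flink operator
    pattern from one base version string like '1.8.0'."""
    return {
        "maven_version": base,                              # POM / Maven artifact
        "git_tag": f"release-{base}",                       # e.g. release-1.8.0
        "jira_fix_version": f"kubernetes-operator-{base}",  # JIRA Fix Version/s
    }

print(flink_style_identifiers("1.8.0"))
# {'maven_version': '1.8.0', 'git_tag': 'release-1.8.0',
#  'jira_fix_version': 'kubernetes-operator-1.8.0'}
```

Deriving everything from one base string keeps the three places from drifting apart across releases.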
Re: Versioning of Spark Operator
Yea, I guess, for example, the first release of the Spark K8s Operator would be something like 0.1.0 instead of 4.0.0. It sounds hard to align with Spark versions because of that?

On Tue, Apr 9, 2024 at 10:15 AM Dongjoon Hyun wrote:
Re: Versioning of Spark Operator
For Spark Operator, I think the answer is yes. My impression is that the Spark Operator should be Spark version-agnostic. Zhou, please correct me if I'm wrong.

I am not sure about the Spark Connect Go client, but if it is going to talk to a Spark cluster, I guess it is still tied to the Spark version (there are compatibility issues).

On 2024/04/09 21:35:45 bo yang wrote:
Re: Versioning of Spark Operator
Do we have a compatibility matrix for the Spark Connect Go client already, Bo?

Specifically, I'm wondering which Spark versions the existing Apache Spark Connect Go repository is able to support as of now. We know that it is supposed to be compatible always, but do we have a way to actually verify that via CI inside the Go repository?

Dongjoon.

On 2024/04/09 21:35:45 bo yang wrote:
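One way to make the compatibility question above verifiable in CI is to pin the matrix in data and drive one integration-test job per entry. A minimal sketch, where the matrix contents and function name are purely hypothetical:

```python
# Hypothetical compatibility matrix: client release -> Spark versions it is
# tested against. Real entries would live alongside the repository's CI config.
COMPATIBILITY = {
    "0.1.0": ["3.5", "4.0"],
}

def supported_spark_versions(client_version: str) -> list[str]:
    """Look up which Spark versions a given client release claims to support."""
    return COMPATIBILITY.get(client_version, [])

# CI could iterate the matrix, launching one integration-test job per pair.
for client, sparks in COMPATIBILITY.items():
    for spark in sparks:
        print(f"test client {client} against Spark {spark}")
```

An untested pair is then simply absent from the data, so "supposed to be compatible" claims never outrun what CI has actually exercised.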
Re: Versioning of Spark Operator
Thanks Liang-Chi for the Spark Operator work, and also the discussion here!

For the Spark Operator and the Connect Go client, I am guessing they need to support multiple versions of Spark? E.g. the same Spark Operator may support running multiple versions of Spark, and the Connect Go client might support multiple versions of the Spark driver as well.

What do people think of using the minimum supported Spark version as the version name for the Spark Operator and the Connect Go client? For example, Spark Operator 3.5.x supports Spark 3.5 and above.

Best,
Bo
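The "minimum supported Spark version" scheme proposed above can be sketched as a simple comparison rule; the `supports` helper below is a hypothetical illustration (not part of any project code), assuming an operator versioned X.Y.z supports Spark X.Y and above:

```python
# Hypothetical sketch of the proposed scheme: an operator whose version
# is named after its minimum supported Spark version (e.g. "3.5.0")
# supports that Spark minor version and every later one.
def supports(operator_version: str, spark_version: str) -> bool:
    """Return True if an operator named after its minimum supported
    Spark version can run the given Spark version."""
    op_major, op_minor = (int(p) for p in operator_version.split(".")[:2])
    sp_major, sp_minor = (int(p) for p in spark_version.split(".")[:2])
    # Tuple comparison orders (major, minor) lexicographically.
    return (sp_major, sp_minor) >= (op_major, op_minor)

print(supports("3.5.0", "3.5.1"))  # Spark 3.5 is the stated minimum
print(supports("3.5.0", "4.0.0"))  # "and above" includes later majors
print(supports("3.5.0", "3.4.2"))  # below the minimum
```

Under this assumption a single operator release advertises its floor in its own version string, so a compatibility matrix reduces to one ordered comparison.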
Re: Versioning of Spark Operator
Ya, that's simple and possible.

However, it may cause confusion because it implies that the new `Spark K8s Operator 4.0.0` and `Spark Connect Go 4.0.0` follow the same `Semantic Versioning` policy as Apache Spark 4.0.0.

In addition, `Versioning` is directly related to the release cadence. It's unlikely for us to have `Spark K8s Operator` and `Spark Connect Go` releases at every Apache Spark maintenance release. For example, there is no commit in the Spark Connect Go repository.

I believe the versioning and release cadence relate more to each subproject's maturity.

Dongjoon.
Re: Versioning of Spark Operator
Aligning with Spark releases is sensible, as it allows us to guarantee that the Spark operator functions correctly with the new version while also maintaining support for previous versions.

DB Tsai | https://www.dbtsai.com/ | PGP 42E5B25A8F7A82C1
Re: Versioning of Spark Operator
I am trying to understand if we can simply align with Spark's version for this. It makes the release and JIRA management much simpler for developers and intuitive for users.

Regards,
Mridul
Re: Versioning of Spark Operator
Hi, Liang-Chi.

Thank you for leading the Apache Spark K8s operator as a shepherd.

I took a look at the `Apache Spark Connect Go` repo mentioned in the thread. Sadly, there is no release at all and no activity in the last 6 months. It seems to be the first time for the Apache Spark community to consider these sister repositories (Go and K8s Operator).

https://github.com/apache/spark-connect-go/commits/master/

Dongjoon.
Versioning of Spark Operator
Hi all,

We've opened the dedicated repository of the Spark Kubernetes Operator, and the first PR is created. Thank you for the review from the community so far.

There are questions about the versioning of the Spark Operator.

Since we are using the Spark JIRA, when we merge PRs we need to choose a Spark version. However, the Spark Operator is versioned differently from Spark. I'm wondering how we should deal with this?

I'm not sure if Connect also has versioning different from Spark's. If so, maybe we can follow how Connect does it.

Can someone who is familiar with Connect versioning give some suggestions?

Thank you.

Liang-Chi