Thanks @Hyukjin Kwon <gurwls...@gmail.com>. Yes, I'm using Python 2 to
build the docs; it looks like Python 2's Sphinx has issues.
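One way to guard against this (just a sketch, not part of Spark's actual build scripts) is for the doc-build entry point to assert the interpreter major version up front, so a Python 2 Sphinx run fails fast instead of producing broken pages:

```python
# Minimal sanity check (a sketch): fail fast if the docs build is running
# under Python 2, whose Sphinx triggers the rendering issue in SPARK-24530.
import sys

def require_python3():
    """Raise if the current interpreter is not Python 3+."""
    if sys.version_info[0] < 3:
        raise RuntimeError("Build the docs with Python 3's Sphinx (see SPARK-24530)")
    return sys.version_info[0]

print(require_python3())
```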

What is still pending for this PR (
https://github.com/apache/spark/pull/21659)? I'm planning to cut RC2 once
it is merged; do you have an ETA for this PR?

Hyukjin Kwon <gurwls...@gmail.com> wrote on Mon, Jul 9, 2018 at 9:06 AM:

> Seems Python 2's Sphinx was used -
> https://dist.apache.org/repos/dist/dev/spark/v2.3.2-rc1-docs/_site/api/python/pyspark.ml.html#pyspark.ml.classification.LogisticRegression
> and the SPARK-24530 issue exists in the RC. It's kind of tricky to manually
> verify whether Python 3 was used, given my few tries locally.
>
> I think the fix against SPARK-24530 is technically not merged yet;
> however, I don't think this blocks the release, as with the previous
> release. I think we could proceed in parallel.
> I will probably make progress on
> https://github.com/apache/spark/pull/21659 and fix the release docs too.
>
>
> On Mon, Jul 9, 2018 at 8:25 AM, Saisai Shao <sai.sai.s...@gmail.com> wrote:
>
>> Hi Sean,
>>
>> SPARK-24530 is not included in this RC1 release. Actually I'm not so
>> familiar with this issue, so I'm still using Python 2 to generate the docs.
>>
>> The JIRA mentions that Python 3 with Sphinx can work around this
>> issue. @Hyukjin Kwon <gurwls...@gmail.com> would you please help
>> clarify?
>>
>> Thanks
>> Saisai
>>
>>
>> Xiao Li <gatorsm...@gmail.com> wrote on Mon, Jul 9, 2018 at 1:59 AM:
>>
>>> Three business days might be too short. Shall we keep the vote open
>>> until the end of this Friday (July 13th)?
>>>
>>> Cheers,
>>>
>>> Xiao
>>>
>>> 2018-07-08 10:15 GMT-07:00 Sean Owen <sro...@apache.org>:
>>>
>>>> Just checking that the doc issue in
>>>> https://issues.apache.org/jira/browse/SPARK-24530 is worked around in
>>>> this release?
>>>>
>>>> This was pointed out as an example of a broken doc:
>>>>
>>>> https://spark.apache.org/docs/2.3.1/api/python/pyspark.ml.html#pyspark.ml.classification.LogisticRegression
>>>>
>>>> Here it is in 2.3.2 RC1:
>>>>
>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.2-rc1-docs/_site/api/python/pyspark.ml.html#pyspark.ml.classification.LogisticRegression
>>>>
>>>> It wasn't immediately obvious to me whether this addressed the issue
>>>> that was identified or not.
>>>>
>>>>
>>>> Otherwise nothing is open for 2.3.2, sigs and license look good, tests
>>>> pass as last time, etc.
>>>>
>>>> +1
>>>>
>>>> On Sun, Jul 8, 2018 at 3:30 AM Saisai Shao <sai.sai.s...@gmail.com>
>>>> wrote:
>>>>
>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>> version 2.3.2.
>>>>>
>>>>> The vote is open until July 11th PST and passes if a majority of +1
>>>>> PMC votes are cast, with a minimum of 3 +1 votes.
>>>>>
>>>>> [ ] +1 Release this package as Apache Spark 2.3.2
>>>>> [ ] -1 Do not release this package because ...
>>>>>
>>>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>>>
>>>>> The tag to be voted on is v2.3.2-rc1
>>>>> (commit 4df06b45160241dbb331153efbb25703f913c192):
>>>>> https://github.com/apache/spark/tree/v2.3.2-rc1
>>>>>
>>>>> The release files, including signatures, digests, etc. can be found at:
>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.2-rc1-bin/
>>>>>
>>>>> Signatures used for Spark RCs can be found in this file:
>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
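As a sketch of the digest-checking half of release verification (file names and paths below are illustrative, not taken from the release; GPG signature checks additionally need `gpg --verify` against the KEYS file):

```python
# Sketch: check a downloaded release artifact against its published
# SHA-512 digest. Paths/names here are hypothetical examples.
import hashlib

def sha512_of(path, chunk_size=1 << 20):
    """Compute the hex SHA-512 of a file, streaming in chunks."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

def matches_digest(artifact_path, published_digest_hex):
    """Compare a local artifact's digest to the published hex string."""
    return sha512_of(artifact_path) == published_digest_hex.strip().lower()
```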
>>>>>
>>>>> The staging repository for this release can be found at:
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1277/
>>>>>
>>>>> The documentation corresponding to this release can be found at:
>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.2-rc1-docs/
>>>>>
>>>>> The list of bug fixes going into 2.3.2 can be found at the following
>>>>> URL:
>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12343289
>>>>>
>>>>> PS: This is my first time doing a release, so please help check that
>>>>> everything is landing correctly. Thanks ^-^
>>>>>
>>>>> FAQ
>>>>>
>>>>> =========================
>>>>> How can I help test this release?
>>>>> =========================
>>>>>
>>>>> If you are a Spark user, you can help us test this release by taking
>>>>> an existing Spark workload, running it on this release candidate, and
>>>>> reporting any regressions.
>>>>>
>>>>> If you're working in PySpark, you can set up a virtual env, install
>>>>> the current RC, and see if anything important breaks. In Java/Scala,
>>>>> you can add the staging repository to your project's resolvers and test
>>>>> with the RC (make sure to clean up the artifact cache before/after so
>>>>> you don't end up building with an out-of-date RC going forward).
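A minimal sketch of the PySpark side of this (the actual `pip install` of the RC tarball needs network access and a downloaded artifact, so it is only shown as a comment with an illustrative path):

```python
# Sketch: create an isolated virtual env for testing an RC without
# touching your system Python packages.
import os
import tempfile
import venv

env_dir = os.path.join(tempfile.mkdtemp(), "spark-rc-env")
# with_pip=False keeps this fully offline; use with_pip=True when you
# actually intend to install the RC into the env.
venv.EnvBuilder(with_pip=False).create(env_dir)

# Inside the env you would then (network required, path illustrative):
#   <env_dir>/bin/pip install /path/to/downloaded/pyspark-2.3.2rc1.tar.gz
# and run your workload against it.
print(os.path.isdir(env_dir))
```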
>>>>>
>>>>> ===========================================
>>>>> What should happen to JIRA tickets still targeting 2.3.2?
>>>>> ===========================================
>>>>>
>>>>> The current list of open tickets targeted at 2.3.2 can be found by
>>>>> searching for "Target Version/s" = 2.3.2 at:
>>>>> https://issues.apache.org/jira/projects/SPARK
>>>>>
>>>>> Committers should look at those and triage. Extremely important bug
>>>>> fixes, documentation, and API tweaks that impact compatibility should
>>>>> be worked on immediately. Everything else please retarget to an
>>>>> appropriate release.
>>>>>
>>>>> ==================
>>>>> But my bug isn't fixed?
>>>>> ==================
>>>>>
>>>>> In order to make timely releases, we will typically not hold the
>>>>> release unless the bug in question is a regression from the previous
>>>>> release. That being said, if there is something which is a regression
>>>>> that has not been correctly targeted please ping me or a committer to
>>>>> help target the issue.
>>>>>
>>>>
>>>
