On Wed, Jun 27, 2018 at 6:57 PM, Felix Cheung <felixcheun...@hotmail.com> wrote:
> Yes, this is broken with newer versions of R.
>
> We check explicitly for warnings in the R check, which should fail the
> test run.

Hmm, something is missing somewhere then, because Jenkins seems mostly
happy aside from a few flakes:
https://amplab.cs.berkeley.edu/jenkins/user/vanzin/my-views/view/Spark/

(Look for the 2.1 branch jobs.)


> ________________________________
> From: Marcelo Vanzin <van...@cloudera.com>
> Sent: Wednesday, June 27, 2018 6:55 PM
> To: Felix Cheung
> Cc: Marcelo Vanzin; Tom Graves; dev
>
> Subject: Re: [VOTE] Spark 2.1.3 (RC2)
>
> Not sure I understand that bug. Is it a compatibility issue with new
> versions of R?
>
> It's at least marked as fixed in 2.2(.1).
>
> We do run Jenkins on these branches, but that seems like just a
> warning, which would not fail those builds...
>
> On Wed, Jun 27, 2018 at 6:12 PM, Felix Cheung <felixcheun...@hotmail.com>
> wrote:
>> (I don’t want to block the release(s) per se...)
>>
>> We need to backport SPARK-22281 (to branch-2.1 and branch-2.2)
>>
>> This was fixed in 2.3 back in Nov 2017:
>>
>> https://github.com/apache/spark/commit/2ca5aae47a25dc6bc9e333fb592025ff14824501#diff-e1e1d3d40573127e9ee0480caf1283d6
>>
>> Perhaps we don't get Jenkins runs on these branches? This should have
>> been detected.
>>
>> * checking for code/documentation mismatches ... WARNING
>> Codoc mismatches from documentation object 'attach':
>> attach
>>   Code: function(what, pos = 2L, name = deparse(substitute(what),
>>                  backtick = FALSE), warn.conflicts = TRUE)
>>   Docs: function(what, pos = 2L, name = deparse(substitute(what)),
>>                  warn.conflicts = TRUE)
>>   Mismatches in argument default values:
>>     Name: 'name' Code: deparse(substitute(what), backtick = FALSE)
>>       Docs: deparse(substitute(what))
>>
>> Codoc mismatches from documentation object 'glm':
>> glm
>>   Code: function(formula, family = gaussian, data, weights, subset,
>>                  na.action, start = NULL, etastart, mustart, offset,
>>                  control = list(...), model = TRUE, method = "glm.fit",
>>                  x = FALSE, y = TRUE, singular.ok = TRUE, contrasts =
>>                  NULL, ...)
>>   Docs: function(formula, family = gaussian, data, weights, subset,
>>                  na.action, start = NULL, etastart, mustart, offset,
>>                  control = list(...), model = TRUE, method = "glm.fit",
>>                  x = FALSE, y = TRUE, contrasts = NULL, ...)
>>   Argument names in code not in docs:
>>     singular.ok
>>   Mismatches in argument names:
>>     Position: 16 Code: singular.ok Docs: contrasts
>>     Position: 17 Code: contrasts Docs: ...
>>
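For reference, the code/documentation consistency check that produces the
warning above can be reproduced locally without a full R CMD check run.
A minimal sketch, assuming a recent R installation and a locally installed
SparkR build (that the package is installed under that name is an
assumption here):

  # Sketch: reproduce the codoc WARNING above against an installed SparkR.
  # tools::codoc() is the check behind "checking for code/documentation
  # mismatches"; on newer R versions it should flag 'attach' and 'glm'
  # as shown in the quoted output.
  library(tools)
  mismatches <- codoc("SparkR")
  print(mismatches)
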
>> ________________________________
>> From: Sean Owen <sro...@apache.org>
>> Sent: Wednesday, June 27, 2018 5:02:37 AM
>> To: Marcelo Vanzin
>> Cc: dev
>> Subject: Re: [VOTE] Spark 2.1.3 (RC2)
>>
>> +1 from me too for the usual reasons.
>>
>> On Tue, Jun 26, 2018 at 3:25 PM Marcelo Vanzin
>> <van...@cloudera.com.invalid>
>> wrote:
>>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 2.1.3.
>>>
>>> The vote is open until Fri, June 29th @ 9PM UTC (2PM PDT) and passes if a
>>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>
>>> [ ] +1 Release this package as Apache Spark 2.1.3
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>
>>> The tag to be voted on is v2.1.3-rc2 (commit b7eac07b):
>>> https://github.com/apache/spark/tree/v2.1.3-rc2
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v2.1.3-rc2-bin/
>>>
>>> Signatures used for Spark RCs can be found in this file:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1275/
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v2.1.3-rc2-docs/
>>>
>>> The list of bug fixes going into 2.1.3 can be found at the following URL:
>>> https://issues.apache.org/jira/projects/SPARK/versions/12341660
>>>
>>> Notes:
>>>
>>> - RC1 was not sent for a vote. I had trouble building it, and by the
>>> time I got things fixed, there was a blocker bug filed. It was already
>>> tagged in git at that time.
>>>
>>> - If testing the source package, I recommend using Java 8, even though
>>> 2.1 supports Java 7 (and the RC was built with JDK 7). This is because
>>> Maven Central has updated some configuration that makes the default
>>> Java 7 SSL config not work.
>>>
>>> - There are Maven artifacts published for Scala 2.10, but binary
>>> releases are only available for Scala 2.11. This matches the previous
>>> release (2.1.2), but if there's a need / desire to have pre-built
>>> distributions for Scala 2.10, I can probably amend the RC without
>>> having to create a new one.
>>>
>>> FAQ
>>>
>>> =========================
>>> How can I help test this release?
>>> =========================
>>>
>>> If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload, running it on this release candidate, and
>>> then reporting any regressions.
>>>
>>> If you're working in PySpark, you can set up a virtual env, install
>>> the current RC, and see if anything important breaks. In Java/Scala,
>>> you can add the staging repository to your project's resolvers and test
>>> with the RC (make sure to clean up the artifact cache before/after so
>>> you don't end up building with an out-of-date RC going forward).
>>>
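For SparkR users specifically, one quick way to exercise the RC is a small
smoke test against the unpacked binary distribution. A minimal sketch,
assuming the v2.1.3-rc2 binary tarball has been extracted and the bundled
SparkR package is on the R library path (for example by pointing
.libPaths() at $SPARK_HOME/R/lib); the exact setup is an assumption, not
part of the official testing instructions:

  # Sketch: tiny SparkR job to sanity-check the RC end to end.
  library(SparkR)
  sparkR.session(master = "local[2]", appName = "spark-2.1.3-rc2-smoke")
  df <- createDataFrame(faithful)   # small dataset bundled with base R
  # Group and aggregate, then collect a few rows; this exercises the
  # R <-> JVM bridge, the DataFrame API, and local execution.
  head(summarize(groupBy(df, df$waiting), count = n(df$waiting)))
  sparkR.session.stop()
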
>>> ===========================================
>>> What should happen to JIRA tickets still targeting 2.1.3?
>>> ===========================================
>>>
>>> The current list of open tickets targeted at 2.1.3 can be found at:
>>> https://s.apache.org/spark-2.1.3
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should
>>> be worked on immediately. Everything else please retarget to an
>>> appropriate release.
>>>
>>> ==================
>>> But my bug isn't fixed?
>>> ==================
>>>
>>> In order to make timely releases, we will typically not hold the
>>> release unless the bug in question is a regression from the previous
>>> release. That being said, if there is something that is a regression
>>> and has not been correctly targeted, please ping me or a committer to
>>> help target the issue.
>>>
>>>
>>> --
>>> Marcelo
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>
>>
>
>
>
> --
> Marcelo



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
