Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/19290
argh, thanks for the reminder and the fix.
I knew calling an internal method was going to bite us ... :(
Github user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/19290
I believe the issue with testthat he meant is that it fails to run the tests now
that testthat 2.0.0 is released. Just as a reminder, I rushed to update
`appveyor.yml` to use a fixed version
https://githu
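
For illustration, a minimal sketch of what pinning testthat to a pre-2.0.0 release in an `appveyor.yml` install step could look like (the exact version number and the use of `devtools::install_version` are assumptions here, not the actual change linked above):

```r
# Install a fixed, pre-2.0.0 testthat so the SparkR tests keep running
# against the testthat API they were written for.
# The version chosen here (1.0.2) is an assumption for illustration only.
install.packages("devtools", repos = "https://cloud.r-project.org")
devtools::install_version("testthat", version = "1.0.2",
                          repos = "https://cloud.r-project.org")
packageVersion("testthat")  # confirm the pinned version is what gets loaded
```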
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/19290
Right, we could bump the supported R version for the next release. It should
have minimal impact (since we are testing close to the latest on
AppVeyor... somewhat internally)
lintr i
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/19290
The minimum supported R version is something that we can revisit, though. I
think we do this for Python and Java versions in the project as well.
Github user shaneknapp commented on the issue:
https://github.com/apache/spark/pull/19290
ok sounds good -- we'll keep things 'old' for now.
Github user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/19290
BTW, I believe we are testing it with R 3.4.1 via AppVeyor too. I have been
thinking it's good to test both old and new versions ... I think we have a
weak promise for `R 3.1+` -
http://spark
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/19290
Given that we are forking 2.3 and locking down the branch any time now, it
might make sense to stay with the "current version running on old centos
workers", even though they are old.
Github user shaneknapp commented on the issue:
https://github.com/apache/spark/pull/19290
the driving force behind this is that the version of R (and all associated
packages) is really quite old on the centos workers (3.1.1), and i'd really
like us to get as up to date as possible
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/19290
Shane, is this going to affect one particular branch (e.g. 2.3.0), or is it
going to be all branches and all test runs?
The changes are fairly substantial - if we need to back-port then it
Github user shaneknapp commented on the issue:
https://github.com/apache/spark/pull/19290
i filed a bug about this on the spark jira:
https://issues.apache.org/jira/browse/SPARK-22996
as we're about to move all the spark builds to new ubuntu machines, w/much
more up2date packages
Github user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/19290
So, @felixcheung, @shaneknapp and @shivaram, looks like we have comments,
https://github.com/apache/spark/pull/19290#issuecomment-21991 and
https://github.com/apache/spark/pull/19290#issuecomment
Github user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/19290
Merged to master.
Github user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/19290
Let me merge this one first. This shouldn't cause any problems to the build
system for now.
Github user shaneknapp commented on the issue:
https://github.com/apache/spark/pull/19290
@felixcheung -- yes, this is the system default lintr, meaning all calls
to lintr will be against this version.
as for other branches, i think it could possibly break them.
@sh
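
As a rough sketch of why the system default matters: whichever lintr sits in the workers' default library is what every branch's lint run resolves to, so something like the following (the `R/pkg` path and the direct `lint_package()` call are assumptions for illustration, not Spark's actual `dev/lint-r` script) behaves identically on every branch built on that worker:

```r
# Show which lintr the system-wide default library provides; every lint
# run on this worker resolves to this one installed version.
print(packageVersion("lintr"))

# Lint the SparkR package the way a CI lint step typically would.
# "R/pkg" is the SparkR package directory in a Spark checkout (assumption).
results <- lintr::lint_package(path = "R/pkg")
print(results)
```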
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/19290
btw, lintr once upgraded will run on all builds on all branches right?
wouldn't the upgrade break other branches?