I've filed JIRA SPARK-22055 & SPARK-22054 to port the
release scripts and allow injection of the RM's key.
>
> On Mon, Sep 18, 2017 at 8:11 PM, Patrick Wendell <patr...@databricks.com>
> wrote:
>
For the current release - maybe Holden could just sign the artifacts with
her own key manually, if this is a concern. I don't think that would
require modifying the release pipeline, except to just remove/ignore the
existing signatures.
- Patrick
On Mon, Sep 18, 2017 at 7:56 PM, Reynold Xin
https://github.com/apache/spark/tree/master/dev/create-release
- Patrick
On Mon, Sep 18, 2017 at 6:23 PM, Patrick Wendell <patr...@databricks.com>
wrote:
> One thing we could do is modify the release tooling to allow the key to be
> injected each time, thus allowing any RM to insert t
extra care to make sure that can't happen, even if it
> is an annoyance for the release managers.
>
> On Sun, Sep 17, 2017 at 10:12 PM, Patrick Wendell <patr...@databricks.com>
> wrote:
>
Spark's release pipeline is automated, and part of that automation includes
securely injecting this key for the purpose of signing. I asked the ASF to
provide a service account key several years ago but they suggested that we
use a key attributed to an individual even if the process is automated.
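Whoever holds the signing key, anyone vetting a release candidate can independently check downloaded artifacts against their published digests. Here is a minimal sketch in Python (file names and digests below are hypothetical; verifying the `.asc` signature itself still requires GPG and the signer's public key):

```python
import hashlib


def sha512_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-512 so large release tarballs
    never have to fit in memory at once."""
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches_published_digest(path, expected_hex):
    """Compare a local file's digest to the published value,
    tolerating case and surrounding whitespace."""
    return sha512_of(path) == expected_hex.strip().lower()
```

With a downloaded tarball this would be called as `matches_published_digest("spark-x.y.z.tgz", digest_from_site)`; a mismatch means the artifact should not be trusted.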
I
[
https://issues.apache.org/jira/browse/SPARK-16685?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391168#comment-15391168
]
Patrick Wendell commented on SPARK-16685:
-
These scripts are pretty old and I'm not sure
[
https://issues.apache.org/jira/browse/SPARK-13855?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell resolved SPARK-13855.
-
Resolution: Fixed
Fix Version/s: 1.6.1
> Spark 1.6.1 artifacts not found in
[
https://issues.apache.org/jira/browse/SPARK-13855?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15196901#comment-15196901
]
Patrick Wendell commented on SPARK-13855:
-
I've uploaded the artifacts, thanks.
> Spark 1.
[
https://issues.apache.org/jira/browse/SPARK-13855?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell reassigned SPARK-13855:
---
Assignee: Patrick Wendell (was: Michael Armbrust)
> Spark 1.6.1 artifa
+1
On Wed, Dec 16, 2015 at 6:15 PM, Ted Yu wrote:
> Ran test suite (minus docker-integration-tests)
> All passed
>
> +1
>
> [INFO] Spark Project External ZeroMQ .. SUCCESS [
> 13.647 s]
> [INFO] Spark Project External Kafka ...
[
https://issues.apache.org/jira/browse/SPARK-12148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-12148:
Priority: Major (was: Critical)
> SparkR: rename DataFrame to SparkDataFr
[
https://issues.apache.org/jira/browse/SPARK-12148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-12148:
Priority: Critical (was: Minor)
> SparkR: rename DataFrame to SparkDataFr
[
https://issues.apache.org/jira/browse/SPARK-12148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-12148:
Issue Type: Improvement (was: Wish)
> SparkR: rename DataFrame to SparkDataFr
[
https://issues.apache.org/jira/browse/SPARK-12110?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-12110:
Description:
I am using spark-1.5.1-bin-hadoop2.6. I used
spark-1.5.1-bin-hadoop2.6/ec2
[
https://issues.apache.org/jira/browse/SPARK-12110?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-12110:
Component/s: (was: ML)
(was: SQL
[
https://issues.apache.org/jira/browse/SPARK-12110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15036960#comment-15036960
]
Patrick Wendell commented on SPARK-12110:
-
Hey Andrew, could you show exactly the command you
In terms of advertising to people the status of the release and whether an
RC is likely to go out, the best mechanism I can think of is our current
mechanism of using JIRA and respecting the semantics of a blocker JIRA. We
could do a better job, though, of creating a JIRA dashboard for each release and
[
https://issues.apache.org/jira/browse/SPARK-11903?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15021511#comment-15021511
]
Patrick Wendell commented on SPARK-11903:
-
I think it's simply dead code. SKIP_JAVA_TEST related
[
https://issues.apache.org/jira/browse/SPARK-11903?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15021511#comment-15021511
]
Patrick Wendell edited comment on SPARK-11903 at 11/23/15 4:29 AM:
---
I
I also feel the same as Reynold. I agree we should minimize API breaks and
focus on fixing things around the edge that were mistakes (e.g. exposing
Guava and Akka) rather than any overhaul that could fragment the community.
Ideally a major release is a lightweight process we can do every couple of
[
https://issues.apache.org/jira/browse/SPARK-11326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14997448#comment-14997448
]
Patrick Wendell commented on SPARK-11326:
-
There are a few related conversations here:
1
Hey Jakob,
The builds in Spark are largely maintained by me, Sean, and Michael
Armbrust (for SBT). For historical reasons, Spark supports both a Maven and
SBT build. Maven is the build of reference for packaging Spark and is used
by many downstream packagers and to build all Spark releases. SBT
I believe this is some bug in our tests. For some reason we are using way
more memory than necessary. We'll probably need to log into Jenkins and
heap dump some running tests and figure out what is going on.
On Mon, Nov 2, 2015 at 7:42 AM, Ted Yu wrote:
> Looks like
[
https://issues.apache.org/jira/browse/SPARK-11236?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell resolved SPARK-11236.
-
Resolution: Fixed
Fix Version/s: 1.6.0
> Upgrade Tachyon dependency to 0.
[
https://issues.apache.org/jira/browse/SPARK-11236?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-11236:
Assignee: Calvin Jia
> Upgrade Tachyon dependency to 0.
[
https://issues.apache.org/jira/browse/SPARK-11446?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-11446:
Target Version/s: 1.6.0
> Spark 1.6 release no
Patrick Wendell created SPARK-11446:
---
Summary: Spark 1.6 release notes
Key: SPARK-11446
URL: https://issues.apache.org/jira/browse/SPARK-11446
Project: Spark
Issue Type: Task
[
https://issues.apache.org/jira/browse/SPARK-11238?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14984646#comment-14984646
]
Patrick Wendell commented on SPARK-11238:
-
I created SPARK-11446 and linked it here.
> Spa
[
https://issues.apache.org/jira/browse/SPARK-11446?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14984776#comment-14984776
]
Patrick Wendell commented on SPARK-11446:
-
I think this is redundant with the "releasenotes
[
https://issues.apache.org/jira/browse/SPARK-11446?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell closed SPARK-11446.
---
Resolution: Invalid
> Spark 1.6 release no
I verified that the issue with built binaries being present in the source
release is fixed. Haven't done enough vetting for a full vote, but did
verify that.
On Sun, Oct 25, 2015 at 12:07 AM, Reynold Xin wrote:
> Please vote on releasing the following candidate as Apache
Hi All,
I would like to be added to the Incubator PMC to help mentor a new project.
I am an Apache Member. I am not sure of the exact process to be added, so I am
emailing this list as a first step!
Cheers,
- Patrick
[
https://issues.apache.org/jira/browse/SPARK-11305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14973493#comment-14973493
]
Patrick Wendell commented on SPARK-11305:
-
/cc [~srowen] for his thoughts.
> Remove Third-Pa
Patrick Wendell created SPARK-11305:
---
Summary: Remove Third-Party Hadoop Distributions Doc Page
Key: SPARK-11305
URL: https://issues.apache.org/jira/browse/SPARK-11305
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-10971?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14973510#comment-14973510
]
Patrick Wendell commented on SPARK-10971:
-
Reynold has sent out the vote email based
[
https://issues.apache.org/jira/browse/SPARK-10971?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14973510#comment-14973510
]
Patrick Wendell edited comment on SPARK-10971 at 10/26/15 12:02 AM
[
https://issues.apache.org/jira/browse/SPARK-10971?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10971:
Fix Version/s: (was: 1.5.2)
1.5.3
> sparkR: RRunner should al
de me w/some specific failures so i can look
> in to them more closely?
>
> On Mon, Oct 19, 2015 at 12:27 PM, Patrick Wendell <pwend...@gmail.com>
> wrote:
> > Hey Shane,
> >
> > It also appears that every Spark build is failing right now. Could it be
> > related
I think many of them are coming from the Spark 1.4 builds:
https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/Spark-1.4-Maven-pre-YARN/3900/console
On Mon, Oct 19, 2015 at 1:44 PM, Patrick Wendell <pwend...@gmail.com> wrote:
> This is what I'
Hey Shane,
It also appears that every Spark build is failing right now. Could it be
related to your changes?
- Patrick
On Mon, Oct 19, 2015 at 11:13 AM, shane knapp wrote:
> worker 05 is back up now... looks like the machine OOMed and needed
> to be kicked.
>
> On Mon,
[
https://issues.apache.org/jira/browse/SPARK-11070?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell reassigned SPARK-11070:
---
Assignee: Patrick Wendell
> Remove older releases on dist.apache.
[
https://issues.apache.org/jira/browse/SPARK-11070?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14961515#comment-14961515
]
Patrick Wendell commented on SPARK-11070:
-
I removed them - I did leave 1.5.0 for now, but we can
[
https://issues.apache.org/jira/browse/SPARK-11070?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell resolved SPARK-11070.
-
Resolution: Fixed
> Remove older releases on dist.apache.
[
https://issues.apache.org/jira/browse/SPARK-10877?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10877:
Assignee: Davies Liu
> Assertions fail straightforward DataFrame job due to word alignm
I would tend to agree with this approach. We should audit all
@Experimental labels before the 1.6 release and clear them out when
appropriate.
- Patrick
On Wed, Oct 14, 2015 at 2:13 AM, Sean Owen wrote:
> Someone asked, is "ML pipelines" stable? I said, no, most of the key
Hi Jakob,
There is a temporary issue with the Scala 2.11 build in SBT. The problem is
that this wasn't previously covered by our automated tests, so it broke without
us knowing - this has been actively discussed on the dev list in the last
24 hours. I am trying to get it working in our test harness
[
https://issues.apache.org/jira/browse/SPARK-0?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-0:
Assignee: Jakob Odersky
> Scala 2.11 build fails due to compiler err
[
https://issues.apache.org/jira/browse/SPARK-0?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-0:
Priority: Critical (was: Major)
> Scala 2.11 build fails due to compiler err
Patrick Wendell created SPARK-0:
---
Summary: Scala 2.11 build fails due to compiler errors
Key: SPARK-0
URL: https://issues.apache.org/jira/browse/SPARK-0
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-5?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14958078#comment-14958078
]
Patrick Wendell edited comment on SPARK-5 at 10/15/15 12:38 AM
[
https://issues.apache.org/jira/browse/SPARK-11081?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-11081:
Component/s: Build
> Shade Jersey dependency to work around the compatibility is
[
https://issues.apache.org/jira/browse/SPARK-11092?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-11092:
Assignee: Jakob Odersky
> Add source URLs to API documentat
[
https://issues.apache.org/jira/browse/SPARK-5?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-5:
Summary: Host verification is not correct for IPv6 (was: IPv6 regression)
> H
[
https://issues.apache.org/jira/browse/SPARK-5?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14958078#comment-14958078
]
Patrick Wendell commented on SPARK-5:
-
The title of this says "Regression"
[
https://issues.apache.org/jira/browse/SPARK-11006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-11006:
Component/s: SQL
> Rename NullColumnAccess as NullColumnAcces
[
https://issues.apache.org/jira/browse/SPARK-1?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-1:
Component/s: SQL
> Fast null-safe join
> ---
>
>
[
https://issues.apache.org/jira/browse/SPARK-11056?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-11056:
Component/s: Documentation
> Improve documentation on how to build Spark efficien
[
https://issues.apache.org/jira/browse/SPARK-6230?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14954523#comment-14954523
]
Patrick Wendell commented on SPARK-6230:
Should we update Spark's documentation to explain this? I
there was a reason this is hard. Certainly not
> worth building for each PR.
>
> On Mon, Oct 12, 2015 at 5:16 PM, Patrick Wendell <pwend...@gmail.com>
> wrote:
> > We already do automated compile testing for Scala 2.11 similar to Hadoop
> > versions:
> >
> >
I think Daniel is correct here. The source artifact incorrectly includes
jars. It is inadvertent and not part of our intended release process. This
was something I noticed in Spark 1.5.0 and filed a JIRA and was fixed by
updating our build scripts to fix it. However, our build environment was
not
*to not include binaries.
On Sun, Oct 11, 2015 at 9:35 PM, Patrick Wendell <pwend...@gmail.com> wrote:
> I think Daniel is correct here. The source artifact incorrectly includes
> jars. It is inadvertent and not part of our intended release process. This
> was something I noticed
like they are part of tests, and still nothing to do with Spark binaries.
> Those can and should stay.
>
> On Mon, Oct 12, 2015, 5:35 AM Patrick Wendell <pwend...@gmail.com> wrote:
>
>> I think Daniel is correct here. The source artifact incorrectly includes
>>
ifacts should be a deterministic function of the
> source at a certain point in time.
>
> I think the concern is about putting Spark binaries or its dependencies
> into a source release. That should not happen, but it is not what has
> happened here.
>
> On Mon, Oct 12, 2015, 6:03 AM
I would push back slightly. The reason we have the PR builds taking so long
is death by a million small things that we add. Doing a full 2.11 compile
is on the order of minutes... it's a nontrivial increase to the build times.
It doesn't seem that bad to me to go back post-hoc once in a while and fix
2.11
the foreseeable future?
>
> Nick
>
>
> On Tue, Oct 6, 2015 at 1:13 AM Patrick Wendell <pwend...@gmail.com> wrote:
>
>> The missing artifacts are uploaded now. Things should propagate in the
>> next 24 hours. If there are still issues past then ping this
Hey Holden,
It would be helpful if you could outline the set of features you'd imagine
being part of Spark in a short doc. I didn't see a README on the existing
repo, so it's hard to know exactly what is being proposed.
As a general point of process, we've typically avoided merging modules into
The missing artifacts are uploaded now. Things should propagate in the next
24 hours. If there are still issues past then ping this thread. Thanks!
- Patrick
On Mon, Oct 5, 2015 at 2:41 PM, Nicholas Chammas wrote:
> Thanks for looking into this Josh.
>
> On Mon,
, Oct 1, 2015 at 11:30 AM, Patrick Wendell <pwend...@gmail.com> wrote:
> Ah - I can update it. Usually I do it after the release is cut. It's
> just a standard 3 month cadence.
>
> On Thu, Oct 1, 2015 at 3:55 AM, Sean Owen <so...@cloudera.com> wrote:
>> My guess is th
Ah - I can update it. Usually I do it after the release is cut. It's
just a standard 3 month cadence.
On Thu, Oct 1, 2015 at 3:55 AM, Sean Owen wrote:
> My guess is that the 1.6 merge window should close at the end of
> November (2 months from now)? I can update it but wanted
Hey Richard,
My assessment (just looked before I saw Sean's email) is the same as
his. The NOTICE file embeds other projects' licenses. If those
licenses themselves have pointers to other files or dependencies, we
don't embed them. I think this is standard practice.
- Patrick
On Thu, Sep 24,
I think it would be a big improvement to get rid of it. It's not how
jars are supposed to be packaged and it has caused problems in many
different contexts over the years.
For me a key step in moving away would be to fully audit/understand
all compatibility implications of removing it. If other
I just added snapshot builds for 1.5. They will take a few hours to
build, but once we get them working should publish every few hours.
https://amplab.cs.berkeley.edu/jenkins/view/Spark-Packaging
- Patrick
On Mon, Sep 21, 2015 at 10:36 PM, Bin Wang wrote:
> However I find
Patrick Wendell created FLINK-2699:
--
Summary: Flink is filling Spark JIRA with incorrect PR links
Key: FLINK-2699
URL: https://issues.apache.org/jira/browse/FLINK-2699
Project: Flink
Issue
[
https://issues.apache.org/jira/browse/FLINK-2699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated FLINK-2699:
---
Description:
I think you guys are using our script for synchronizing JIRA. However, you
[
https://issues.apache.org/jira/browse/FLINK-2699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14804497#comment-14804497
]
Patrick Wendell commented on FLINK-2699:
Great - thanks for cleaning this up. No worries.
> Fl
[
https://issues.apache.org/jira/browse/SPARK-10650?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10650:
Description:
In 1.5.0 there are some extra classes in the Spark docs - including a bunch
[
https://issues.apache.org/jira/browse/SPARK-10650?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10650:
Priority: Critical (was: Major)
> Spark docs include test and other extra clas
[
https://issues.apache.org/jira/browse/SPARK-10650?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10650:
Affects Version/s: 1.5.0
> Spark docs include test and other extra clas
Patrick Wendell created SPARK-10650:
---
Summary: Spark docs include test and other extra classes
Key: SPARK-10650
URL: https://issues.apache.org/jira/browse/SPARK-10650
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-10650?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10650:
Target Version/s: 1.5.1
> Spark docs include test and other extra clas
[
https://issues.apache.org/jira/browse/SPARK-6942?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-6942:
---
Assignee: Andrew Or (was: Patrick Wendell)
> Umbrella: UI Visualizations for C
Patrick Wendell created SPARK-10620:
---
Summary: Look into whether accumulator mechanism can replace
TaskMetrics
Key: SPARK-10620
URL: https://issues.apache.org/jira/browse/SPARK-10620
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-10620?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10620:
Description:
This task is simply to explore whether the internal representation used
[
https://issues.apache.org/jira/browse/SPARK-10620?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14745690#comment-14745690
]
Patrick Wendell commented on SPARK-10620:
-
/cc [~imranr] and [~srowen] for any comments. In my
[
https://issues.apache.org/jira/browse/SPARK-10511?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10511:
Assignee: Luciano Resende
> Source releases should not include maven j
[
https://issues.apache.org/jira/browse/SPARK-10623?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10623:
Component/s: SQL
> turning on predicate pushdown throws nonsuch element exception when
[
https://issues.apache.org/jira/browse/SPARK-10601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10601:
Component/s: SQL
> Spark SQL - Support for MI
[
https://issues.apache.org/jira/browse/SPARK-10600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-10600:
Component/s: SQL
> SparkSQL - Support for Not Exists in a Correlated Subqu
[
https://issues.apache.org/jira/browse/SPARK-10576?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14742280#comment-14742280
]
Patrick Wendell commented on SPARK-10576:
-
FWIW - seems to me like moving them into /java makes
Patrick Wendell created SPARK-10511:
---
Summary: Source releases should not include maven jars
Key: SPARK-10511
URL: https://issues.apache.org/jira/browse/SPARK-10511
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-10374?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14723792#comment-14723792
]
Patrick Wendell commented on SPARK-10374:
-
Hey Matt,
I think the only thing that could have
[
https://issues.apache.org/jira/browse/SPARK-10359?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14723844#comment-14723844
]
Patrick Wendell commented on SPARK-10359:
-
The approach in SPARK-4123 was a bit different
[
https://issues.apache.org/jira/browse/SPARK-4123?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell resolved SPARK-4123.
Resolution: Duplicate
I've proposed a slightly different approach in SPARK-10359, so I'm
[
https://issues.apache.org/jira/browse/SPARK-9545?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell updated SPARK-9545:
---
Summary: Run Maven tests in pull request builder if title has
[test-maven] in it (was: Run
[
https://issues.apache.org/jira/browse/SPARK-9547?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell resolved SPARK-9547.
Resolution: Fixed
Fix Version/s: 1.6.0
Allow testing pull requests with different
[
https://issues.apache.org/jira/browse/SPARK-9545?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Patrick Wendell resolved SPARK-9545.
Resolution: Fixed
Fix Version/s: 1.6.0
Run Maven tests in pull request builder
Hi All,
For pull requests that modify the build, you can now test different
build permutations as part of the pull request builder. To trigger
these, you add a special phrase to the title of the pull request.
Current options are:
[test-maven] - run tests using maven and not sbt
[test-hadoop1.0]
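The trigger mechanism described above — special phrases in the pull request title — can be sketched as a small parser. This is a simplification for illustration; the real matching logic lives in Spark's Jenkins/dev scripts and may differ:

```python
import re

# Bracketed build-permutation flags such as [test-maven] or [test-hadoop1.0].
_FLAG = re.compile(r"\[(test-[^\]\s]+)\]")


def build_flags(pr_title):
    """Extract the set of build-permutation flags from a PR title,
    returned sorted for stable comparison."""
    return sorted(set(_FLAG.findall(pr_title)))
```

A builder would then enable the Maven-based run when `"test-maven"` appears in the result, and fall back to the default SBT build otherwise.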
Patrick Wendell created SPARK-10359:
---
Summary: Enumerate Spark's dependencies in a file and diff against
it for new pull requests
Key: SPARK-10359
URL: https://issues.apache.org/jira/browse/SPARK-10359
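The diff proposed in SPARK-10359 amounts to a set comparison between a checked-in manifest and a freshly resolved dependency list. A hedged sketch (the coordinates below are illustrative, not Spark's actual manifest format):

```python
def diff_dependencies(manifest_lines, resolved_lines):
    """Return (added, removed) between a checked-in dependency
    manifest and a freshly resolved list, e.g. the cleaned-up
    output of `mvn dependency:list`. Blank lines are ignored."""
    known = {line.strip() for line in manifest_lines if line.strip()}
    current = {line.strip() for line in resolved_lines if line.strip()}
    return sorted(current - known), sorted(known - current)
```

A PR builder would fail the build whenever either list is non-empty, forcing dependency changes to be made explicit in the checked-in manifest.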
There is already code in place that restricts which tests run
depending on which code is modified. However, changes inside of
Spark's core currently require running all dependent tests. If you
have some ideas about how to improve that heuristic, it would be
great.
- Patrick
On Tue, Aug 25, 2015