I think the ability to handle different collections of inputs is really important, and it would be nice to go further than a static list of input collections, like the source vs. header case of a C++ plugin.
In our case (the Android plugin), we have a task that receives a list of lists of files. For various reasons, it's very important that we pass this to the task in that format. However, the @InputFiles annotation doesn't work on it (AFAIK*), so we end up with something like this:

    List<ResourceSet> inputResourceSets // ResourceSet provides a list of Files

    // fake input to detect changes. Not actually used by the task
    @InputFiles
    Iterable<File> getRawInputFolders() {
        return IncrementalTask.flattenSourceSets(getInputResourceSets())
    }

(* It occurs to me while writing this that maybe we could have used the @Nested annotation, but this class is not specific to the Gradle plugin and doesn't have access to the annotation. The rest of my message stands, though.)

In our incremental support we look for all changed files (using our own mechanism), and then for each changed file we go and look for which ResourceSet it belongs to, so that the ResourceSet can act on the changed file.

So, if a task is going to support multiple input collections, it would be nice to support this dynamically. This means not only having the task define 2+ annotated Iterable<File> fields, but also being able to return a list of "input file providers", and have the API hand us the associated provider when we are notified of a changed input file. Bonus points if a provider doesn't have to be declared through an annotation inside the provider class, or by implementing a Gradle interface. I could go with

    @InputProviders
    Iterable<Foo> fooList

where we expect to find a given named method (not necessarily annotated) on Foo. Again, this is just my (weird?) use case, where 80+% of my plugin code is not compiled against the Gradle API (it's a constant struggle to reuse classes not designed for Gradle in tasks and the DSL).

I'm not sure what the receiving API would look like, to be honest; I just wanted to add my use case as an example.
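To make that lookup concrete, here is a minimal sketch of the "which ResourceSet owns this changed file" step. ResourceSet here is a hypothetical stand-in (just a name plus a list of files), not the actual plugin class, and the paths are made up:

```java
import java.io.File;
import java.util.List;
import java.util.Optional;

// Hypothetical stand-in for the plugin's ResourceSet: a name plus the files it provides.
class ResourceSet {
    final String name;
    final List<File> files;
    ResourceSet(String name, List<File> files) {
        this.name = name;
        this.files = files;
    }
}

public class ResourceSetLookup {
    // Given a changed file, find the ResourceSet it belongs to, so that the
    // owning set can act on the change (the lookup described above).
    static Optional<ResourceSet> ownerOf(File changed, List<ResourceSet> sets) {
        return sets.stream()
                   .filter(set -> set.files.contains(changed))
                   .findFirst();
    }

    public static void main(String[] args) {
        File mainLayout = new File("src/main/res/layout/a.xml");
        File debugValues = new File("src/debug/res/values/b.xml");
        List<ResourceSet> sets = List.of(
                new ResourceSet("main", List.of(mainLayout)),
                new ResourceSet("debug", List.of(debugValues)));
        System.out.println(ownerOf(debugValues, sets).get().name); // prints "debug"
    }
}
```

An API that delivered the provider alongside the changed file would remove the need for this linear scan on every notification.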
I do want to mention, though, that it would be really good to differentiate a new file from a changed file. In the different proposed APIs, it seemed those were always treated the same. It's easy for a task implementation to treat the two cases the same, but not having the information could mean losing out on some optimizations.

thanks!

On Mon, Apr 15, 2013 at 9:58 PM, Adam Murdoch <adam.murd...@gradleware.com> wrote:

> Hi,
>
> I think there's a deep flaw in this API: for almost every task there are
> different collections of input files that affect the output files, and this
> API lumps them all together. Here are some examples:
>
> - The source files and the compile classpath of a Java compile task. I
> might implement this task so that it can deal with changes in the compile
> classpath but not the source files, so that it short-circuits compilation
> if the API of the compile classpath has not changed (regardless of how it's
> packaged), but if it does compile anything, it recompiles everything. If a
> source file changes, it recompiles all the source.
>
> - The source files and the header files of a C++ compile task. I might
> implement this task so that it can deal with changes to source files but
> not header files. If a source file changes, I can recompile the output file
> for that source file. If a header file changes, I will recompile everything.
>
> - The source files and the compiler classpath of a Groovy or Scala compile
> task. When the compiler changes, I need to rebuild everything.
>
> - The source files, the classpath, and the config file of a Checkstyle
> task. For changes to source files, I can run the analysis for the changed
> files. For changes to the config file or the classpath, I need to reanalyse
> everything.
>
> - Unlikely but possible: a file moves from one collection to another
> collection, or is added to a second collection, or is removed from one
> collection but not all collections.
> In these cases, we're going to consider
> this file as unchanged, but it has changed role (it hasn't moved, it hasn't
> changed content, but its effect on the output is now different).
>
> - Additional input files declared outside the task via
> task.inputs.files(). Changes to these should trigger a rebuild and should
> not be passed to the task.
>
> You get the idea. The API is going to have to allow a task to declare
> which collections of input files it can handle changes in, and which
> collections it cannot, and we're going to have to deliver those changes on
> a per-collection basis. We may need this also for output files, but I don't
> see a use case for it yet.
>
> How might this look? Some options:
>
> - We inject a change set object into the task action for each property
> you're interested in observing:
>
>     @InputFile
>     File configFile
>
>     @InputFiles
>     Set<File> classpath
>
>     @TaskAction
>     void doStuff(@ChangesFor('configFile') InputFileChanges configFileChanges,
>                  @ChangesFor('classpath') InputFilesChanges classpathChanges) {
>         …
>     }
>
> - We invoke an optional method for each property that you're interested in
> observing, before we invoke the task action:
>
>     // accepts the changes for the configFile property
>     void configFileChanges(InputFileChanges changes) { … }
>
>     // accepts the changes for the classpath property
>     void classpathChanges(InputFilesChanges changes) { … }
>
>     @TaskAction
>     void doStuff() {
>         // use whatever state the above methods have kept
>     }
>
> We'd probably invoke those methods only if there are changes in the
> respective property. The benefit of this approach is that the task has the
> opportunity to force a rebuild or skip execution based on certain types of
> changes to the value, before the work is started.
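Whichever surface the API ends up with, the per-collection policy in the examples above boils down to a small decision function. A minimal sketch in plain Java, using the Checkstyle example; the collection names are made up and this is not a proposed Gradle type:

```java
import java.util.EnumSet;
import java.util.Set;

public class RebuildDecision {
    // The input collections a task declares; names are illustrative only.
    enum Input { SOURCES, CLASSPATH, CONFIG_FILE }

    // Policy from the Checkstyle example: source changes can be handled
    // incrementally, but a classpath or config-file change means the task
    // must reanalyse everything.
    static boolean needsFullRebuild(Set<Input> changed) {
        return changed.contains(Input.CLASSPATH) || changed.contains(Input.CONFIG_FILE);
    }

    public static void main(String[] args) {
        System.out.println(needsFullRebuild(EnumSet.of(Input.SOURCES)));                    // prints "false"
        System.out.println(needsFullRebuild(EnumSet.of(Input.SOURCES, Input.CONFIG_FILE))); // prints "true"
    }
}
```

The point of delivering changes per collection is exactly that this function differs from task to task: a C++ compile task would put header files, not the classpath, on the rebuild-everything side.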
> - You query some context passed into the task action:
>
>     @TaskAction
>     void doStuff(IncrementalTaskContext context) {
>         def configFileChanges = context.getChangesFor('configFile')
>         def classpathChanges = context.getChangesFor('classpath')
>         // need to get all changes before doing anything with them
>         ...
>     }
>
> - A variant of the previous is to split the task action into 2 methods:
> one which gathers the changes and makes a decision about what type of
> execution is required, and one which does the work (or two: one for
> incremental, one for rebuild; or one for cleanup and one to build).
>
>     void beforeAction(IncrementalTaskContext context) {
>         // decide what to do based on the changes and let Gradle know your decision
>     }
>
>     @TaskAction
>     void doStuff() {
>         // use whatever state the above methods have kept
>     }
>
> - You use some kind of observable type for those properties you want to
> know the history for:
>
>     @InputFile
>     ObservableInputFile configFile
>
>     @InputFiles
>     ObservableInputFiles classpath
>
>     @TaskAction
>     void doStuff() {
>         // can ask configFile if it has changed since last time, etc.
>     }
>
> The DSL layer would take care of getting the values in and out of these
> types.
>
>
> On 28/03/2013, at 2:43 AM, Daz DeBoer <darrell.deb...@gradleware.com>
> wrote:
>
> G'day
>
> Now in master is a pretty cool new feature: you can now implement an
> 'incremental' task that is informed about exactly which input files
> have changed when the task is out of date.
> This is very useful for something like a C++ compile task, as it means
> that only the changed files need to be recompiled, rather than the
> entire set of inputs.
>
> I've got a 'draft' DSL functioning, and would appreciate any feedback
> you guys have.
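For the "query some context" option, a toy, map-backed model shows the shape a task action would program against. IncrementalTaskContext and the change-set type are proposal names from this thread, not real Gradle API, and the implementation here is invented purely for illustration:

```java
import java.io.File;
import java.util.List;
import java.util.Map;

// Toy model of the "query some context" option; none of these types exist in Gradle.
public class ContextSketch {
    // Per-property change set: which files were added, modified, or removed.
    record FileChanges(List<File> added, List<File> modified, List<File> removed) {
        boolean isEmpty() {
            return added.isEmpty() && modified.isEmpty() && removed.isEmpty();
        }
    }

    // Map-backed context: change sets keyed by input-property name.
    static class IncrementalTaskContext {
        private final Map<String, FileChanges> byProperty;
        IncrementalTaskContext(Map<String, FileChanges> byProperty) {
            this.byProperty = byProperty;
        }
        FileChanges getChangesFor(String property) {
            return byProperty.getOrDefault(property,
                    new FileChanges(List.of(), List.of(), List.of()));
        }
    }

    public static void main(String[] args) {
        IncrementalTaskContext ctx = new IncrementalTaskContext(Map.of(
                "configFile",
                new FileChanges(List.of(), List.of(new File("checkstyle.xml")), List.of())));
        // A task action would branch per property: a config-file change forces
        // a full re-run, while source changes alone could be handled one by one.
        boolean fullRebuild = !ctx.getChangesFor("configFile").isEmpty();
        System.out.println(fullRebuild); // prints "true"
    }
}
```

One advantage of this shape over the injected-parameter option is that the set of queryable properties need not be fixed at compile time, which matters for dynamic input collections like the ResourceSet case earlier in the thread.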
Here's a sample:
>
>     class IncrementalSync extends DefaultTask {
>         @InputFiles
>         def FileCollection src
>
>         @OutputDirectory
>         def File destination
>
>         @TaskAction
>         void execute(TaskInputChanges inputs) {
>             if (inputs.allOutOfDate) {
>                 FileUtils.forceDelete(destination)
>             }
>
>             inputs.outOfDate({ change ->
>                 FileUtils.copyFile(change.file, targetFile(change.file))
>             } as Action)
>             .removed({ change ->
>                 FileUtils.forceDelete(targetFile(change.file))
>             } as Action)
>             .process()
>         }
>
>         def targetFile(def inputFile) {
>             new File(destination, inputFile.name)
>         }
>     }
>
> Notes:
> 1. The way to implement an incremental task is to add a
> TaskInputChanges parameter to your @TaskAction method. This must be a
> typed parameter, and currently TaskInputChanges is the only parameter
> type we support (but there are plans to add more, like
> TaskOutputChanges). The reason for using a typed parameter is that
> this is the way the task tells us what it wants: I thought about an
> annotated parameter, but it seems kind of pointless when the
> annotation would imply the type anyway. (Perhaps we can add an
> annotation-based marker at a later stage, if it helps.)
>
> 2. There are 2 discrete ways we report incremental changes:
> - If the _only_ change to the task execution state is changed input
> files, then TaskInputChanges.allOutOfDate() will be false, and only
> the added/changed/removed files will be notified to the
> TaskInputChanges.outOfDate() and .removed() actions.
> - In the case of non-file changes to task inputs (properties, task
> class) and changes to task output files, Gradle will consider all
> input files to be out of date. In this case,
> TaskInputChanges.allOutOfDate() will be true, and every input file
> will be reported to the TaskInputChanges.outOfDate() action.
>
> 4.
> The reason for the chained action methods combined with a final
> process() method is that this allows us to stream changed inputs in
> any order, and does not require us to persist these changes for a
> subsequent method call. This is a little awkward, but doesn't force us
> to jump through hoops. We could implement a more discrete API on top,
> but it may be less efficient.
>
> 5. I haven't yet got any DSL magic applied to the TaskInputChanges
> instance, so using a closure directly isn't (yet) possible. Not sure
> how important that is for this DSL, or how tricky it will be to add.
>
> You can read more about the plans here:
> https://github.com/gradle/gradle/blob/master/design-docs/incremental-build.md
>
> Next steps for incremental tasks include providing access to changed
> outputs and properties (in case a task can handle these more
> efficiently), automatically cleaning up stale outputs, and fixing some
> bugs around the incremental nature of Copy tasks (and others).
>
> --
> Darrell (Daz) DeBoer
> Principal Engineer, Gradleware
> http://www.gradleware.com
> Join us at the Gradle Summit 2013, June 13th and 14th in Santa Clara,
> CA: http://www.gradlesummit.com
>
>
> --
> Adam Murdoch
> Gradle Co-founder
> http://www.gradle.org
> VP of Engineering, Gradleware Inc. - Gradle Training, Support, Consulting
> http://www.gradleware.com
>
> Join us at the Gradle Summit 2013, June 13th and 14th in Santa Clara, CA:
> http://www.gradlesummit.com
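[Editor's note: the chained outOfDate()/removed()/process() shape from Daz's sample can be modelled in a few lines. This is a toy, list-backed sketch in Java, not the real TaskInputChanges implementation, which streams changes rather than buffering them:]

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Toy model of the chained outOfDate()/removed()/process() API shape.
public class StreamingChanges {
    final List<File> outOfDateFiles = new ArrayList<>();
    final List<File> removedFiles = new ArrayList<>();
    private Consumer<File> outOfDateAction = f -> {};
    private Consumer<File> removedAction = f -> {};

    StreamingChanges outOfDate(Consumer<File> action) {
        this.outOfDateAction = action;
        return this; // chaining, as in the sample
    }

    StreamingChanges removed(Consumer<File> action) {
        this.removedAction = action;
        return this;
    }

    // process() fires the registered actions. Registering every action before
    // any change is delivered is what lets a real implementation stream the
    // changes in any order, without persisting them between calls.
    void process() {
        outOfDateFiles.forEach(outOfDateAction);
        removedFiles.forEach(removedAction);
    }

    public static void main(String[] args) {
        StreamingChanges changes = new StreamingChanges();
        changes.outOfDateFiles.add(new File("a.cpp"));
        changes.removedFiles.add(new File("b.cpp"));

        List<String> log = new ArrayList<>();
        changes.outOfDate(f -> log.add("recompile " + f.getName()))
               .removed(f -> log.add("delete output of " + f.getName()))
               .process();
        System.out.println(log); // prints "[recompile a.cpp, delete output of b.cpp]"
    }
}
```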