On 25 Apr 2020, at 17:12, Robert Oxspring <[email protected]> wrote:

>> I tend to think it does not need any hash, just a lastUpdated
>> timestamp (the most recent file invalidating the previous cache; that is
>> enough and faster than any hash computation)
> 
> I’m not totally sure of this. Filtering, for example, is particularly 
> dependent on properties and configuration that aren’t directly linked to a 
> source file change.

Exactly - this is the key limitation.

I think we’re doomed to recreating the file each time, because we have 
no way of knowing whether the file has changed until we generate it.

But there is nothing stopping us, when we detect that the newly generated 
content is identical, from skipping the write to the existing file. That 
avoids triggering unnecessary downstream changes.

Concrete example: a filter generates a file that is input to a code generator. 
Because the filtered file is rewritten every time, the code generator runs 
every time, which is not good. But if we detected that the content was the 
same and didn’t rewrite the filtered file, the code generator wouldn’t run, 
and we would safely avoid an unnecessary rebuild.
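
To make the idea concrete, here is a minimal "write only if changed" sketch in 
plain Java. It is just an illustration of the technique, not anything from 
Maven's filtering code; the class and method names are made up for the example.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Arrays;

    public final class WriteIfChanged {

        /**
         * Writes the freshly generated bytes to the target file only if they
         * differ from what is already on disk, so downstream steps keyed off
         * the file's timestamp are not triggered unnecessarily.
         *
         * @return true if the file was (re)written, false if left untouched
         */
        public static boolean writeIfChanged(Path target, byte[] newContent) throws IOException {
            if (Files.isRegularFile(target)) {
                byte[] existing = Files.readAllBytes(target);
                if (Arrays.equals(existing, newContent)) {
                    // Identical content: skip the write and keep the old timestamp.
                    return false;
                }
            }
            Path parent = target.getParent();
            if (parent != null) {
                Files.createDirectories(parent);
            }
            Files.write(target, newContent);
            return true;
        }
    }

The filtering step would still run every time (we can't know the output is 
unchanged without producing it), but downstream plugins comparing timestamps 
would see an untouched file and could skip their work.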

Regards,
Graham
