On Thu, Dec 11, 2014 at 2:19 PM, Kepner, Jeremy - 0553 - MITLL <[email protected]> wrote:
> So I think the bigger issue is to revisit the imperative to delete
> deprecated functions (both public & private). How many instances have we
> had where deleting a deprecated API (public & private) did anything more
> than "clean up" the code?

Personally, I don't think restrictions on private APIs should be on the table. We need the flexibility to reduce our support burden and to make improvements to the overall system that upend our current internal structures. Often, that means redoing how pieces work together. If we have to rely on package and private scopes to delineate our support surface, it will drive us to make poorer code-organization choices.

That doesn't mean we should keep the public API where it is now. We can expand it as needed to give downstream users a reasonable level of assurance. For example, it would be really great if we could get to rolling upgrades as a tested thing, at least within minor versions and preferably across adjacent major versions. I'm pretty sure that requires expanding the public API to include RPC.

One thing that might make this easier for folks downstream, even without waiting for a new client API, is to introduce annotations for our expected audience and stability levels. This would allow us to mark the things that are public under the current README, and also let us flag things that aren't there but are expected integration points (like custom balancers or iterators). Hadoop and HBase both do this [1], and HBase even generates custom javadocs for just those things that are flagged public [2].

[1]: http://hbase.apache.org/book/developing.html#d4209e21495
     http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/classification/package-summary.html
[2]: http://hbase.apache.org/apidocs/index.html

--
Sean
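[Editor's note: the annotation approach described above can be sketched in a few lines. The annotation and class names below are hypothetical, modeled loosely on Hadoop's `org.apache.hadoop.classification.InterfaceAudience` / `InterfaceStability` pattern; they are not actual project API.]

```java
import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Hypothetical audience/stability markers, in the style of Hadoop's
// InterfaceAudience and InterfaceStability annotations.
public class AudienceAnnotations {

    /** Part of the supported public API; covered by compatibility promises. */
    @Documented
    @Retention(RetentionPolicy.RUNTIME)
    public @interface Public {}

    /** Internal machinery; free to change or disappear between any releases. */
    @Documented
    @Retention(RetentionPolicy.RUNTIME)
    public @interface Private {}

    /** Public, but the interface may still change between minor versions. */
    @Documented
    @Retention(RetentionPolicy.RUNTIME)
    public @interface Evolving {}

    // A stable client entry point: marked public.
    @Public
    public static class ClientFactory {}

    // An expected integration point (e.g. a custom balancer) that is
    // public but not yet frozen.
    @Public @Evolving
    public static abstract class CustomBalancer {}

    // Internal RPC plumbing: explicitly private, even though the Java
    // visibility modifier alone could not express that support boundary.
    @Private
    static class RpcDispatcher {}

    public static void main(String[] args) {
        // Tooling (a javadoc doclet, an audit script) can filter classes on
        // these annotations to generate public-only docs or compatibility
        // reports, as HBase does.
        System.out.println(ClientFactory.class.isAnnotationPresent(Public.class));  // true
        System.out.println(RpcDispatcher.class.isAnnotationPresent(Public.class));  // false
    }
}
```

Because the annotations have runtime retention, a doc generator or API-compatibility checker can discover them reflectively rather than by parsing source.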
