So we definitely need to be careful here. I know you didn't mention it, but it
was mentioned by others, so I would not recommend using LimitedPrivate. I had
started a discussion on Hadoop about some of this due to the way Spark needed
to use some of the APIs: https://issues.apache.org/jira/browse/HADOOP-10506

Overall it seems like a good idea, but we definitely need definitions for
these, and they must be clear to the end user looking at the code or docs.
I assume Developer really means to be used only within Spark? Developer is a
pretty broad term which could mean end-user developer or Spark-internal
developer, etc. Hadoop uses Private for this, and I think from an end user's
point of view PRIVATE makes it more obvious that they shouldn't be using it.
So perhaps something other than Developer (INTERNAL, PROJECT_PRIVATE, etc.).
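For reference, a rough sketch of how Hadoop marks a project-internal class
using its real annotations from org.apache.hadoop.classification (the class
name below is made up purely for illustration); the Private name is what makes
the intent obvious to end users:

    import org.apache.hadoop.classification.InterfaceAudience;
    import org.apache.hadoop.classification.InterfaceStability;

    // Audience and stability are declared separately in Hadoop.
    // "Private" tells end users this class is not meant for them.
    @InterfaceAudience.Private
    @InterfaceStability.Unstable
    public class ShuffleInternals {
      // project-internal implementation details only
    }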
Tom
 

    On Thursday, May 12, 2016 4:29 PM, Reynold Xin <r...@databricks.com> wrote:
 

We currently have three levels of interface annotation:
- unannotated: stable public API
- DeveloperApi: a lower-level, unstable API intended for developers
- Experimental: an experimental user-facing API
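
To make that concrete, a minimal sketch of how the existing annotations are
applied today (the class names are hypothetical; the annotations are the real
ones from org.apache.spark.annotation):

    import org.apache.spark.annotation.DeveloperApi;
    import org.apache.spark.annotation.Experimental;

    // Lower-level, unstable API aimed at developers building on Spark.
    @DeveloperApi
    public class HypotheticalMetricsSource { }

    // User-facing API that may change or be removed in a future release.
    @Experimental
    public class HypotheticalStreamingFeature { }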

After using these annotations for ~2 years, I would like to propose the
following changes:
1. Require explicit annotation for public APIs. This reduces the chance of us
accidentally exposing private APIs.
2. Separate interface annotation into two components: one that describes 
intended audience, and the other that describes stability, similar to what 
Hadoop does. This allows us to define "low level" APIs that are stable, e.g. 
the data source API (I'd argue this is the API that should be more stable than 
end-user-facing APIs).
InterfaceAudience: Public, Developer
InterfaceStability: Stable, Experimental
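
As a rough sketch only (names and details are illustrative, not a final
design), the two dimensions could be defined along the lines of Hadoop's
nested annotation types:

    import java.lang.annotation.Documented;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;

    // Illustrative: one set of annotations describes the intended audience...
    public class InterfaceAudience {
      @Documented @Retention(RetentionPolicy.RUNTIME)
      public @interface Public {}
      @Documented @Retention(RetentionPolicy.RUNTIME)
      public @interface Developer {}
    }

    // ...and an orthogonal set describes the stability guarantee.
    public class InterfaceStability {
      @Documented @Retention(RetentionPolicy.RUNTIME)
      public @interface Stable {}
      @Documented @Retention(RetentionPolicy.RUNTIME)
      public @interface Experimental {}
    }

A stable low-level API such as the data source API could then carry both
@InterfaceAudience.Developer and @InterfaceStability.Stable, a combination the
current single-annotation scheme cannot express.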

What do you think?

  
