Yes - only new or internal APIs. I doubt we'd break any exposed APIs for
the purpose of cleanup.

Patrick
On Mar 5, 2015 12:16 AM, "Mridul Muralidharan" <mri...@gmail.com> wrote:

> While I don't have any strong opinions about how we handle enums
> either way in Spark, I assume the discussion is targeted at (new) APIs
> being designed in Spark.
> Reworking what we already have exposed would lead to an incompatible
> API change (StorageLevel, for example, is in 1.0).
>
> Regards,
> Mridul
>
> On Wed, Mar 4, 2015 at 11:45 PM, Aaron Davidson <ilike...@gmail.com>
> wrote:
> > That's kinda annoying, but it's just a little extra boilerplate. Can you
> > call it as StorageLevel.DiskOnly() from Java? Would it also work if they
> > were case classes with empty constructors, without the field?
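> >
> > (A rough sketch of that case-class variant, untested, reusing the
> > names from this thread:)
> >
> > sealed abstract class StorageLevel
> > object StorageLevel {
> >   // Empty-constructor case classes: every `new` creates a fresh
> >   // instance, so values compare by case-class equality rather than
> >   // reference identity, and the nested classes surface to Java under
> >   // flattened names like StorageLevel$DiskOnly.
> >   case class MemoryOnly() extends StorageLevel
> >   case class DiskOnly() extends StorageLevel
> > }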
> >
> > On Wed, Mar 4, 2015 at 11:35 PM, Xiangrui Meng <men...@gmail.com> wrote:
> >
> >> `case object` inside an `object` doesn't show up in Java. This is the
> >> minimal code I found to make everything show up correctly in both
> >> Scala and Java:
> >>
> >> sealed abstract class StorageLevel // cannot be a trait
> >>
> >> object StorageLevel {
> >>   private[this] case object _MemoryOnly extends StorageLevel
> >>   final val MemoryOnly: StorageLevel = _MemoryOnly
> >>
> >>   private[this] case object _DiskOnly extends StorageLevel
> >>   final val DiskOnly: StorageLevel = _DiskOnly
> >> }
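> >>
> >> A quick usage sketch (untested; `describe` is just a hypothetical
> >> helper):
> >>
> >> // Scala: the vals are stable identifiers, so they work as patterns,
> >> // matched by equality. The compiler can't prove exhaustiveness from
> >> // them, hence the catch-all.
> >> def describe(level: StorageLevel): String = level match {
> >>   case StorageLevel.MemoryOnly => "memory only"
> >>   case StorageLevel.DiskOnly   => "disk only"
> >>   case _                       => "other"
> >> }
> >>
> >> // Java callers go through the generated static forwarders, which is
> >> // where the "()" comes from:
> >> //   StorageLevel level = StorageLevel.MemoryOnly();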
> >>
> >> On Wed, Mar 4, 2015 at 8:10 PM, Patrick Wendell <pwend...@gmail.com>
> >> wrote:
> >> > I like #4 as well and agree with Aaron's suggestion.
> >> >
> >> > - Patrick
> >> >
> >> > On Wed, Mar 4, 2015 at 6:07 PM, Aaron Davidson <ilike...@gmail.com> wrote:
> >> >> I'm cool with #4 as well, but make sure we dictate that the values
> >> >> should be defined within an object with the same name as the
> >> >> enumeration (like we do for StorageLevel). Otherwise we may pollute
> >> >> a higher namespace.
> >> >>
> >> >> e.g. we SHOULD do:
> >> >>
> >> >> trait StorageLevel
> >> >> object StorageLevel {
> >> >>   case object MemoryOnly extends StorageLevel
> >> >>   case object DiskOnly extends StorageLevel
> >> >> }
> >> >>
> >> >> On Wed, Mar 4, 2015 at 5:37 PM, Michael Armbrust <mich...@databricks.com> wrote:
> >> >>
> >> >>> #4 with a preference for CamelCaseEnums
> >> >>>
> >> >>> On Wed, Mar 4, 2015 at 5:29 PM, Joseph Bradley <jos...@databricks.com> wrote:
> >> >>>
> >> >>> > another vote for #4
> >> >>> > People are already used to adding "()" in Java.
> >> >>> >
> >> >>> >
> >> >>> > On Wed, Mar 4, 2015 at 5:14 PM, Stephen Boesch <java...@gmail.com> wrote:
> >> >>> >
> >> >>> > > #4 but with MemoryOnly (more Scala-like)
> >> >>> > >
> >> >>> > > http://docs.scala-lang.org/style/naming-conventions.html
> >> >>> > >
> >> >>> > > Constants, Values, Variable and Methods
> >> >>> > >
> >> >>> > > Constant names should be in upper camel case. That is, if the
> >> >>> > > member is final, immutable and it belongs to a package object
> >> >>> > > or an object, it may be considered a constant (similar to
> >> >>> > > Java's static final members):
> >> >>> > >
> >> >>> > >
> >> >>> > >    object Container {
> >> >>> > >      val MyConstant = ...
> >> >>> > >    }
> >> >>> > >
> >> >>> > >
> >> >>> > > 2015-03-04 17:11 GMT-08:00 Xiangrui Meng <men...@gmail.com>:
> >> >>> > >
> >> >>> > > > Hi all,
> >> >>> > > >
> >> >>> > > > There are many places where we use enum-like types in Spark,
> >> >>> > > > but in different ways. Every approach has both pros and cons.
> >> >>> > > > I wonder whether there should be an "official" approach for
> >> >>> > > > enum-like types in Spark.
> >> >>> > > >
> >> >>> > > > 1. Scala's Enumeration (e.g., SchedulingMode, WorkerState, etc.)
> >> >>> > > >
> >> >>> > > > * All types show up as Enumeration.Value in Java.
> >> >>> > > >
> >> >>> > > > http://spark.apache.org/docs/latest/api/java/org/apache/spark/scheduler/SchedulingMode.html
> >> >>> > > >
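> >> >>> > > > (For concreteness, a sketch of this approach; untested, and
> >> >>> > > > the value names are from memory:)
> >> >>> > > >
> >> >>> > > > object SchedulingMode extends Enumeration {
> >> >>> > > >   type SchedulingMode = Value
> >> >>> > > >   val FAIR, FIFO, NONE = Value
> >> >>> > > > }
> >> >>> > > > // From Java, FAIR, FIFO, and NONE are all typed as
> >> >>> > > > // Enumeration.Value, not as a SchedulingMode-specific type.
> >> >>> > > >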
> >> >>> > > > 2. Java's Enum (e.g., SaveMode, IOMode)
> >> >>> > > >
> >> >>> > > > * Implementation must be in a Java file.
> >> >>> > > > * Values don't show up in the ScalaDoc:
> >> >>> > > >
> >> >>> > > > http://spark.apache.org/docs/latest/api/scala/#org.apache.spark.network.util.IOMode
> >> >>> > > >
> >> >>> > > > 3. Static fields in Java (e.g., TripletFields)
> >> >>> > > >
> >> >>> > > > * Implementation must be in a Java file.
> >> >>> > > > * Doesn't need "()" in Java code.
> >> >>> > > > * Values don't show up in the ScalaDoc:
> >> >>> > > >
> >> >>> > > > http://spark.apache.org/docs/latest/api/scala/#org.apache.spark.graphx.TripletFields
> >> >>> > > >
> >> >>> > > > 4. Objects in Scala (e.g., StorageLevel)
> >> >>> > > >
> >> >>> > > > * Needs "()" in Java code.
> >> >>> > > > * Values show up in both ScalaDoc and JavaDoc:
> >> >>> > > >
> >> >>> > > > http://spark.apache.org/docs/latest/api/scala/#org.apache.spark.storage.StorageLevel$
> >> >>> > > >
> >> >>> > > > http://spark.apache.org/docs/latest/api/java/org/apache/spark/storage/StorageLevel.html
> >> >>> > > >
> >> >>> > > > It would be great if we had an "official" approach for this,
> >> >>> > > > as well as a naming convention for enum-like values
> >> >>> > > > ("MEMORY_ONLY" or "MemoryOnly"). Personally, I like 4) with
> >> >>> > > > "MEMORY_ONLY". Any thoughts?
> >> >>> > > >
> >> >>> > > > Best,
> >> >>> > > > Xiangrui
> >> >>> > > >
> >> >>> > > >