Re: [discuss] making SparkEnv private in Spark 2.0
We use it in executors to get to:

a) the Spark conf (for getting to the Hadoop config in a map doing custom writing of side-files)
b) the shuffle manager (to get a shuffle reader)

Not sure if there are alternative ways to get to these.

Regards,
Mridul

On Wed, Mar 16, 2016 at 2:52 PM, Reynold Xin wrote:
> Any objections? Please articulate your use case. SparkEnv is a weird one
> because it was documented as "private" but not marked as so in class
> visibility.
>
>  * NOTE: This is not intended for external use. This is exposed for Shark
>  * and may be made private in a future release.
>
> I do see Hive using it to get the config variable. That can probably be
> propagated through other means.
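A minimal sketch of use case (a), assuming Spark 1.x internals: recovering the SparkConf on the executor side via `SparkEnv.get` and building a Hadoop `Configuration` from it. `SparkEnv` is an internal API, so names and behavior may change across releases; the `spark.hadoop.*` prefix-copying shown here is one common convention, not the only way to propagate Hadoop settings.

```scala
// Sketch only: relies on internal Spark APIs (SparkEnv is not a stable
// public interface). Assumes Spark 1.x.
import org.apache.hadoop.conf.Configuration
import org.apache.spark.SparkEnv

rdd.mapPartitions { iter =>
  // Executor-side: recover the SparkConf from this JVM's SparkEnv
  val sparkConf = SparkEnv.get.conf

  // Build a Hadoop Configuration, copying over spark.hadoop.* entries
  val hadoopConf = new Configuration()
  sparkConf.getAll.foreach { case (k, v) =>
    if (k.startsWith("spark.hadoop.")) {
      hadoopConf.set(k.stripPrefix("spark.hadoop."), v)
    }
  }

  // ... use hadoopConf to write side-files from this partition ...
  iter
}
```

If `SparkEnv` were made private, executor code like this would need the conf propagated explicitly, e.g. by broadcasting it or closing over a serializable copy from the driver.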
Re: [discuss] making SparkEnv private in Spark 2.0
On Wed, Mar 16, 2016 at 3:29 PM, Mridul Muralidharan wrote:
> b) the shuffle manager (to get a shuffle reader)

What's the use case for the shuffle manager/reader? This seems like using super internal APIs in applications.
[discuss] making SparkEnv private in Spark 2.0
Any objections? Please articulate your use case. SparkEnv is a weird one because it was documented as "private" but not marked as so in class visibility:

 * NOTE: This is not intended for external use. This is exposed for Shark
 * and may be made private in a future release.

I do see Hive using it to get the config variable. That can probably be propagated through other means.
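For context, the access pattern at issue looks like the following sketch (Spark 1.x assumed; the config key is a hypothetical example, not one the thread mentions). `SparkEnv.get` returns a per-JVM singleton on both driver and executors, which is why downstream projects reached for it to read config values:

```scala
// Sketch: the pattern the NOTE above warns about. Internal API.
import org.apache.spark.SparkEnv

val env = SparkEnv.get   // per-JVM environment singleton
val conf = env.conf      // the SparkConf for this driver or executor JVM

// Hypothetical example key, with a fallback default
val codec = conf.get("spark.io.compression.codec", "lz4")
```

Making SparkEnv private would force such callers to receive the conf through a supported channel instead of reaching into the environment.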
Re: [discuss] making SparkEnv private in Spark 2.0
We have custom joins that leverage it. It is used to get to the direct shuffled iterator, without needing sort/aggregate/etc. IIRC the only way to get to it from a ShuffleHandle is via the shuffle manager.

Regards,
Mridul

On Wed, Mar 16, 2016 at 3:36 PM, Reynold Xin wrote:
> On Wed, Mar 16, 2016 at 3:29 PM, Mridul Muralidharan wrote:
>>
>> b) the shuffle manager (to get a shuffle reader)
>
> What's the use case for the shuffle manager/reader? This seems like using super
> internal APIs in applications.
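The internal path described above can be sketched roughly as follows, assuming the Spark 1.x `ShuffleManager.getReader` signature. This is a sketch of the technique, not a supported API: every name here is internal and may change or disappear in 2.0.

```scala
// Sketch only: internal Spark APIs, Spark 1.x signatures assumed.
import org.apache.spark.{SparkEnv, TaskContext}
import org.apache.spark.shuffle.ShuffleHandle

def rawShuffleIterator[K, V](
    handle: ShuffleHandle,
    startPartition: Int,
    endPartition: Int,
    context: TaskContext): Iterator[Product2[K, V]] = {
  // SparkEnv is the path from executor code to the ShuffleManager,
  // which in turn is the only way to turn a ShuffleHandle into a reader
  val reader = SparkEnv.get.shuffleManager
    .getReader[K, V](handle, startPartition, endPartition, context)
  // read() yields the shuffled records directly, skipping the
  // sort/aggregation a full ShuffledRDD compute would apply
  reader.read()
}
```

A custom join can then consume this iterator for the partitions it needs, which is why losing access to SparkEnv would break this approach without an alternative hook.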