[ https://issues.apache.org/jira/browse/SPARK-4923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14274239#comment-14274239 ]
Patrick Wendell commented on SPARK-4923:
----------------------------------------

Hey All,

Sorry this has caused a disruption. As I said in the earlier comment, if anyone on these projects can submit a patch that locks down the visibility in that package and opens up the things that are specifically needed, I'm fine to keep publishing it (and will do so retroactively for 1.2). We just need to look closely at what we are exposing, because this package currently violates Spark's API policy.

Because the Scala repl does not itself offer any kind of API stability, it will be hard for Spark to do the same. But I think it's fine to just annotate and expose unstable APIs here, provided projects understand the implications of depending on them.

Chi - since you guys are probably the heaviest user, would you be willing to take a crack at this? Basically, start by making everything private and then go and unlock the things that you need as Developer APIs.

- Patrick

> Maven build should keep publishing spark-repl
> ---------------------------------------------
>
>                 Key: SPARK-4923
>                 URL: https://issues.apache.org/jira/browse/SPARK-4923
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, Spark Shell
>    Affects Versions: 1.2.0
>            Reporter: Peng Cheng
>            Priority: Critical
>              Labels: shell
>         Attachments: SPARK-4923__Maven_build_should_keep_publishing_spark-repl.patch
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> Spark-repl installation and deployment have been discontinued (see SPARK-3452), but it is in the dependency list of a few projects that extend its initialization process.
> Please remove the 'skip' setting in spark-repl and make it an 'official' API to encourage more platforms to integrate with it.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
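[Editor's sketch] The lock-down approach Patrick describes - make everything in the repl package package-private, then selectively re-expose what integrators need as unstable Developer APIs - can be sketched in Scala as follows. The class and method names here are hypothetical, and a stand-in annotation is defined so the sketch compiles on its own (Spark's real annotation is `org.apache.spark.annotation.DeveloperApi`):

```scala
package org.apache.spark.repl {

  import scala.annotation.StaticAnnotation

  // Stand-in for Spark's @DeveloperApi annotation, defined here only so
  // this sketch is self-contained.
  class DeveloperApi extends StaticAnnotation

  // Step 1: default everything to package-private (visible only inside
  // org.apache.spark), so downstream projects cannot depend on it.
  private[spark] class ReplInternals {
    def setupClassServer(): String = "internal"
  }

  // Step 2: selectively unlock what integrators actually need, marking
  // it as an unstable Developer API.
  @DeveloperApi
  class ReplInitHook {
    def initializeSpark(): String = "spark initialized"
  }
}

// Downstream code (outside org.apache.spark) can see only the
// annotated entry point, not ReplInternals.
object Demo extends App {
  println(new org.apache.spark.repl.ReplInitHook().initializeSpark())
}
```

This keeps the published artifact intact while making the stability contract explicit: anything not annotated is an implementation detail.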
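[Editor's sketch] The 'skip' setting the description asks to remove is the deploy-skip flag that SPARK-3452 introduced to stop publishing the spark-repl artifact. In a Maven build that is typically expressed via the deploy plugin's `skip` parameter; the exact plugin configuration in Spark's repl/pom.xml may differ, so treat this as an illustration:

```xml
<!-- Illustrative fragment of a repl/pom.xml build section: deleting a
     configuration like this (or setting skip to false) would resume
     publishing spark-repl to Maven repositories. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <configuration>
    <skip>true</skip>
  </configuration>
</plugin>
```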