[ 
https://issues.apache.org/jira/browse/SPARK-4923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14256319#comment-14256319
 ] 

Peng Cheng commented on SPARK-4923:
-----------------------------------

Hey Patrick,

The following APIs have been in Spark since 1.0.0, and IMHO they are stable 
enough for daily prototyping (creating case classes used to be defective, but 
that was fixed a long time ago):
SparkILoop.getAddedJars()
SparkIMain.bind
SparkIMain.quietBind
SparkIMain.interpret
End of list :)
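For illustration, here is a minimal sketch of how a downstream project might drive these hooks. This assumes the spark-repl artifact is on the classpath; exact constructor and method signatures varied across 1.x releases, so treat it as the shape of the integration, not a drop-in snippet:

```scala
import scala.tools.nsc.Settings
import org.apache.spark.repl.SparkIMain

// Sketch only: embeds the interpreter behind spark-shell in a host app.
// Signatures are hedged -- check them against the Spark version in use.
object EmbeddedRepl extends App {
  val settings = new Settings
  settings.usejavacp.value = true        // reuse the host JVM's classpath

  val intp = new SparkIMain(settings)    // the interpreter SparkILoop wraps

  intp.bind("answer", "Int", 42)         // expose a host value to REPL code
  intp.interpret("println(answer * 2)")  // evaluate a line, as the shell does

  intp.close()
}
```

This is exactly the kind of customized REPL fixture the comment below argues for: a project binds its own objects into the interpreter instead of re-implementing shell initialization.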

At first I assumed that further development on it had moved to Databricks 
Cloud. But the JIRA ticket has been there since September, so maybe community 
demand for this API really is low.
However, I would still suggest keeping it, and even promoting it to a 
Developer API. That would encourage more projects to integrate in a more 
flexible way, and save prototyping/QA cost by letting them customize REPL 
fixtures. People will still move to Databricks Cloud, which has far more 
features than that. Many influential projects already depend on the routinely 
published Scala REPL (e.g. Play Framework); it would be strange for Spark not 
to do the same.
What do you think? 

> Maven build should keep publishing spark-repl
> ---------------------------------------------
>
>                 Key: SPARK-4923
>                 URL: https://issues.apache.org/jira/browse/SPARK-4923
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.2.0
>            Reporter: Peng Cheng
>            Priority: Critical
>              Labels: shell
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> Spark-repl installation and deployment have been discontinued (see 
> SPARK-3452), but it is still in the dependency list of a few projects that 
> extend its initialization process.
> Please remove the 'skip' setting in spark-repl's build and make it an 
> 'official' API to encourage more platforms to integrate with it.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
