[ https://issues.apache.org/jira/browse/SPARK-6335?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14361904#comment-14361904 ]
Sean Owen commented on SPARK-6335:
----------------------------------

I think the current behavior is right, or at least not obviously worse than the alternative. This is a command from the Scala shell, which doesn't treat the Spark environment variables specially. I suppose someone might just as readily expect, or want, it to clear {{sc}} too.

> REPL :reset command also removes refs to SparkContext and SQLContext
> --------------------------------------------------------------------
>
>                 Key: SPARK-6335
>                 URL: https://issues.apache.org/jira/browse/SPARK-6335
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Shell
>    Affects Versions: 1.3.0
>         Environment: Ubuntu 14.04 64-bit; spark-1.3.0-bin-hadoop2.4
>            Reporter: Marko Bonaci
>            Priority: Trivial
>
> I wasn't sure whether to mark this as a bug or an improvement, so I went for the more moderate option, since it is a rather trivial, rarely used feature. Here's the REPL printout:
> {code:java}
> 15/03/14 14:39:38 INFO SparkILoop: Created spark context..
> Spark context available as sc.
> 15/03/14 14:39:38 INFO SparkILoop: Created sql context (with Hive support)..
> SQL context available as sqlContext.
>
> scala> val x = 8
> x: Int = 8
>
> scala> :reset
> Resetting repl state.
> Forgetting this session history:
> val x = 8
> Forgetting all expression results and named terms: $intp, sc, sqlContext, x
>
> scala> sc.parallelize(1 to 8)
> <console>:8: error: not found: value sc
>        sc.parallelize(1 to 8)
>        ^
>
> scala> :quit
> Stopping spark context.
> <console>:8: error: not found: value sc
>        sc.stop()
>        ^
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
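[Editor's note] Since {{:reset}} only forgets the REPL bindings and does not stop the underlying context in this transcript, the bindings can be recreated by hand inside spark-shell. A minimal sketch, assuming the Spark 1.3-era API (pre-SparkSession) and Spark's jars on the classpath; the app name and {{local[*]}} master are illustrative, not from the issue:

```scala
// Sketch: recreating the bindings that :reset forgot (hypothetical names).
// Assumes spark-shell / Spark 1.x classpath; not runnable without the Spark jars.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setAppName("shell-restored").setMaster("local[*]")
val sc = new SparkContext(conf)       // replaces the forgotten sc binding
val sqlContext = new SQLContext(sc)   // replaces the forgotten sqlContext binding

sc.parallelize(1 to 8).count()        // usable again after :reset
```

Note this creates a fresh context rather than restoring the original one, so any cached RDDs from before {{:reset}} are lost either way.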