[ https://issues.apache.org/jira/browse/SPARK-22572?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-22572.
----------------------------------
    Resolution: Fixed
 Fix Version/s: 2.3.0

Issue resolved by pull request 19791
[https://github.com/apache/spark/pull/19791]

> spark-shell does not re-initialize on :replay
> ---------------------------------------------
>
>                 Key: SPARK-22572
>                 URL: https://issues.apache.org/jira/browse/SPARK-22572
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.3.0
>            Reporter: Mark Petruska
>            Priority: Minor
>             Fix For: 2.3.0
>
> Spark-shell does not run the re-initialization script when a `:replay` command is issued:
> {code}
> $ ./bin/spark-shell
> 17/11/21 12:01:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> Spark context Web UI available at http://192.168.1.3:4040
> Spark context available as 'sc' (master = local[*], app id = local-1511262066013).
> Spark session available as 'spark'.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 2.3.0-SNAPSHOT
>       /_/
>
> Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_74)
> Type in expressions to have them evaluated.
> Type :help for more information.
>
> scala> sc
> res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@77bb916f
>
> scala> :replay
> Replaying: sc
> <console>:12: error: not found: value sc
>        sc
>        ^
>
> scala> sc
> <console>:12: error: not found: value sc
>        sc
>        ^
>
> scala>
> {code}

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org