Hi,

Background: the Spark shell eventually hits an out-of-memory error after running a lot of Spark work.
Is there a way to reset the Spark shell back to its startup state? I tried "*:reset*", but it does not seem to work: after "*:reset*" I can no longer create a SparkContext (I get the compile error below). My current workaround is to restart the shell after the OOM.

== Expanded type of tree ==
TypeRef(TypeSymbol(class $read extends Serializable))
uncaught exception during compilation: java.lang.AssertionError
java.lang.AssertionError: assertion failed: Tried to find '$line16' in 'C:\Users\jhu\AppData\Local\Temp\spark-2ad09490-c0c6-41e2-addb-63087ce0ae63' but it is not a directory
That entry seems to have slain the compiler. Shall I replay your session? I can re-run each line except the last one. [y/n]
Abandoning crashed session.

Thanks!
-Terry
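P.S. To make the question concrete: what I am really after is a way to reclaim the shell's memory without losing the session. A minimal sketch of the kind of cleanup I mean, using Spark's cache APIs (`sc` is the shell's SparkContext; whether this alone is enough to avoid the OOM is exactly my question):

```scala
// Sketch: drop all cached RDDs without restarting the shell.
// getPersistentRDDs returns the currently persisted RDDs keyed by id;
// unpersist(blocking = true) waits until the cached blocks are removed.
sc.getPersistentRDDs.values.foreach(_.unpersist(blocking = true))
```

This frees the RDD cache, but it does not touch whatever state the REPL itself accumulates, which is why I was hoping "*:reset*" would work.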