Hi Ted,
Thanks for the information. The post seems a little different from my
requirement: suppose we have defined different functions to do different
streaming work (e.g. 50 functions), and I want to test these 50 functions in
the spark shell, but the shell always throws OOM in the middle of the test
(yes,
Hi,
Background: The spark shell gets an out-of-memory error after handling lots
of Spark work.
Is there any way to reset the spark shell to its startup state?
I tried *:reset*, but it does not seem to work: I can no longer create a
Spark context (some compile error, as below) after the
See this recent thread:
http://search-hadoop.com/m/q3RTtFW7iMDkrj61/Spark+shell+oom+subj=java+lang+OutOfMemoryError+PermGen+space
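The thread above points at PermGen exhaustion, which fits the REPL use case: every function defined in the shell compiles to new classes that stay loaded. A minimal sketch of a workaround, assuming Spark 1.x on Java 7 (where the permanent generation still exists); the 512m figure is an illustrative value, not a recommendation:

```shell
# Start spark-shell with a larger PermGen, so that the classes generated
# for each REPL-defined function do not exhaust the permanent generation.
# -XX:MaxPermSize is a HotSpot option; it is ignored on Java 8+, where
# PermGen was replaced by Metaspace.
spark-shell --driver-java-options "-XX:MaxPermSize=512m"
```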
On Jul 16, 2015, at 8:51 PM, Terry Hole hujie.ea...@gmail.com wrote: