My environment is on Windows and I see this as well using SparkR. It appears the R
session is somehow killed after a period of inactivity and can no longer be found
in Task Manager. I have to restart the interpreter in order to get
another R session.
> On Feb 17, 2017, at 9:11 PM, RUSHIKESH RAUT
Hello,
I have been trying to use the SparkR interpreter on Spark 1.6.1 and have run
into a problem. The SparkR interpreter works well in a standalone
configuration with master set to local[*] and SPARK_HOME undefined.
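For reference, the working standalone setup described above might look roughly like the following in Zeppelin's conf/zeppelin-env.sh. The variable names (MASTER, SPARK_HOME, HADOOP_CONF_DIR) are the standard Zeppelin/Spark ones; the paths are hypothetical, and this is a sketch of the two configurations being contrasted, not the poster's exact file:

```shell
# conf/zeppelin-env.sh -- standalone / local mode (sketch)
# Run Spark locally on all available cores; with SPARK_HOME left unset,
# Zeppelin falls back to its embedded Spark libraries.
export MASTER="local[*]"

# For the yarn-client case, the equivalent would be roughly:
# export MASTER="yarn-client"
# export SPARK_HOME="/path/to/spark-1.6.1"   # hypothetical install path
# export HADOOP_CONF_DIR="/etc/hadoop/conf"  # hypothetical Hadoop conf dir
```

After changing these variables, Zeppelin has to be restarted for the interpreter to pick them up.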
With master set to yarn-client (with Hadoop 2.6.4) and SPARK_HOME pointing
to the ins