unpersist is a method on RDDs, the distributed-collection abstraction that
Spark introduces.

An Int is just a plain Scala Int. You can't call unpersist on an Int in
Scala, and that doesn't change in Spark. If temp is an Int, then whatever
<some operations> does, it presumably ends in an action (something like
reduce or count) that returns a plain value rather than an RDD, so there is
nothing there for Spark to cache or uncache.
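
Here's a minimal sketch of the distinction. Your <some operations> isn't
shown, so this assumes the loop body ends in an action like reduce, which
is exactly what would make temp an Int:

    import org.apache.spark.{SparkConf, SparkContext}

    object UnpersistSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("unpersist-sketch").setMaster("local[*]")
        val sc = new SparkContext(conf)

        // rdd is an RDD[Int]: cache() and unpersist() are defined here.
        val rdd = sc.parallelize(1 to 100).cache()

        // reduce is an action: it returns a plain Scala Int, not an RDD.
        val temp = rdd.reduce(_ + _)

        rdd.unpersist()    // fine: drops the RDD's cached blocks
        // temp.unpersist() // won't compile: value unpersist is not a member of Int

        sc.stop()
      }
    }

If you want something to unpersist at the end of each loop iteration, keep
a reference to the RDD itself and call unpersist on that, not on the result
of an action.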

On Fri, Sep 12, 2014 at 12:33 PM, Deep Pradhan <pradhandeep1...@gmail.com>
wrote:

> There is one thing that I am confused about.
> Spark's code is implemented in Scala. Now, can we run any Scala code on
> the Spark framework? What would be the difference between executing
> Scala code on a normal system and on Spark?
> The reason for my question is the following:
> I had a variable
> *val temp = <some operations>*
> This temp was being created inside a loop. To manually throw it out of
> the cache every time the loop ends, I was calling *temp.unpersist()*,
> but this returned an error saying that *value unpersist is not a member
> of Int*, which means that temp is an Int.
> Can someone explain to me why I was not able to call *unpersist* on
> *temp*?
>
> Thank You
>
