Using spark-shell, I was not allowed to define the map() without declaring `t` first:

scala> rdd = rdd.map(x => x*t)
<console>:26: error: not found: value t
       rdd = rdd.map(x => x*t)
                            ^
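For what it's worth, a minimal sketch in plain Scala (no Spark; a Seq stands in for the RDD) of why this fails: the function passed to map() is a closure that captures `t` from the enclosing scope, so `t` must already be defined when the closure is written:

```scala
// Minimal sketch (plain Scala, no Spark): the closure passed to map
// captures `t` from the enclosing scope, so `t` must be defined first.
object ClosureCapture {
  def main(args: Array[String]): Unit = {
    val t = 3                    // must exist before the closure references it
    var data = Seq(1, 2, 3)      // stand-in for the RDD's contents
    data = data.map(x => x * t)  // map returns a NEW collection; the old one is unchanged
    println(data)                // List(3, 6, 9)
  }
}
```

Note that reassigning `data` (or `rdd`) only rebinds the variable to the new, transformed collection; the original collection itself is never mutated, which is the immutability point made below.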

On Mon, May 9, 2016 at 4:19 AM, Daniel Haviv <
daniel.ha...@veracity-group.com> wrote:

> How come that for the first() function it calculates an updated value and
> for collect it doesn't ?
>
>
>
> On Sun, May 8, 2016 at 4:17 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> I don't think so.
>> RDD is immutable.
>>
>> > On May 8, 2016, at 2:14 AM, Sisyphuss <zhengwend...@gmail.com> wrote:
>> >
>> > <http://apache-spark-user-list.1001560.n3.nabble.com/file/n26898/09.png>
>> >
>> >
>> >
>> > --
>> > View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Is-it-a-bug-tp26898.html
>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> > For additional commands, e-mail: user-h...@spark.apache.org
>> >
>>
>>
>>
>