You really should show your Spark code then. I think you are
misunderstanding one of the Spark APIs and, at some point, are processing
a collection containing one ArrayBuffer rather than the ArrayBuffer itself.
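
For example (a minimal sketch -- your actual code wasn't shared, so the
variable names and the use of parallelize/collect here are only assumptions
about what it might look like):

import scala.collection.mutable.ArrayBuffer
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(
  new SparkConf().setAppName("example").setMaster("local[*]"))

// An RDD whose single element is the whole ArrayBuffer
val rdd = sc.parallelize(Seq(ArrayBuffer(5, 3, 1, 4)))

// collect() returns Array[ArrayBuffer[Int]]: an array of ONE ArrayBuffer
val collected = rdd.collect()
collected.length        // 1 -- counts ArrayBuffers, not their contents
collected.head          // ArrayBuffer(5, 3, 1, 4) -- the whole buffer

// To count or index the elements, reach inside the buffer first
collected.head.length   // 4
collected.head.head     // 5
collected.head.last     // 4

sc.stop()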

On Wed, Sep 3, 2014 at 6:42 AM, Deep Pradhan <pradhandeep1...@gmail.com> wrote:
> I have a problem here.
> When I run the commands that Rajesh suggested in the Scala REPL, they work
> fine. But I want to do this in Spark code, where I need to find the number
> of elements in an ArrayBuffer, and there these things are not working.
> How should I do that?
>
>
> On Wed, Sep 3, 2014 at 10:25 AM, Madabhattula Rajesh Kumar
> <mrajaf...@gmail.com> wrote:
>>
>> Hi Deep,
>>
>> Please find below results of ArrayBuffer in scala REPL
>>
>> scala> import scala.collection.mutable.ArrayBuffer
>> import scala.collection.mutable.ArrayBuffer
>>
>> scala> val a = ArrayBuffer(5,3,1,4)
>> a: scala.collection.mutable.ArrayBuffer[Int] = ArrayBuffer(5, 3, 1, 4)
>>
>> scala> a.head
>> res2: Int = 5
>>
>> scala> a.tail
>> res3: scala.collection.mutable.ArrayBuffer[Int] = ArrayBuffer(3, 1, 4)
>>
>> scala> a.length
>> res4: Int = 4
>>
>> Regards,
>> Rajesh
>>
>>
>> On Wed, Sep 3, 2014 at 10:13 AM, Deep Pradhan <pradhandeep1...@gmail.com>
>> wrote:
>>>
>>> Hi,
>>> I have the following ArrayBuffer:
>>>
>>> ArrayBuffer(5,3,1,4)
>>>
>>> Now, I want to get the number of elements in this ArrayBuffer and also
>>> the first element of the ArrayBuffer. I used .length and .size but they are
>>> returning 1 instead of 4.
>>> I also used .head and .last for getting the first and the last element,
>>> but they also return the entire ArrayBuffer (ArrayBuffer(5,3,1,4)).
>>> What I understand from this is that the entire ArrayBuffer is stored as
>>> one element.
>>>
>>> How should I go about doing the required things?
>>>
>>> Thank You
>>>
>>
>
