Can we convert scala.collection.ArrayBuffer[(Int,Double)] to org.spark.RDD[(Int,Double)]

2014-03-30 Thread yh18190
Hi,

Can we convert a Scala collection directly to a Spark RDD data type without
using the parallelize method?
Is there any way to create an RDD from a Scala type
using some kind of typecast?

Please advise.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Can-we-convert-scala-collection-ArrayBuffer-Int-Double-to-org-spark-RDD-Int-Double-tp3486.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: Can we convert scala.collection.ArrayBuffer[(Int,Double)] to org.spark.RDD[(Int,Double)]

2014-03-30 Thread Mayur Rustagi
The Scala object needs to be sent to the workers to be used as an RDD,
and parallelize is a way to do that. What are you looking to do?
You can also serialize the Scala object to HDFS/disk and load it from there.
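
For what it's worth, a minimal sketch of both options (the local master, app
name, and the /tmp/pairs path are placeholders for illustration, not fixed
names):

import scala.collection.mutable.ArrayBuffer
import org.apache.spark.{SparkConf, SparkContext}

// Assumed local setup for illustration only.
val conf = new SparkConf().setAppName("buffer-to-rdd").setMaster("local[*]")
val sc = new SparkContext(conf)

val buf = ArrayBuffer((1, 0.5), (2, 1.5), (3, 2.5))

// ArrayBuffer is a Seq, so parallelize accepts it directly;
// Spark partitions the data and ships the partitions to the workers.
val rdd: org.apache.spark.rdd.RDD[(Int, Double)] = sc.parallelize(buf)

// Alternative: write the data out once, then load it back as an RDD.
// "/tmp/pairs" is a hypothetical path.
rdd.saveAsObjectFile("/tmp/pairs")
val reloaded = sc.objectFile[(Int, Double)]("/tmp/pairs")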
Regards
Mayur

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi 


