Dear Sujit,
Since you are experienced with Spark, I wonder whether it would be convenient for
you to comment on my dilemma
while using Spark for an application with an R background ...
Thank you very much!
Zhiliang
On Tuesday, September 22, 2015 1:45 AM, Zhiliang Zhu
An RDD is a set of data rows (in your case numbers); there is no meaning to
the order of the items.
What exactly are you trying to accomplish?
*Romi Kuntsman*, *Big Data Engineer*
http://www.totango.com
On Mon, Sep 21, 2015 at 2:29 PM, Zhiliang Zhu
wrote:
Hi Romi,
Thanks very much for your kind comments~~
In fact there is some valid background to the application; it is about R data
analysis: #fund_nav_daily is an M x N (or M x 1) matrix or data.frame, where each
column is a fund's daily return and each row is a daily date. #fund_return_daily
needs to compute the
Hi Zhiliang,
Would something like this work?
import org.apache.spark.mllib.rdd.RDDFunctions._  // brings sliding() into scope
val rdd2 = rdd1.sliding(2).map(v => v(1) - v(0))
-sujit
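As a side note for readers of the archive: the pairwise-difference logic that
sliding(2) expresses can be sketched in plain Java without Spark. The class and
method names below are illustrative only, not part of any Spark API; each output
element is the difference between two consecutive input values, just as in the
Scala snippet above.

```java
import java.util.Arrays;

public class SlidingDiff {
    // Mimics rdd1.sliding(2).map(v => v(1) - v(0)) on a local array:
    // output[i] = values[i + 1] - values[i].
    static double[] pairwiseDiff(double[] values) {
        double[] out = new double[values.length - 1];
        for (int i = 0; i < out.length; i++) {
            out[i] = values[i + 1] - values[i];
        }
        return out;
    }

    public static void main(String[] args) {
        // e.g. a small series of daily net values
        double[] nav = {100.0, 101.5, 100.5, 103.0};
        System.out.println(Arrays.toString(pairwiseDiff(nav)));
        // prints [1.5, -1.0, 2.5]
    }
}
```

On a real RDD the windows are formed across partition boundaries for you, which
is the part that is hard to hand-roll; this local version only shows the
per-window arithmetic.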
On Mon, Sep 21, 2015 at 7:58 AM, Zhiliang Zhu
wrote:
> Hi Romi,
>
> Thanks very much for your kind help comment~~
>
> In fact there is some valid backgroud of
Hi Romi,
I must express my sincere appreciation for your kind and helpful reply.
One more question: currently I am using Spark for financial data
analysis, so lots of operations on R data.frame/matrix and stat/regression are
always called. However, SparkR is currently not that strong; most
Hi Sujit,
I appreciate your kind help very much~
It seems to be OK; however, do you know the corresponding Spark Java API?
Is there a Java API like Scala's sliding? Also, it seems that I cannot
find the Spark Scala doc about sliding ...
Thank you very much~
Zhiliang
On
Hi Sujit,
Thanks very much for your kind help. I have found the sliding doc in both the Scala
and Java Spark APIs; it comes from mllib's RDDFunctions, though the doc never has
enough examples.
Best Regards,
Zhiliang
On Monday, September 21, 2015 11:48 PM, Sujit Pal
Hi Zhiliang,
Haven't used the Java API but found this Javadoc page, may be helpful to
you.
https://spark.apache.org/docs/1.3.1/api/java/org/apache/spark/mllib/rdd/RDDFunctions.html
I think the equivalent Java code snippet might go something like this:
RDDFunctions.fromRDD(rdd1,
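The snippet above is cut off in the archive. Assuming rdd1 is a JavaRDD&lt;Double&gt;,
a completed call might look roughly like the sketch below. This is a hypothetical
reconstruction based only on the RDDFunctions Javadoc linked above, not verified
against a running cluster; the ClassTag plumbing is the usual boilerplate Java
callers need when invoking this Scala-oriented API.

```java
// Hypothetical sketch only; class/method names from the mllib Javadoc.
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.mllib.rdd.RDDFunctions;
import org.apache.spark.rdd.RDD;
import scala.reflect.ClassTag$;

public class SlidingExample {
    // rdd1 is assumed to be a JavaRDD<Double> built elsewhere.
    static RDD<Object> slidingPairs(JavaRDD<Double> rdd1) {
        return RDDFunctions
            .fromRDD(rdd1.rdd(), ClassTag$.MODULE$.apply(Double.class))
            .sliding(2);  // each element becomes a window of 2 consecutive values
    }
}
```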