Hi Arunkumar,
SparkR has much more limited functionality than R, and some R data types, such
as 'data.table', are not available in SparkR. So you need to check the
compatibility of your R code with SparkR carefully.
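For example, here is a minimal sketch (assuming Spark 2.0+; the column names
and the aggregation are just illustrative) of moving a local data.table-style
aggregation onto a SparkR DataFrame instead:

  library(SparkR)
  sparkR.session()

  # data.table stays on the driver; distribute a plain data.frame instead
  local_df <- data.frame(grp = c("a", "a", "b"), value = c(1, 2, 3))
  sdf <- createDataFrame(local_df)

  # SparkR equivalent of data.table's DT[, .(total = sum(value)), by = grp]
  result <- agg(groupBy(sdf, "grp"), total = sum(sdf$value))
  head(result)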
Regards,
Saurabh
-----Original Message-----
From: mylistt...@gmail.com [mailto:mylistt...@gmail.com]
Sent: Tuesday, May 31, 2016 6:35 PM
To: Arunkumar Pillai <arunkumar1...@gmail.com>
Cc: user <user@spark.apache.org>
Subject: Re: Running R codes in sparkR
Hi Arunkumar,
Yes, R can be integrated with Spark; that is what SparkR is. There are a couple
of blogs on the net, and the Spark documentation page covers it too:
https://spark.apache.org/docs/latest/sparkr.html
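For instance, a minimal sketch of getting started (this assumes Spark 2.0+; on
the 1.x line the entry point was sparkR.init() rather than sparkR.session()):

  library(SparkR)
  sparkR.session()                    # start the SparkR entry point

  # distribute a built-in R data.frame and query it through Spark
  df <- createDataFrame(faithful)
  head(filter(df, df$waiting < 50))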
Just remember that not every R package you may have worked with in plain R is
supported in SparkR, although a good set of R functionality is supported there.
As I have understood it, you cannot run base-R functions such as sapply on
distributed data, for example; the constraint is that such functions need to be
ported/coded for RDDs. The R community, as I understand from watching YouTube
videos, is not very deeply involved with the Spark community.
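That said, for simple cases SparkR does ship a distributed counterpart; a
minimal sketch (assuming Spark 2.0+, which introduced spark.lapply):

  library(SparkR)
  sparkR.session()

  # base R would run this only on the driver: sapply(1:4, function(x) x^2)
  # spark.lapply distributes the function over the elements instead
  squares <- spark.lapply(1:4, function(x) x^2)
  unlist(squares)   # 1 4 9 16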
On May 31, 2016, at 18:16, Arunkumar Pillai <arunkumar1...@gmail.com> wrote:
> Hi
>
> I have a basic doubt regarding SparkR.
>
> 1. Can we run R code in Spark using SparkR, or are only certain Spark
> functionalities executable through R?
>
>
>
> --
> Thanks and Regards
> Arun
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org