Hi Arunkumar,

Yes, R can be integrated with Spark to give you SparkR. There are a couple of
blog posts about it on the net, and the Spark documentation page covers it too:

https://spark.apache.org/docs/latest/sparkr.html
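
For reference, here is a minimal sketch of what getting started looks like
(assuming Spark 2.0 or later; older releases use sparkR.init() instead of
sparkR.session()):

library(SparkR)

# Start a Spark session from R.
sparkR.session(appName = "sparkr-example")

# Turn a local R data.frame into a distributed SparkDataFrame.
df <- createDataFrame(faithful)

# Query it with SparkR's DataFrame verbs.
head(select(df, "eruptions"))
head(filter(df, df$waiting < 50))

sparkR.session.stop()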



Just remember that not all of the R packages you may have worked with in plain
R are supported in SparkR. That said, a good set of R functionality is
available in SparkR.
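
For example, SparkR ships its own filter(), select(), groupBy(), and
summarize() that operate on SparkDataFrames and mask the similarly named
base-R functions when the package is attached. A small sketch, using column
names from the built-in mtcars dataset:

library(SparkR)
sparkR.session()

df <- createDataFrame(mtcars)

# The aggregation runs on the cluster, not in the local R process.
avg_mpg <- summarize(groupBy(df, df$cyl), mean_mpg = mean(df$mpg))
head(avg_mpg)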

As I understand it, you cannot run base-R functions such as sapply() on
distributed data, for example. The constraint is that such functions would
need to be ported/coded to work on RDDs. Also, from YouTube talks I have
watched, my impression is that the R community is not very deeply involved
with the Spark community.
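
What SparkR offers instead are its own distributed apply functions. A hedged
sketch, assuming Spark 2.0+ where spark.lapply() and dapply() were introduced:

library(SparkR)
sparkR.session()

# spark.lapply(): the distributed analogue of lapply()/sapply();
# each element is processed on an executor, results come back as a list.
squares <- spark.lapply(1:10, function(x) x * x)

# dapply(): apply an R function to each partition of a SparkDataFrame.
# The output schema has to be declared up front.
df <- createDataFrame(faithful)
schema <- structType(structField("eruptions", "double"),
                     structField("waiting", "double"),
                     structField("waiting_hours", "double"))
out <- dapply(df, function(p) cbind(p, p$waiting / 60), schema)
head(out)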





On May 31, 2016, at 18:16, Arunkumar Pillai <arunkumar1...@gmail.com> wrote:

> Hi
> 
> I have a basic doubt regarding SparkR.
> 
> 1. Can we run R code in Spark using SparkR, or is it only certain Spark
> functionality that is executed in Spark through R?
> 
> 
> 
> -- 
> Thanks and Regards
>        Arun

