Try Spark's RDD.pipe: you can invoke the R script through pipe, pushing the data you want processed to the Rscript's stdin and reading the results back from its stdout.
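To illustrate the mechanism: RDD.pipe streams each element of a partition to an external process's stdin, one per line, and turns the child's stdout lines back into RDD elements. The sketch below simulates that contract with Python's subprocess, using an inline Python one-liner as a stand-in for the external `Rscript my_script.R` command (the command and script name are hypothetical placeholders, not part of the original thread).

```python
import subprocess
import sys

def pipe_partition(lines, command):
    """Mimic Spark's RDD.pipe for one partition: feed each element to the
    child's stdin, one per line, and return the child's stdout lines as
    the new elements."""
    proc = subprocess.run(
        command,
        input="\n".join(lines) + "\n",
        capture_output=True,
        text=True,
        check=True,
    )
    return proc.stdout.splitlines()

# Stand-in for `Rscript my_script.R`: a child process that doubles each
# numeric input line. In a real Spark job you would pass the actual
# Rscript command, e.g. rdd.pipe("Rscript my_script.R").
command = [
    sys.executable, "-c",
    "import sys\n"
    "for line in sys.stdin:\n"
    "    print(int(line) * 2)",
]

result = pipe_partition(["1", "2", "3"], command)
print(result)  # -> ['2', '4', '6']
```

In Spark itself the equivalent one-liner is `rdd.pipe("Rscript my_script.R")`; the R script just reads lines from stdin and writes lines to stdout, so it needs no Spark dependency at all.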
On Wed, Jun 29, 2016 at 7:10 PM, Gilad Landau <gilad.lan...@clicktale.com> wrote:
> Hello,
>
> I want to use R code as part of a Spark application (the same way I would do
> with Scala/Python). I want to be able to run R syntax as a map function on a
> big Spark DataFrame loaded from a Parquet file.
>
> Is this even possible, or is the only way to use R as part of RStudio
> orchestration of our Spark cluster?
>
> Thanks for the help!
>
> Gilad