> >>>
> >>> Is this even possible, or is the only way to use R as part of RStudio's
> >>> orchestration of our Spark cluster?
> >>>
> >>>
> >>>
> >>> Thanks for the help!
> >>>
> >>>
>
>>> I want to use R code as part of a Spark application (the same way I
>>> would with Scala or Python). I want to be able to run R code as a map
>>> function on a big Spark DataFrame loaded from a parquet file.
>>>
>>> Is this even possible, or is the only way to use R as part of RStudio's
>>> orchestration of our Spark cluster?
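
For reference, this does not require RStudio: SparkR ships with Spark itself, and Spark 2.0's `dapply` applies an R function to each partition of a Spark DataFrame, which is close to the map-over-parquet workflow asked about. A minimal sketch, assuming Spark 2.0+; the parquet path and column name are hypothetical:

```r
library(SparkR)

# Start a SparkR session (Spark >= 2.0)
sparkR.session()

# Load a Spark DataFrame from parquet; the path is a placeholder
df <- read.parquet("/data/events.parquet")

# Schema of the mapped output; the "value" column is an assumption
schema <- structType(structField("value", "double"))

# dapply runs the R function once per partition; each partition arrives
# as a local R data.frame and the function must return a data.frame
result <- dapply(df, function(part) {
  data.frame(value = part$value * 2)
}, schema)

head(result)
```

For applying an arbitrary R function to a list of local values across the cluster (rather than to a DataFrame), SparkR also provides `spark.lapply`.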
>
> It's generally recommended to use Python 3 if you're starting a new
> project and don't have old dependencies. But remember that there is still
> quite a lot of stuff that is not yet ported to Python 3.
>
> Regards,
> Saurabh
>
> On Wed, Jun 22, 2016 at 3:20 PM, John A
--
John Aherne
Big Data and SQL Developer
Cell: +1 (303) 809-9718
Email: john.ahe...@justenough.com
Skype: john.aherne.je
Web: www.justenough.com