Hi Vasili,
It so happens that the entire SparkR code was merged into Apache Spark in a
single pull request, so you can see all the required changes at once in
https://github.com/apache/spark/pull/5096. It's 12,043 lines, and as I
understand it, it took more than 20 people about a year to write.
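One of the issues the quoted thread calls out for a new language binding is JVM communication. As a rough, hypothetical sketch only (this is not the actual SparkR or PySpark wire protocol, and all names here are made up), a guest-language frontend typically exchanges length-prefixed binary messages with a backend process running on the JVM side, something like:

```python
import socket
import struct
import threading

# Length-prefixed framing: each message is a 4-byte big-endian length
# followed by the payload. The echo server below is a stand-in for the
# JVM backend; the real SparkR/PySpark protocols are more involved.

def send_msg(sock, payload: bytes) -> None:
    # Prefix the payload with its length so the receiver knows how much to read.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_exactly(sock, n: int) -> bytes:
    # TCP recv() may return partial data, so loop until n bytes arrive.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return buf

def recv_msg(sock) -> bytes:
    (length,) = struct.unpack(">I", recv_exactly(sock, 4))
    return recv_exactly(sock, length)

def fake_jvm_backend(server_sock) -> None:
    # Stand-in for the JVM side: accept one connection and echo one message.
    conn, _ = server_sock.accept()
    with conn:
        send_msg(conn, recv_msg(conn))

if __name__ == "__main__":
    server = socket.socket()
    server.bind(("127.0.0.1", 0))  # pick any free port
    server.listen(1)
    port = server.getsockname()[1]
    t = threading.Thread(target=fake_jvm_backend, args=(server,))
    t.start()

    client = socket.create_connection(("127.0.0.1", port))
    send_msg(client, b"createRDD")  # hypothetical request name
    reply = recv_msg(client)
    client.close()
    t.join()
    print(reply.decode())  # prints: createRDD
```

On top of a framing layer like this, the binding still has to define a serialization format for arguments and results and a scheme for tracking remote JVM object references, which is where most of the real work in a port like SparkR goes.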

On Mon, Jun 29, 2015 at 10:33 AM, Vasili I. Galchin <vigalc...@gmail.com>
wrote:

> Shivaram,
>
>     Vis-à-vis Haskell support, I am reading DataFrame.R,
> SparkRBackend*, context.R, et al. Am I headed in the correct
> direction? Yes or no, please give more guidance. Thank you.
>
> Kind regards,
>
> Vasili
>
>
>
> On Tue, Jun 23, 2015 at 1:46 PM, Shivaram Venkataraman
> <shiva...@eecs.berkeley.edu> wrote:
> > Every language has its own quirks / features -- so I don't think there
> > exists a document on how to go about doing this for a new language. The
> > most closely related write-up I know of is the wiki page on PySpark
> > internals,
> > https://cwiki.apache.org/confluence/display/SPARK/PySpark+Internals,
> > written by Josh Rosen -- it covers some of the issues, such as closure
> > capture, serialization, and JVM communication, that you'll need to
> > handle for a new language.
> >
> > Thanks
> > Shivaram
> >
> > On Tue, Jun 23, 2015 at 1:35 PM, Vasili I. Galchin <vigalc...@gmail.com>
> > wrote:
> >>
> >> Hello,
> >>
> >>       I want to add support for another language (other than
> >> Scala, Java, et al.). Where is the documentation that explains how to
> >> provide support for a new language?
> >>
> >> Thank you,
> >>
> >> Vasili
> >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
>
