+1 for Ruby, as Ruby already has a functional-ish collection API, so
mapping the RDD functional transforms onto it would be pretty idiomatic.
Would love to review this.
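As a rough illustration of that idiom match, here's a plain-Ruby sketch. No actual Spark or JRuby binding is assumed; the RDD calls in the comments are just the names of the corresponding Scala/Java operations, and everything below runs on ordinary Arrays.

```ruby
# Ruby's Enumerable API lined up against Spark's RDD transforms.
# Left side: plain Ruby on an Array. Right side (comments): the
# analogous RDD operation a hypothetical binding would delegate to.

data = [1, 2, 3, 4, 5]

doubled = data.map { |x| x * 2 }        # rdd.map(f)
evens   = data.select { |x| x.even? }   # rdd.filter(f)
pairs   = data.flat_map { |x| [x, x] }  # rdd.flatMap(f)
total   = data.reduce(:+)               # rdd.reduce(f)
```

A JRuby wrapper could keep exactly these method names and take blocks, which is what would make it feel native to Ruby users.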

On Tue, Oct 15, 2013 at 10:13 AM, Patrick Wendell <pwend...@gmail.com> wrote:
> I think Ruby integration via JRuby would be a great idea.
>
> On Tue, Oct 15, 2013 at 9:45 AM, Ryan Weald <r...@weald.com> wrote:
>> Writing a JRuby wrapper around the existing Java bindings would be pretty
>> cool. Could help to get some of the Ruby community to start using the Spark
>> platform.
>>
>> -Ryan
>>
>>
>> On Mon, Oct 14, 2013 at 12:07 PM, Aaron Babcock 
>> <aaron.babc...@gmail.com>wrote:
>>
>>> Hey Laksh,
>>>
>>> Not sure if you are interested in groovy at all, but I've got the
>>> beginning of a project here:
>>> https://github.com/bunions1/groovy-spark-example
>>>
>>> The idea is to map Groovy idioms like myRdd.collect{ row -> newRow } to
>>> Spark API calls like myRdd.map( row => newRow ), and to support a good REPL.
>>>
>>> It's not officially related to Spark at all and is very early stage, but
>>> maybe it will be a point of reference for you.
>>>
>>> On Mon, Oct 14, 2013 at 12:42 PM, Laksh Gupta <glaks...@gmail.com> wrote:
>>> > Hi
>>> >
>>> > I am interested in contributing to the project and want to start with
>>> > supporting a new programming language on Spark. I can see that Spark
>>> > already supports Java and Python. Would someone provide me with some
>>> > suggestions/references to start with? I think this would be a great
>>> > learning experience for me. Thank you in advance.
>>> >
>>> > --
>>> > - Laksh Gupta
>>>



-- 
Evan Chan
Staff Engineer
e...@ooyala.com
