Jey - just checking: are you listing what the project needs, or would you be able to participate in building these?

-viral



> On 23-May-2015, at 7:54 am, Jey Kottalam <j...@cs.berkeley.edu> wrote:
> 
> The core functionality is there, but there's also more to be
> implemented to have a complete interface to spark-core:
> 
> - distributed I/O
> - broadcast vars
> - accumulator vars
> - custom partitioners
> - persistence (caching)
> - missing transforms and actions
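
For readers less familiar with spark-core, here is a rough sketch of what those pieces typically look like from the driver side, written in Julia-flavored pseudocode - the names below are illustrative only (modeled loosely on PySpark), not Spock.jl's actual interface:

    # Hypothetical names, for illustration only - not Spock.jl's real API.
    sc = SparkContext("local[2]", "demo")

    lookup = broadcast(sc, Dict("a" => 1, "b" => 2))  # broadcast var: read-only copy shipped to each worker
    nseen  = accumulator(sc, 0)                       # accumulator var: workers add, only the driver reads

    rdd = parallelize(sc, ["a", "b", "c", "a"])
    rdd = persist(rdd)                                # persistence (caching) so reuse skips recomputation

    pairs = map(rdd, w -> (w, get(lookup.value, w, 0)))  # a transform using the broadcast value
    collect(pairs)                                       # an action pulling results back to the driver
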
> 
> -Jey
> 
> On Fri, May 22, 2015 at 12:46 AM, Jeff Waller <truth...@gmail.com> wrote:
>> 
>> 
>> On Thursday, May 21, 2015 at 4:55:16 PM UTC-4, Jey Kottalam wrote:
>>> 
>>> Hi Jeff,
>>> 
>>>> they relied on a 3rd party to containerize a Python program for
>>>> transmission
>>> 
>>> That is due more to the peculiarities of Python's serialization module
>>> than to anything intrinsic to creating a Spark binding. (E.g. Python's pickle
>>> format doesn't support serializing code and closures, so some
>>> extra code was required.) This isn't an issue in Julia since
>>> Base.serialize() already has the needed functionality. An initial
>>> implementation of a Spark binding done in the same style as PySpark is
>>> available at http://github.com/jey/Spock.jl
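
To make the contrast concrete, here is a minimal sketch of serializing a closure with the standard Julia serializer (written for present-day Julia, where these functions live in the Serialization stdlib; in 2015-era Julia they sat directly in Base):

    using Serialization

    offset = 10
    f = x -> x + offset      # a closure capturing `offset`

    buf = IOBuffer()
    serialize(buf, f)        # the closure's code and captured data go into the stream
    seekstart(buf)

    g = deserialize(buf)     # reconstruct the closure (the same mechanism lets another process do this)
    @assert g(32) == 42

This is what lets a Julia driver ship arbitrary closures to workers without the extra packaging step PySpark needs on top of pickle.
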
>>> 
>>> -Jey
>> 
>> 
>> Hey, awesome. Initial, you say - what's missing?
