Here is the latest modification in the Spork repo:
Mon Dec 1 10:08:19 2014

Not sure if it is being actively maintained.

On Sat, Feb 28, 2015 at 6:26 PM, Qiang Cao <caoqiang...@gmail.com> wrote:

> Thanks for the pointer, Ashish! I was also looking at Spork
> https://github.com/sigmoidanalytics/spork (Pig-on-Spark), but wasn't sure
> if that's the right direction.
>
> On Sat, Feb 28, 2015 at 6:36 PM, Ashish Nigam <ashnigamt...@gmail.com>
> wrote:
>
>> You have to call spark-submit from Oozie.
>> I used this link to get the idea for my implementation:
>>
>>
>> http://mail-archives.apache.org/mod_mbox/oozie-user/201404.mbox/%3CCAHCsPn-0Grq1rSXrAZu35yy_i4T=fvovdox2ugpcuhkwmjp...@mail.gmail.com%3E
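>>
>> For reference, here is a minimal sketch of that approach as an Oozie shell
>> action wrapping spark-submit. The workflow name, script name, and property
>> names below are placeholders I made up, not taken from the thread above:
>>
>>   <workflow-app name="spark-wf" xmlns="uri:oozie:workflow:0.4">
>>     <start to="spark-step"/>
>>     <action name="spark-step">
>>       <shell xmlns="uri:oozie:shell-action:0.2">
>>         <job-tracker>${jobTracker}</job-tracker>
>>         <name-node>${nameNode}</name-node>
>>         <!-- run_spark_job.sh is a placeholder script that invokes
>>              spark-submit with your application jar/py file and arguments -->
>>         <exec>run_spark_job.sh</exec>
>>         <file>${workflowRoot}/run_spark_job.sh#run_spark_job.sh</file>
>>         <capture-output/>
>>       </shell>
>>       <ok to="end"/>
>>       <error to="fail"/>
>>     </action>
>>     <kill name="fail">
>>       <message>Spark step failed</message>
>>     </kill>
>>     <end name="end"/>
>>   </workflow-app>
>>
>> Dependencies between steps can then be expressed by chaining actions
>> through their <ok to="..."/> transitions.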
>>
>>
>>
>> On Feb 28, 2015, at 3:25 PM, Qiang Cao <caoqiang...@gmail.com> wrote:
>>
>> Thanks, Ashish! Is Oozie integrated with Spark? I know it can accommodate
>> some Hadoop jobs.
>>
>>
>> On Sat, Feb 28, 2015 at 6:07 PM, Ashish Nigam <ashnigamt...@gmail.com>
>> wrote:
>>
>>> Qiang,
>>> Did you look at Oozie?
>>> We use Oozie to run Spark jobs in production.
>>>
>>>
>>> On Feb 28, 2015, at 2:45 PM, Qiang Cao <caoqiang...@gmail.com> wrote:
>>>
>>> Hi Everyone,
>>>
>>> We need to run workflows on Spark. In our scenario, each workflow
>>> consists of multiple processing steps, and there can be dependencies
>>> between steps. I'm wondering if there are tools available that can
>>> help us schedule and manage workflows on Spark. I'm looking for
>>> something like Pig on Hadoop, but one that runs fully on Spark.
>>>
>>> Any suggestion?
>>>
>>> Thanks in advance!
>>>
>>> Qiang
>>>
>>>
>>>
>>
>>
>
