Thanks, Gourav! I will look this up on Google.

Regards,
Vijay Gharge



On Thu, Dec 3, 2015 at 1:26 PM, Gourav Sengupta <gourav.sengu...@gmail.com>
wrote:

> Vijay,
>
> please Google for AWS Lambda + S3; there are several use cases available.
> Lambda functions are event-based triggers that are executed when an event
> occurs in an Amazon resource (such as S3 or Redshift).
>
> Regards,
> Gourav
>
> On Thu, Dec 3, 2015 at 5:15 AM, Vijay Gharge <vijay.gha...@gmail.com>
> wrote:
>
>> Hello Gourav,
>>
>> Can you please elaborate on the "trigger" part?
>>
>> Any reference link will be really useful !
>>
>>
>> On Thursday 3 December 2015, Gourav Sengupta <gourav.sengu...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> And so you have the money to keep a Spark cluster up and running? The
>>> way I make it work is to test the code on my local system with a local Spark
>>> installation, and then create a data pipeline triggered by Lambda, which
>>> starts a Spark cluster, processes the data via Spark steps, and then
>>> terminates, having stored the results in S3 -- or somewhere else, if you
>>> have a VPC and VPN configured.
>>>
>>> But that approach may be completely wrong when you have money.
>>>
>>>
>>>
>>> Regards,
>>> Gourav
>>>
>>> On Wed, Dec 2, 2015 at 11:35 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>>
>>>> Have you seen this thread ?
>>>>
>>>>
>>>> http://search-hadoop.com/m/q3RTtvmsYMv0tKh2&subj=Re+Upgrading+Spark+in+EC2+clusters
>>>>
>>>> On Wed, Dec 2, 2015 at 2:39 PM, Andy Davidson <
>>>> a...@santacruzintegration.com> wrote:
>>>>
>>>>> I am using spark-1.5.1-bin-hadoop2.6. I used
>>>>> spark-1.5.1-bin-hadoop2.6/ec2/spark-ec2 to create a cluster. Any idea
>>>>> how I can upgrade to the 1.5.2 prebuilt binary?
>>>>>
>>>>> Also if I choose to build the binary, how would I upgrade my cluster?
>>>>>
>>>>> Kind regards
>>>>>
>>>>> Andy
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>> --
>> Regards,
>> Vijay Gharge
>>
>>
>>
>>
>
