Merging in another email from Prasad: this tool could co-exist with Livy. Livy
is similar to the REST Service + Spark Operator combination, but unfortunately
Livy is not very active right now.

To Amihay, the link is: https://github.com/datapunchorg/punch.

On Tue, Feb 22, 2022 at 8:53 PM amihay gonen <agone...@gmail.com> wrote:

> Can you share link to the source?
>
> On Wed, Feb 23, 2022, 6:52, bo yang <bobyan...@gmail.com> wrote:
>
>> We do not have a SaaS offering yet. For now it is an open source project we
>> build in our spare time, and we welcome more people to work on it together.
>>
>> You could specify the cluster size (EC2 instance type and number of
>> instances) and run it for 1 hour. Then you could run a one-click command to
>> destroy the cluster. It is possible to merge these steps as well and
>> provide a "serverless" experience. That is on our TODO list :)
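>>
>> For a concrete picture of the sizing question, here is a rough Python
>> (boto3) sketch of creating and then destroying an EKS node group; this is
>> not the punch CLI, and the cluster name, subnets, and IAM role are
>> placeholders:
>>
>> import boto3
>>
>> eks = boto3.client("eks", region_name="us-west-2")
>>
>> # Two r5.8xlarge nodes (32 vCPUs / 256 GiB each) comfortably cover a
>> # 16-core / 300 GB Spark job plus overhead.
>> eks.create_nodegroup(
>>     clusterName="my-spark-cluster",          # placeholder
>>     nodegroupName="spark-workers",
>>     instanceTypes=["r5.8xlarge"],
>>     scalingConfig={"minSize": 2, "maxSize": 2, "desiredSize": 2},
>>     subnets=["subnet-aaaa", "subnet-bbbb"],  # placeholders
>>     nodeRole="arn:aws:iam::123456789012:role/EKSNodeRole",  # placeholder
>> )
>>
>> # ... run the Spark job for ~1 hour ...
>>
>> # A single call tears the worker nodes down again when you are done.
>> eks.delete_nodegroup(clusterName="my-spark-cluster",
>>                      nodegroupName="spark-workers")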
>>
>>
>> On Tue, Feb 22, 2022 at 8:36 PM Bitfox <bit...@bitfox.top> wrote:
>>
>>> How can I specify the cluster memory and cores?
>>> For instance, I want to run a job with 16 cores and 300 GB memory for
>>> about 1 hour. Do you have a SaaS solution for this? I am willing to pay,
>>> as I have before.
>>>
>>> Thanks
>>>
>>> On Wed, Feb 23, 2022 at 12:21 PM bo yang <bobyan...@gmail.com> wrote:
>>>
>>>> It is not a standalone Spark cluster. In more detail, it deploys the Spark
>>>> Operator (https://github.com/GoogleCloudPlatform/spark-on-k8s-operator)
>>>> and an extra REST Service. When people submit a Spark application to that
>>>> REST Service, the REST Service creates a SparkApplication custom resource
>>>> (via the operator's CRD) inside the Kubernetes cluster. The Spark Operator
>>>> then picks up that resource and launches the Spark application. The
>>>> one-click tool intends to hide these details, so people can just submit
>>>> Spark applications without dealing with too many deployment details.
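>>>>
>>>> To make that flow concrete, here is a minimal Python sketch (using the
>>>> official kubernetes client) of what the REST Service conceptually does:
>>>> create a SparkApplication custom resource that the Spark Operator then
>>>> picks up. The image, names, and resource values are illustrative and not
>>>> the actual punch implementation.
>>>>
>>>> from kubernetes import client, config
>>>>
>>>> config.load_kube_config()  # or load_incluster_config() inside the cluster
>>>> api = client.CustomObjectsApi()
>>>>
>>>> spark_app = {
>>>>     "apiVersion": "sparkoperator.k8s.io/v1beta2",
>>>>     "kind": "SparkApplication",
>>>>     "metadata": {"name": "spark-pi", "namespace": "spark"},
>>>>     "spec": {
>>>>         "type": "Scala",
>>>>         "mode": "cluster",
>>>>         "image": "apache/spark:v3.1.3",  # illustrative image
>>>>         "mainClass": "org.apache.spark.examples.SparkPi",
>>>>         "mainApplicationFile":
>>>>             "local:///opt/spark/examples/jars/spark-examples.jar",
>>>>         "sparkVersion": "3.1.3",
>>>>         "driver": {"cores": 1, "memory": "1g", "serviceAccount": "spark"},
>>>>         "executor": {"instances": 2, "cores": 1, "memory": "1g"},
>>>>     },
>>>> }
>>>>
>>>> # The Spark Operator watches SparkApplication resources and launches the
>>>> # driver and executor pods for each one it sees.
>>>> api.create_namespaced_custom_object(
>>>>     group="sparkoperator.k8s.io",
>>>>     version="v1beta2",
>>>>     namespace="spark",
>>>>     plural="sparkapplications",
>>>>     body=spark_app,
>>>> )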
>>>>
>>>> On Tue, Feb 22, 2022 at 8:09 PM Bitfox <bit...@bitfox.top> wrote:
>>>>
>>>>> Can it be a cluster installation of Spark, or just a single standalone node?
>>>>>
>>>>> Thanks
>>>>>
>>>>> On Wed, Feb 23, 2022 at 12:06 PM bo yang <bobyan...@gmail.com> wrote:
>>>>>
>>>>>> Hi Spark Community,
>>>>>>
>>>>>> We built an open source tool to deploy and run Spark on Kubernetes with
>>>>>> a one-click command. For example, on AWS, it can automatically create an
>>>>>> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you
>>>>>> will be able to use curl or a CLI tool to submit Spark applications.
>>>>>> After the deployment, you could also install Uber Remote Shuffle Service
>>>>>> to enable Dynamic Allocation on Kubernetes.
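>>>>>>
>>>>>> As a rough illustration of the submission step, here is a Python sketch
>>>>>> of posting to the REST Service behind the NGINX ingress. The endpoint
>>>>>> path and payload shape are hypothetical (please check the punch repo for
>>>>>> the real API); it only shows the flow of an HTTP POST turning into a
>>>>>> running Spark application.
>>>>>>
>>>>>> import requests
>>>>>>
>>>>>> payload = {
>>>>>>     "mainApplicationFile":
>>>>>>         "local:///opt/spark/examples/jars/spark-examples.jar",
>>>>>>     "mainClass": "org.apache.spark.examples.SparkPi",
>>>>>>     "sparkConf": {
>>>>>>         # Dynamic allocation settings become useful once the Uber
>>>>>>         # Remote Shuffle Service is installed.
>>>>>>         "spark.dynamicAllocation.enabled": "true",
>>>>>>         "spark.dynamicAllocation.maxExecutors": "10",
>>>>>>     },
>>>>>> }
>>>>>>
>>>>>> # "ingress-host" and the path are placeholders for the deployed ingress.
>>>>>> resp = requests.post("https://ingress-host/sparkapi/v1/submissions",
>>>>>>                      json=payload, timeout=30)
>>>>>> resp.raise_for_status()
>>>>>> print(resp.json())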
>>>>>>
>>>>>> Anyone interested in using or working together on such a tool?
>>>>>>
>>>>>> Thanks,
>>>>>> Bo
>>>>>>
>>>>>>
