>> view my Linkedin profile
>> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>>
>>
>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>> any loss, damage or destruction of data or any other property which
>> may arise from relying on this email's technical content is explicitly
>> disclaimed. The author will in no case be liable for any monetary
>> damages arising from such loss, damage or destruction.
>
>
>
>
> On Sat, 24 Jul 2021 at 13:46, Dinakar Chennubotla <
> chennu.bigd...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am Dinakar, Hadoop admin,
>> could someone help me here,
>>
>> 1. I have a DEV-POC task to do,
>> 2. Need to install a Distributed apache-spark cluster with Cluster mode
>> on Docker containers.
>> 3. with scalable spark-worker containers.
>> 4. we have a 9 node cluster with some other services or tools.
On Sat, 24 Jul 2021 at 13:46, Dinakar Chennubotla
wrote:
> Hi All,
>
> I am Dinakar, Hadoop admin,
> could someone help me here,
>
> 1. I have a DEV-POC task to do,
> 2. Need to install a Distributed apache-spark cluster with Cluster mode
> on Docker containers; the aim is to build a Distributed
> apache-spark cluster, using yarn or apache mesos.
>
> Sending you my initial sketch, a pictorial representation of the same.
>
> Could you help me with the below:
> ==
> As per the Diagram,
> 1. I have to write Dockerfiles with Apache
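As a rough illustration of the "write Dockerfiles" step above, a minimal Spark image might look like the following sketch. The base image, Spark version, and download URL are assumptions for illustration, not details from this thread:

```dockerfile
# Sketch only: base image and Spark/Hadoop versions are assumptions.
FROM openjdk:11-jre-slim

ARG SPARK_VERSION=3.1.2
ARG HADOOP_VERSION=3.2

# Download and unpack a Spark binary distribution from the Apache archive.
ADD https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz /tmp/spark.tgz
RUN tar -xzf /tmp/spark.tgz -C /opt \
    && ln -s /opt/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION} /opt/spark \
    && rm /tmp/spark.tgz

ENV SPARK_HOME=/opt/spark
ENV PATH=$SPARK_HOME/bin:$PATH
```

The same image can then serve as both master and worker, with the role chosen at container start time.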
>> and scale down (I think this is how
>> Databricks autoscaling works...). I am not sure about k8s TBH, perhaps it
>> handles this more gracefully.
>>
>> On Sat, Jul 24, 2021 at 3:38 PM Dinakar Chennubotla <
>> chennu.bigd...@gmail.com> wrote:
Hi Khalid Mammadov,
Thank you for your response.
Yes, I did; I built a standalone apache spark cluster on docker containers.
But I am looking for a distributed spark cluster,
where spark workers are scalable and spark "deployment mode = cluster".
Source url I used to build the standalone apache spark cluster:
> 1. I have a DEV-POC task to do,
> 2. Need to install a Distributed apache-spark cluster with Cluster mode
> on Docker containers.
> 3. with scalable spark-worker containers.
> 4. we have a 9 node cluster with some other services or tools.
>
> Thanks,
> Dinakar
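On the "deployment mode = cluster" point above: the deploy mode is chosen at submit time rather than baked into the cluster. A hedged sketch (the class name, JAR, and resource sizes are placeholders, not from this thread):

```shell
# Cluster deploy mode on YARN: the driver runs inside the cluster,
# not on the machine that ran spark-submit.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-memory 2g \
  --class com.example.MyJob \
  my-job.jar

# Note: Spark's standalone mode does not support cluster deploy mode
# for Python applications, which is why a JAR is shown here.
```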
Hi All,
I am Dinakar, Hadoop admin,
could someone help me here,
1. I have a DEV-POC task to do,
2. Need to install a Distributed apache-spark cluster with Cluster mode on
Docker containers.
3. with scalable spark-worker containers.
4. we have a 9 node cluster with some other services or tools.
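Points 2 and 3 above can be sketched with plain docker commands. This is a minimal illustration only; the bitnami/spark image and its SPARK_MODE / SPARK_MASTER_URL environment variables are an assumption (any Spark image exposing master and worker roles would work the same way):

```shell
# Sketch only: assumes a Docker host and the bitnami/spark image.
docker network create spark-net

# Master: 7077 is Spark's default master RPC port, 8080 the master web UI.
docker run -d --name spark-master --network spark-net \
  -e SPARK_MODE=master \
  -p 7077:7077 -p 8080:8080 \
  bitnami/spark

# Workers are "scalable" simply by starting more containers
# pointed at the same master URL.
for i in 1 2 3; do
  docker run -d --name spark-worker-$i --network spark-net \
    -e SPARK_MODE=worker \
    -e SPARK_MASTER_URL=spark://spark-master:7077 \
    bitnami/spark
done
```

With a compose file instead of raw docker run commands, the same idea becomes a single `docker-compose up --scale spark-worker=3`.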
We are trying to create a cluster which consists of 4 machines. The cluster
will be used by multiple users. How can we configure it so that users can
submit jobs from their personal computers, and is there any free tool you
can suggest to support this procedure?
--
Uğur Sopaoğlu
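For submitting from a personal computer, one common pattern is to install the same Spark version on the user's machine and point spark-submit at the cluster's master. A sketch only; the master host name and job file are placeholders:

```shell
# From the user's own machine; assumes Spark is installed locally and
# the cluster's master (placeholder host name) is reachable on port 7077.
spark-submit \
  --master spark://cluster-master:7077 \
  --deploy-mode client \
  my_job.py
```

Apache Livy is one free tool worth evaluating here: it exposes Spark job submission over a REST API, which avoids installing Spark on every user's machine.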