Hi,
I want to deploy the Spark client in a Kubernetes container. Further, I want to
run the Spark job on a Hadoop cluster (i.e., the resources of the Hadoop
cluster will be leveraged) but submit it from the K8s container. My question is
whether this mode of implementation is possible? Do let me know.
Any feedback, please?
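In general this pattern works when the container can reach the YARN ResourceManager and HDFS, and carries the cluster's client configuration. A minimal sketch follows; the mount path, job class, and jar name are hypothetical placeholders, not anything from this thread:

```shell
# Sketch only: submitting to an external YARN cluster from inside a K8s container.
# Assumes the Hadoop cluster's client configs (core-site.xml, yarn-site.xml,
# hdfs-site.xml) are mounted into the container at the path below.
export HADOOP_CONF_DIR=/opt/hadoop/conf   # hypothetical mount point

# Build the spark-submit invocation; --master yarn means the driver and
# executors run on the Hadoop cluster's resources, not on Kubernetes.
SUBMIT_CMD="spark-submit --master yarn --deploy-mode cluster \
 --class org.example.MyJob my-job.jar"   # class and jar are placeholders

# Printed here for inspection; on a real cluster, run the command directly.
echo "$SUBMIT_CMD"
```

In this setup Kubernetes only hosts the Spark client; scheduling and execution are entirely YARN's.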
Thanks,
Debu
Sent from my iPhone
> On 13-Feb-2020, at 6:36 PM, Debabrata Ghosh wrote:
>
>
> Greetings All !
>
> I have plenty of application directories lying around under .sparkStaging, such
> as .sparkStaging/application_1580703507814_0074
>
> Would you please be able
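For context on the quoted question: YARN normally deletes an application's .sparkStaging directory when the application finishes, so leftovers usually come from killed/failed applications or from the preserve flag being enabled. Directories of applications that are no longer running can be removed with `hdfs dfs -rm -r`. A spark-defaults.conf fragment (the value shown is the default):

```
# spark-defaults.conf (fragment)
# false (the default) deletes staging files when the application completes.
spark.yarn.preserve.staging.files  false
```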
Thanks Vadim!
> On 10-Oct-2017, at 11:09 PM, Vadim Semenov wrote:
>
> Try increasing the `spark.yarn.am.waitTime` parameter; it is set to 100s by
> default, which might not be enough in certain cases.
>
>> On Tue, Oct 10, 2017 at 7:02 AM,
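The quoted suggestion can be applied via spark-defaults.conf (or an equivalent `--conf` flag on spark-submit); the value below is an example to tune for your cluster, not a recommendation from this thread:

```
# spark-defaults.conf (fragment) — example value only
spark.yarn.am.waitTime  300s
```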
Hi guys, I am not sure whether this email is reaching the community members.
Could somebody please acknowledge?
> On 30-Sep-2017, at 5:02 PM, Debabrata Ghosh wrote:
>
> Dear All,
> Greetings! I am repeatedly hitting a