Thanks Martin.
My question was not clear initially. Thanks for understanding and for the
explanation.
The idea, as you said, is to explore the possibility of using YARN for
cluster scheduling with Spark, without HDFS. Thanks again for the
clarification.
On Sat, Jul 11, 2020 at 1:27 PM Juan
Hi Diwakar,
A YARN cluster without Hadoop is a bit of a fuzzy concept, since YARN itself ships as part of Hadoop.
You can certainly have Hadoop and skip MapReduce, using Spark
instead. That is the main reason to run Spark on a Hadoop cluster anyway.
On the other hand, it is highly probable you will want to use HDFS as well.
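For context, a minimal sketch of the setup being discussed: submitting a Spark job to YARN while keeping data off HDFS, for example on S3 via the S3A connector. The class name, bucket, and paths below are hypothetical, not from this thread, and the S3A jars must already be on the classpath:

```shell
# HADOOP_CONF_DIR must point at a directory containing yarn-site.xml so
# spark-submit can locate the YARN ResourceManager.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Submit to YARN in cluster mode; both the staging directory (which
# defaults to the user's HDFS home) and the input path are redirected
# to S3 so HDFS is never touched.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.WordCount \
  --conf spark.yarn.stagingDir=s3a://my-bucket/staging \
  --conf spark.hadoop.fs.s3a.endpoint=s3.amazonaws.com \
  app.jar s3a://my-bucket/input.txt
```

The key detail is `spark.yarn.stagingDir`: even when the job's data lives elsewhere, YARN submission stages jars and config to a distributed filesystem, and that defaults to HDFS unless redirected.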
Hi,
Would it be possible to set up Spark on a YARN cluster that does not have
Hadoop?
Thanks.