Hello,
I understand we need to specify the 'spark.kubernetes.driver.limit.cores'
and 'spark.kubernetes.executor.limit.cores' config parameters when
submitting Spark to a Kubernetes namespace that has a resource quota applied.
There are also the related config parameters 'spark.driver.cores' and
'spark.executor.cores'.
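For reference, a spark-submit invocation combining these settings might look like the sketch below. The namespace, image, and core counts are placeholder assumptions, not values from this message. Roughly, 'spark.driver.cores' / 'spark.executor.cores' drive what Spark requests, while the 'spark.kubernetes.*.limit.cores' settings set the Kubernetes resource limits, which a ResourceQuota-enforced namespace typically requires pods to declare:

```
# Hypothetical submission to a quota-limited namespace (values are placeholders)
spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --deploy-mode cluster \
  --name quota-demo \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.namespace=team-quota \
  --conf spark.driver.cores=1 \
  --conf spark.kubernetes.driver.limit.cores=1 \
  --conf spark.executor.cores=2 \
  --conf spark.kubernetes.executor.limit.cores=2 \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=<spark-image> \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar
```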
Hello,
We are running Spark 2.4 on a Kubernetes cluster and are able to access the
Spark UI using "kubectl port-forward".
However, this Spark UI only shows currently running Spark applications;
we would like to retain the logs of 'completed' Spark applications as well.
Could someone help us set up the Spark History Server for this?
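A common setup for retaining completed-application history (a sketch, not from this thread; the event-log path and the choice of shared storage are assumptions) is to have applications write event logs to a location all pods can reach, and run the Spark History Server pointed at that same location:

```
# spark-defaults.conf (or pass as --conf flags on spark-submit)
# Application side: write event logs to shared storage
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs:///spark-event-logs   # assumed shared path

# History Server side: read the same directory
spark.history.fs.logDirectory    hdfs:///spark-event-logs
```

The History Server is started with $SPARK_HOME/sbin/start-history-server.sh and serves its UI on port 18080 by default, so it can be reached with the same "kubectl port-forward" approach used for the live UI.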
On Oct 27, 2018 3:34 AM, "karan alang" wrote:
Hello
- is there a "performance" difference when using Java or Scala for Apache
Spark?
I understand there are other obvious differences (less code with Scala,
easier to focus on logic, etc.),
but wrt performance, I think there would not be much of a difference.