If not set explicitly with spark.default.parallelism, it will default
to the number of cores currently available (minimum 2). At the very
start, some executors haven't completed registering, which I think
explains why it goes up after a short time. (In the case of dynamic
allocation the value will also change over time as executors are added
and removed.)
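
If a stable value matters for your job, one way to sidestep the timing
issue is to set spark.default.parallelism explicitly so it does not
depend on how many executors have registered yet. A minimal sketch (the
value 8 is only an illustration; pick whatever fits your cluster):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// With spark.default.parallelism set in the conf, defaultParallelism()
// returns the configured value from the start instead of the number of
// cores registered so far.
final SparkConf conf = new SparkConf()
        .setAppName("explicit-parallelism") // app name is illustrative
        .set("spark.default.parallelism", "8");
final JavaSparkContext sc = new JavaSparkContext(conf);
System.out.println(sc.defaultParallelism()); // prints 8
sc.stop();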
Hi there,
I have found that if I invoke
sparkContext.defaultParallelism()
too early, it will not return the correct value.
For example, if I write this:
final JavaSparkContext sparkContext =
        new JavaSparkContext(sparkSession.sparkContext());
final int workerCount = sparkContext.defaultParallelism();
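
One possible workaround is to wait until the executors have registered
before reading the value. A rough sketch, where EXPECTED_EXECUTORS and
the 30-second timeout are illustrative assumptions:

// Poll until the expected number of executors has registered (or the
// timeout expires) before reading defaultParallelism().
final int EXPECTED_EXECUTORS = 4; // illustrative value
final long deadline = System.currentTimeMillis() + 30_000L;
// getExecutorMemoryStatus() also contains an entry for the driver,
// hence the +1 below.
while (sparkContext.sc().getExecutorMemoryStatus().size() < EXPECTED_EXECUTORS + 1
        && System.currentTimeMillis() < deadline) {
    try {
        Thread.sleep(100);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        break;
    }
}
// By now the value should reflect the registered executors.
final int parallelism = sparkContext.defaultParallelism();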