Yes, Spark in local mode works :)
One tip
If you just start it, the default setting is one core and 1 GB of memory.
I'm using this function to start Spark in local mode with all cores and max RAM:
import multiprocessing
import os
from pyspark.sql import SparkSession
from pyspark import SparkConf
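The body of that function got cut off above, so here is a minimal sketch of what such a helper can look like. The names `local_spark_resources` and `build_local_spark` are my own, the RAM detection uses Linux-only `os.sysconf` keys, and handing the driver all physical RAM is aggressive in practice — treat it as a starting point, not the author's exact code.

```python
import multiprocessing
import os


def local_spark_resources():
    """Detect all CPU cores and total physical RAM in whole GB (Linux-only sysconf)."""
    cores = multiprocessing.cpu_count()
    mem_bytes = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
    mem_gb = max(1, mem_bytes // (1024 ** 3))
    return cores, mem_gb


def build_local_spark(app_name="local-app"):
    """Start a local SparkSession using every core and (most of) the machine's RAM."""
    # Imported here so resource detection works even without pyspark installed.
    from pyspark.sql import SparkSession

    cores, mem_gb = local_spark_resources()
    return (
        SparkSession.builder
        .appName(app_name)
        .master(f"local[{cores}]")           # local[N] = N worker threads
        .config("spark.driver.memory", f"{mem_gb}g")
        .getOrCreate()
    )
```

Note that `spark.driver.memory` only takes effect if it is set before the JVM launches, which is why it goes through the builder rather than being changed on an existing session.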
So I'm not sure I completely follow. Are you asking for a way to change the
limit without having to do the repartition? And your DL software doesn't care
if you got say 30 executors instead of 20? Normally I would expect the number
of partitions at that point to be 200 (or whatever you set `spark.sql.shuffle.partitions` to).
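For reference, that 200 is just Spark's default for `spark.sql.shuffle.partitions`, so you can change the post-shuffle partition count without a `repartition()` call. A config fragment (the value 8 here is an arbitrary example):

```
# spark-defaults.conf fragment: override the post-shuffle partition count
# so a small local job doesn't fan out into 200 tiny tasks
spark.sql.shuffle.partitions   8
```

The same key can also be set per-session via `spark.conf.set("spark.sql.shuffle.partitions", "8")`.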