Hi,

I have been told that running Spark in local mode is the simplest setup for testing. The Spark documentation says little about local mode beyond the number of cores used in --master local[k]. Where do the driver program, the executor, and the resources live in this mode? Do I need to start worker threads myself, and how many applications can I run safely without exceeding the allocated memory?

Thank you.
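For context, a minimal sketch of what launching in local mode looks like (my_app.py is a hypothetical application name; adjust flags to your needs). In local mode the driver and executor run in a single JVM on your machine, so no worker daemons need to be started:

```shell
# Local mode: driver and executor share one JVM; no cluster manager
# or separate worker processes are required.
# local[2]  -> 2 worker threads; local[*] -> one thread per core.
# In local mode the whole application runs inside the driver JVM,
# so --driver-memory governs the memory available to tasks as well.
spark-submit \
  --master "local[2]" \
  --driver-memory 4g \
  my_app.py
```

Each spark-submit invocation is an independent application with its own JVM, so the number you can run concurrently is bounded by how much heap you give each driver versus the machine's physical memory.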
- Running Spark in local mode Ashok Kumar