PySpark cores

2022-07-28 Thread Andrew Melo
Hello, Is there a way to tell Spark that PySpark (Arrow) functions use multiple cores? If we have an executor with 8 cores, we would like to have a single PySpark function use all 8 cores instead of having 8 single-core Python functions run. Thanks! Andrew
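
One possible direction (a sketch under assumptions, not a confirmed answer from the list): setting spark.task.cpus equal to spark.executor.cores makes Spark schedule only one task at a time per executor, so the Python worker running an Arrow-based pandas UDF can use all of the executor's cores itself. The config values and the UDF below are illustrative placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import pandas_udf
    import pandas as pd

    # Sketch: with spark.task.cpus == spark.executor.cores, only one task runs
    # per executor, leaving all 8 cores available to that task's Python worker.
    spark = (
        SparkSession.builder
        .config("spark.executor.cores", "8")
        .config("spark.task.cpus", "8")
        .getOrCreate()
    )

    @pandas_udf("double")
    def heavy_fn(v: pd.Series) -> pd.Series:
        # Put multi-threaded work (e.g. NumPy/BLAS) here; Spark's scheduler
        # does not pin it to a single core in this configuration.
        return v * 2.0

    df = spark.range(1_000_000).withColumn("out", heavy_fn("id"))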

Re: spark can't connect to kafka via sasl_ssl

2022-07-28 Thread wilson
Updated: I have now resolved the connection issue (it was due to wrong arguments passed to SASL), but I hit another problem:

22/07/28 20:17:48 ERROR MicroBatchExecution: Query [id = 2a3bd87a-3a9f-4e54-a697-3d67cef77230, runId = 11c7ca0d-1bd9-4499-a613-6b6e8e8735ca] terminated with error
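
For reference, a minimal sketch of how SASL_SSL options can be passed to the Kafka source in Structured Streaming (broker address, topic, mechanism, and credentials are placeholders; Kafka client settings are passed through with the "kafka." prefix):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-sasl-ssl").getOrCreate()

    df = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9093")  # placeholder
        .option("subscribe", "my_topic")                    # placeholder
        .option("kafka.security.protocol", "SASL_SSL")
        .option("kafka.sasl.mechanism", "PLAIN")
        .option(
            "kafka.sasl.jaas.config",
            'org.apache.kafka.common.security.plain.PlainLoginModule required '
            'username="USER" password="PASS";',
        )
        .load()
    )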