GitHub user wangzhigang1999 added a comment to the discussion: Seeking Solutions for Spark Context Stopping Without Engine Termination
Right now I’m just giving the engine enough resources — around 8 to 16GB of memory. Ever since I did that, it’s been running pretty smoothly. I’ll think about a cleaner solution later. GitHub link: https://github.com/apache/kyuubi/discussions/6992#discussioncomment-13775648
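For context, a minimal sketch of how an engine memory bump like this might be applied, assuming the engine is a Kyuubi Spark engine configured through standard `spark.*` properties in `kyuubi-defaults.conf` (the exact values below are illustrative, not taken from the comment):

```properties
# $KYUUBI_HOME/conf/kyuubi-defaults.conf
# Illustrative settings only; the comment mentions "8 to 16GB" without
# specifying which component received the memory.
spark.driver.memory=8g
spark.executor.memory=8g
```

Kyuubi forwards `spark.*` properties to the engine it launches, so raising driver/executor memory here applies to newly started engines rather than ones already running.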
