[ https://issues.apache.org/jira/browse/SPARK-26601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
zhoukang updated SPARK-26601:
-----------------------------
    Summary: Make broadcast-exchange thread pool keepalivetime and maxThreadNumber configurable  (was: Make broadcast-exchange thread pool keepalivetime configurable)

> Make broadcast-exchange thread pool keepalivetime and maxThreadNumber
> configurable
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-26601
>                 URL: https://issues.apache.org/jira/browse/SPARK-26601
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: zhoukang
>            Priority: Major
>         Attachments: 选区_001.png, 选区_002 (1).png, 选区_002.png
>
>
> Currently, the maximum thread number of the broadcast-exchange thread pool is fixed at 128, and keepAliveSeconds is likewise fixed at 60s:
> {code:java}
> object BroadcastExchangeExec {
>   private[execution] val executionContext = ExecutionContext.fromExecutorService(
>     ThreadUtils.newDaemonCachedThreadPool("broadcast-exchange", 128))
> }
>
> /**
>  * Create a cached thread pool whose max number of threads is `maxThreadNumber`. Thread names
>  * are formatted as prefix-ID, where ID is a unique, sequentially assigned integer.
>  */
> def newDaemonCachedThreadPool(
>     prefix: String, maxThreadNumber: Int, keepAliveSeconds: Int = 60): ThreadPoolExecutor = {
>   val threadFactory = namedThreadFactory(prefix)
>   val threadPool = new ThreadPoolExecutor(
>     maxThreadNumber, // corePoolSize: the max number of threads to create before queuing the tasks
>     maxThreadNumber, // maximumPoolSize: not used, because the work queue below is an unbounded LinkedBlockingQueue
>     keepAliveSeconds,
>     TimeUnit.SECONDS,
>     new LinkedBlockingQueue[Runnable],
>     threadFactory)
>   threadPool.allowCoreThreadTimeOut(true)
>   threadPool
> }
> {code}
> But sometimes, if these Thread objects are not garbage collected quickly, they may cause an OOM on the server (driver). Below is an example (see the attached screenshots).
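>
> As a sketch of one possible change, both limits could be read from configuration before the pool is built. The config key names and the ConfigurableBroadcastExchangePool/newPool helper below are illustrative assumptions, not the actual patch; a real patch would register the keys in SQLConf.
> {code:java}
> import java.util.concurrent.{LinkedBlockingQueue, ThreadFactory, ThreadPoolExecutor, TimeUnit}
> import java.util.concurrent.atomic.AtomicInteger
>
> object ConfigurableBroadcastExchangePool {
>
>   // Hypothetical config keys; not actual Spark configuration entries.
>   val MaxThreadsKey = "spark.sql.broadcastExchange.maxThreadNumber"
>   val KeepAliveKey = "spark.sql.broadcastExchange.keepAliveSeconds"
>
>   // Daemon threads named prefix-ID, mirroring what namedThreadFactory produces.
>   private def namedThreadFactory(prefix: String): ThreadFactory = new ThreadFactory {
>     private val counter = new AtomicInteger(0)
>     override def newThread(r: Runnable): Thread = {
>       val t = new Thread(r, s"$prefix-${counter.incrementAndGet()}")
>       t.setDaemon(true)
>       t
>     }
>   }
>
>   /** Same shape as ThreadUtils.newDaemonCachedThreadPool, but both limits come from conf. */
>   def newPool(conf: Map[String, String]): ThreadPoolExecutor = {
>     val maxThreads = conf.getOrElse(MaxThreadsKey, "128").toInt
>     val keepAliveSeconds = conf.getOrElse(KeepAliveKey, "60").toInt
>     val pool = new ThreadPoolExecutor(
>       maxThreads, // corePoolSize: threads created before tasks are queued
>       maxThreads, // maximumPoolSize: not used with an unbounded LinkedBlockingQueue
>       keepAliveSeconds,
>       TimeUnit.SECONDS,
>       new LinkedBlockingQueue[Runnable],
>       namedThreadFactory("broadcast-exchange"))
>     // Let idle core threads time out so the pool can shrink back to zero; a smaller
>     // max and a shorter keep-alive bound how long Thread objects stay reachable on the driver.
>     pool.allowCoreThreadTimeOut(true)
>     pool
>   }
> }
> {code}
> For example, newPool(Map("spark.sql.broadcastExchange.maxThreadNumber" -> "64", "spark.sql.broadcastExchange.keepAliveSeconds" -> "30")) would cap the pool at 64 threads that terminate after 30 idle seconds, instead of the hard-coded 128 threads and 60s.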