[ 
https://issues.apache.org/jira/browse/SPARK-29782?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

tzxxh updated SPARK-29782:
--------------------------
    Attachment:     (was: 44.png)

> Spark broadcast can not be destroyed in some versions
> -----------------------------------------------------
>
>                 Key: SPARK-29782
>                 URL: https://issues.apache.org/jira/browse/SPARK-29782
>             Project: Spark
>          Issue Type: Bug
>          Components: Block Manager
>    Affects Versions: 2.3.3, 2.4.1, 2.4.2, 2.4.3, 2.4.4
>            Reporter: tzxxh
>            Priority: Major
>         Attachments: correct version.png, problem versions.png
>
>
> In Spark versions 2.3.3, 2.4.1, 2.4.2, 2.4.3 and 2.4.4, calling the 
> Broadcast.destroy() method does not release the broadcast data: the driver and 
> executor storage memory shown in the Spark UI keeps growing.
> {code:java}
> // Repro: repeatedly broadcast the same data, use it, then destroy it.
> val batch = Seq(1 to 9999: _*) 
> val strSeq = batch.map(i => s"xxh-$i") 
> val rdd = sc.parallelize(strSeq) 
> rdd.cache() 
> batch.foreach(_ => { 
>   val broc = sc.broadcast(strSeq) 
>   // Use the broadcast value on the executors.
>   rdd.map(id => broc.value.contains(id)).collect() 
>   // destroy() should remove the broadcast blocks, but storage memory keeps growing.
>   broc.destroy() 
> })
> {code}
>  
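> A minimal sketch of one way to observe the leak from the driver, assuming the same sc, batch, strSeq and rdd as in the snippet above; it uses SparkContext.getExecutorMemoryStatus, which reports the (max, remaining) storage memory in bytes per block manager:
> {code:java}
> // Report remaining storage memory per block manager. If destroy() really
> // released the broadcast blocks, the "after" numbers should be close to
> // the "before" numbers.
> def reportStorage(label: String): Unit =
>   sc.getExecutorMemoryStatus.foreach { case (host, (max, remaining)) =>
>     println(s"$label $host: ${remaining / (1024 * 1024)} MB free of ${max / (1024 * 1024)} MB")
>   }
> 
> reportStorage("before")
> batch.foreach { _ =>
>   val broc = sc.broadcast(strSeq)
>   rdd.map(id => broc.value.contains(id)).collect()
>   broc.destroy()
> }
> reportStorage("after")
> {code}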



