[ https://issues.apache.org/jira/browse/SPARK-16440?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15383967#comment-15383967 ]
Anthony Truchet edited comment on SPARK-16440 at 7/19/16 11:21 AM:
-------------------------------------------------------------------

Thanks for such a quick fix [~srowen]: I was offline for the past week, which is why I couldn't submit the patch quickly enough. I would have {{destroy}}ed the variables instead of {{unpersist}}ing them, though, as the issue was memory consumption on the *driver* side: what am I missing that made you choose the latter over the former?

> Undeleted broadcast variables in Word2Vec causing OoM for long runs
> --------------------------------------------------------------------
>
>                 Key: SPARK-16440
>                 URL: https://issues.apache.org/jira/browse/SPARK-16440
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.6.0, 1.6.1, 1.6.2, 2.0.0
>            Reporter: Anthony Truchet
>            Assignee: Sean Owen
>             Fix For: 1.6.3, 2.0.0
>
>   Original Estimate: 4h
>  Remaining Estimate: 4h
>
> Three broadcast variables created at the beginning of {{Word2Vec.fit()}} are
> never deleted nor unpersisted. This seems to cause excessive memory
> consumption on the driver for a job running hundreds of successive trainings.
> They are
> {code}
> val expTable = sc.broadcast(createExpTable())
> val bcVocab = sc.broadcast(vocab)
> val bcVocabHash = sc.broadcast(vocabHash)
> {code}
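For context, the distinction the comment raises: {{unpersist()}} removes the broadcast's cached copies from the executors but keeps the serialized value on the driver (so it can be re-broadcast), whereas {{destroy()}} removes all data and metadata for the broadcast, including on the driver, after which the variable cannot be used again. Since the leak reported here is driver-side, a cleanup along these lines would apply; this is a hedged sketch using the variable names from the snippet above, and its placement at the end of {{Word2Vec.fit()}} is an assumption, not the actual committed patch:

{code}
// Sketch only: release the broadcasts once the last action that reads
// them has completed. destroy() frees both executor- and driver-side
// state, so repeated fit() calls no longer accumulate memory on the
// driver; the broadcast variables are unusable after this point.
expTable.destroy()
bcVocab.destroy()
bcVocabHash.destroy()
{code}

By contrast, {{expTable.unpersist()}} alone would free executor memory but leave the driver-side copy in place, which is why the commenter asks about the choice between the two.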