[ https://issues.apache.org/jira/browse/SPARK-17469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15486668#comment-15486668 ]

Christopher Mårtensson commented on SPARK-17469:
------------------------------------------------

nc -lk 1234 & ./bin/run-example streaming.StatefulNetworkWordCount localhost 1234

> mapWithState causes block lock warning
> --------------------------------------
>
>                 Key: SPARK-17469
>                 URL: https://issues.apache.org/jira/browse/SPARK-17469
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 2.0.0
>         Environment: run-example
>            Reporter: Christopher Mårtensson
>            Priority: Minor
>
> Running run-example with StatefulNetworkWordCount gives warnings like the following:
>
> -------------------------------------------
> Time: 1473416200000 ms
> -------------------------------------------
> 16/09/09 12:16:41 WARN Executor: 1 block locks were not released by TID = 1788:
> [rdd_2475_0]
> 16/09/09 12:16:41 WARN Executor: 1 block locks were not released by TID = 1791:
> [rdd_2475_3]
> 16/09/09 12:16:41 WARN Executor: 1 block locks were not released by TID = 1790:
> [rdd_2475_2]
> 16/09/09 12:16:41 WARN Executor: 1 block locks were not released by TID = 1789:
> [rdd_2475_1]
> -------------------------------------------
> Time: 1473416201000 ms
> -------------------------------------------
> 16/09/09 12:16:42 WARN Executor: 1 block locks were not released by TID = 1792:
> [rdd_2481_0]
> 16/09/09 12:16:42 WARN Executor: 1 block locks were not released by TID = 1794:
> [rdd_2481_2]
> 16/09/09 12:16:42 WARN Executor: 1 block locks were not released by TID = 1795:
> [rdd_2481_3]
> 16/09/09 12:16:42 WARN Executor: 1 block locks were not released by TID = 1793:
> [rdd_2481_1]
>
> The warning was also reproduced with any other application that uses mapWithState. Only tested in local mode.
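
For reference, a minimal sketch (not the reporter's exact code; object name, checkpoint path, and port are illustrative assumptions) of a mapWithState application in the style of StatefulNetworkWordCount that can be run in local mode against `nc -lk 1234` to observe the same block-lock warnings:

{code:scala}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, State, StateSpec, StreamingContext}

// Hypothetical object name; mirrors the structure of the bundled
// StatefulNetworkWordCount example.
object MapWithStateRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("MapWithStateRepro")
    val ssc = new StreamingContext(conf, Seconds(1))
    ssc.checkpoint("/tmp/mapwithstate-repro") // mapWithState requires a checkpoint directory

    // Read lines from the netcat server started with: nc -lk 1234
    val lines = ssc.socketTextStream("localhost", 1234)
    val wordCounts = lines.flatMap(_.split(" ")).map(word => (word, 1))

    // Keep a running count per word in Spark's state store across batches.
    val mappingFunc = (word: String, one: Option[Int], state: State[Int]) => {
      val sum = one.getOrElse(0) + state.getOption.getOrElse(0)
      state.update(sum)
      (word, sum)
    }

    val stateDstream = wordCounts.mapWithState(StateSpec.function(mappingFunc))
    stateDstream.print() // after each batch, the executor logs the block-lock warning

    ssc.start()
    ssc.awaitTermination()
  }
}
{code}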