[issue34168] RAM consumption too high using concurrent.futures (Python 3.7 / 3.6 )

2018-07-20 Thread Dem
Dem added the comment: It seems that even without the as_completed call it has the same problem.

```
# -*- coding: utf-8 -*-
import dns.resolver
import concurrent.futures
from pprint import pprint
import json

bucket = json.load(open('30_million_strings.json', 'r'))

def
```
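[Editorial note: a common way to keep memory flat in this pattern is to cap the number of futures in flight instead of submitting all 30 million at once. The sketch below is not code from this thread; the `_dns_query` helper, the `MAX_IN_FLIGHT` limit, and the worker count are assumptions. It uses `resolver.query`, which dnspython 2.x renames to `resolve`.]

```
# Hypothetical sketch: bound the number of in-flight futures so that
# 30 million Future objects never exist at the same time.
import concurrent.futures
import json

import dns.resolver

bucket = json.load(open('30_million_strings.json', 'r'))
MAX_IN_FLIGHT = 1000  # assumed cap on pending futures; tune as needed


def _dns_query(target):
    # Assumed worker: resolve one name, with a bounded total query time.
    resolver = dns.resolver.Resolver()
    resolver.lifetime = 60
    try:
        answers = resolver.query(target, 'A')  # dnspython 2.x: resolver.resolve
        return target, [r.to_text() for r in answers]
    except Exception as exc:
        return target, exc


with concurrent.futures.ThreadPoolExecutor(max_workers=100) as executor:
    pending = set()
    for name in bucket:
        pending.add(executor.submit(_dns_query, name))
        if len(pending) >= MAX_IN_FLIGHT:
            # Block until at least one future finishes, then drop the
            # references to the completed ones so they can be collected.
            done, pending = concurrent.futures.wait(
                pending, return_when=concurrent.futures.FIRST_COMPLETED)
            for fut in done:
                fut.result()
    # Drain whatever is still outstanding at the end.
    for fut in concurrent.futures.as_completed(pending):
        fut.result()
```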

[issue34168] RAM consumption too high using concurrent.futures (Python 3.7 / 3.6 )

2018-07-20 Thread Dem
New submission from Dem : I have a list of 30 million strings, and I want to run a DNS query for each of them. I do not understand how this operation can get memory intensive. I would assume that the threads would exit after the job is done, and there is also a timeout of 1 minute per query.
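[Editorial note: the memory use follows from how ThreadPoolExecutor works: each submit() call immediately creates a Future and puts a work item on the executor's unbounded internal queue, so submitting 30 million tasks up front keeps roughly 30 million of those objects alive until the worker threads drain them; the per-query timeout does not change that. The sketch below is a hypothetical reconstruction of the general pattern, since the original script is truncated in this digest, and `lookup` is a stand-in.]

```
# Minimal sketch of the problematic pattern (hypothetical): the list
# comprehension materialises ~30 million Future objects, plus queued
# work items, before any result is consumed.
import concurrent.futures
import time


def lookup(name):
    # Stand-in for one DNS query with a ~60 second timeout.
    time.sleep(0.01)
    return name


names = ('host-%d.example.com' % i for i in range(30_000_000))

with concurrent.futures.ThreadPoolExecutor(max_workers=100) as executor:
    futures = [executor.submit(lookup, name) for name in names]
    for fut in concurrent.futures.as_completed(futures):
        fut.result()
```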