Henry Carscadden <hl...@virginia.edu> added the comment:

Hey Tim, just a note of clarification. I was working on an approximation 
algorithm for a network science tool that is due to be released soon. 
Initially I relied on itertools.product(), but as soon as I moved to testing 
on non-trivial graphs I hit out-of-memory errors, even on the large-memory 
nodes of the computing cluster I was using. The issue is that 
itertools.product() materializes each of its input iterables up front, so the 
memory cost grows with the number of iterables passed to product(), even 
though the actual number of elements in each iterable is relatively small.
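For anyone reproducing this, here is a minimal sketch (mine, not from the 
original code) showing that product() consumes its inputs eagerly, before a 
single result has been requested:

```python
import itertools

pulled = []

def noisy(n):
    # Records each element as it is pulled from the generator.
    for i in range(n):
        pulled.append(i)
        yield i

# product() materializes every input iterable into a tuple at
# construction time -- before a single result has been requested.
p = itertools.product(noisy(3), noisy(3))
print(len(pulled))  # -> 6: both generators fully consumed already
```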
Here is the workaround that I used to handle many arguments:

from typing import Iterable

def product(*args) -> Iterable:
    '''
    Take several indexable sequences and return a generator that
    yields the cartesian product of their elements.
    :param args: sequences you would like the cartesian product of
    :return: generator yielding one list per combination
    '''
    # An empty sequence anywhere means an empty product, matching
    # itertools.product().
    if not args or any(len(arg) == 0 for arg in args):
        return
    num_args = len(args)
    indices = [0] * num_args  # one cursor per input sequence
    while indices[0] < len(args[0]):
        # Emit the combination selected by the current cursors.
        yield [args[i][indices[i]] for i in range(num_args)]
        # Advance like an odometer: bump the last cursor and carry
        # left whenever a cursor overflows its sequence.
        indices[-1] += 1
        for i in range(1, num_args):
            if indices[-i] >= len(args[-i]):
                indices[-i] = 0
                indices[-i - 1] += 1
            else:
                break
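
A related trick, in case it helps others hitting this: when only the first 
argument is the huge one, you can keep just that argument lazy and let 
itertools.product() handle the rest. A sketch (product_lazy_first is my own 
name, not a stdlib function):

```python
import itertools

def product_lazy_first(first, *rest):
    # Only the first iterable can stay lazy: every other iterable is
    # revisited once per element of `first`, so those must be
    # materialized into tuples anyway.
    pools = [tuple(pool) for pool in rest]
    for head in first:
        for tail in itertools.product(*pools):
            yield (head,) + tail
```

This only helps when the first iterable dominates the memory cost; the 
remaining iterables still have to be materialized because the product must 
iterate over them repeatedly.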

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue40230>
_______________________________________