
Author tim.peters
Recipients Dennis Sweeney, Henry Carscadden, rhettinger, tim.peters
Date 2020-04-10.16:43:50
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <>
Henry, no, I see no problem while running your example.  It's been running on my box for over 5 minutes now, and memory use remains trivial.

Note that in the code I posted for you, instead of [1, 2] I used range(100), and instead of 50 I used a million:  the same kind of thing, but I used _far_ larger numbers.  And still didn't have a problem.  Although, yes, that did consume about a gigabyte to materialize a million instances of tuple(range(100)) under the covers.
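As a back-of-the-envelope check on that gigabyte figure (assuming 64-bit CPython, where each tuple slot is an 8-byte pointer and the small ints in range(100) are cached and shared, so the tuple containers are essentially the whole cost):

```python
import sys

# One materialized argument: tuple(range(100)).
t = tuple(range(100))

# getsizeof reports the container only: a fixed header plus
# 100 pointer slots of 8 bytes each, roughly 850 bytes.
per_tuple = sys.getsizeof(t)

# A million such tuples, as product() materializes under the covers,
# lands right around the gigabyte observed.
total_gib = per_tuple * 1_000_000 / 2**30
print(f"{per_tuple} bytes each, ~{total_gib:.2f} GiB for a million")
```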

The code you posted also works fine for me, and with minor memory consumption, if I replace your "50" with "1000000":

>>> from itertools import product
>>> many_arguments = [[1, 2] for i in range(1000000)]
>>> for term in product(*many_arguments):
...     pass

100% of a CPU is used for as long as I let it run, but memory use jumps just a little at the start.  Which is what I expected.  It just doesn't take all that much memory to create a million 2-tuples - which is what the `product()` implementation does.
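To make that "jumps just a little at the start" behavior concrete, here's a small sketch (the `tracked` generator is mine, purely for illustration) showing that `product()` drains all of its input iterables up front but produces the actual Cartesian product lazily:

```python
from itertools import product

consumed = []  # records which inputs have been fully drained

def tracked(i):
    # A stand-in for one [1, 2] argument that notes when it
    # has been exhausted.
    yield from (1, 2)
    consumed.append(i)

it = product(*(tracked(i) for i in range(5)))

# product() materialized all five inputs as tuples immediately,
# before a single result was asked for ...
print(len(consumed))  # 5

# ... but the 2**5 results themselves are generated lazily:
print(next(it))  # (1, 1, 1, 1, 1)
```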

I believe you saw a MemoryError, but at this point I have to guess you misdiagnosed the cause.  In any case, if you can't supply an example that reproduces the problem, we're going to have to close this.

Perhaps there's some weird memory-allocation flaw on the _platform_ you're using?  Extreme fragmentation?  Without an example that exhibits a problem, there's just no way to guess from here :-(
Date	User	Action	Args
2020-04-10 16:43:50	tim.peters	set	recipients: + tim.peters, rhettinger, Dennis Sweeney, Henry Carscadden
2020-04-10 16:43:50	tim.peters	set	messageid: <>
2020-04-10 16:43:50	tim.peters	link	issue40230 messages
2020-04-10 16:43:50	tim.peters	create