This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Title: Long generator chain causes segmentation fault
Type: crash
Stage: resolved
Components: Interpreter Core
Versions: Python 3.6
Status: closed
Resolution: not a bug
Dependencies:
Superseder:
Assigned To:
Nosy List: karzes, terry.reedy, tim.peters
Priority: normal
Keywords:

Created on 2020-10-02 18:01 by karzes, last changed 2022-04-11 14:59 by admin. This issue is now closed.

File name Uploaded Description
(file name not preserved) karzes, 2020-10-02 18:01 Causes a seg fault for large depth
(file name not preserved) karzes, 2020-10-02 21:09 Causes a seg fault for large depth
Messages (9)
msg377826 - (view) Author: Tom Karzes (karzes) Date: 2020-10-02 18:01
If I create a sufficiently long chain of generators, I encounter a segmentation fault.  For example, the following works as expected:

    % ./ 10000

But for sufficiently larger chain lengths, it seg faults:

    % ./ 20000
    Segmentation fault (core dumped)


    % ./ 100000
    Segmentation fault (core dumped)

The exact point where it seg faults seems to vary slightly between different invocations of Python, but the range is very narrow for a given Python installation.  I believe the difference is due to slight variations in memory use at startup.

I can't see any reason why this should happen, and in any case, if there is some limit that I'm exceeding, it should raise an exception rather than core dump.
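The attached script itself was not preserved in this archive. Under the assumption that it builds a chain of nested generators with `yield from` and raises the recursion limit (as the rest of the thread indicates), a minimal reconstruction might look like:

```python
# Hypothetical reconstruction of the unnamed attachment: build a chain of
# `depth` nested generators, then pull one value through the whole chain.
# The first next() descends through every generator in the chain, stacking
# one C frame per generator -- which is what eventually overflows the C stack.
import sys

def chain(depth):
    if depth == 0:
        yield 0
    else:
        yield from chain(depth - 1)

if __name__ == "__main__":
    try:
        n = int(sys.argv[1])   # e.g. ./script 20000
    except (IndexError, ValueError):
        n = 1000
    sys.setrecursionlimit(n + 100)  # the report mentions using 100100
    print(next(chain(n)))
```

Each `yield from` resumption re-enters the interpreter, so resuming the whole chain consumes C stack proportional to the chain length.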

I'm using:

    3.6.9 (default, Jul 17 2020, 12:50:27)
    [GCC 8.4.0]

on a 64-bit Ubuntu Linux system.

Additional info:  A friend of mine is running 3.7.9 on a Windows system.  In his case, the symptom is that the program produces no output for a sufficiently long generator chain (presumably it's silently crashing).  Additionally, he encounters the problem with much shorter generator chains than I do.  I suspect it's the same underlying problem.
msg377832 - (view) Author: Tim Peters (tim.peters) * (Python committer) Date: 2020-10-02 20:01
The docs are already clear that you play with `setrecursionlimit()` at your own risk:

Set the maximum depth of the Python interpreter stack to limit. This limit prevents infinite recursion from causing an overflow of the C stack and crashing Python.

The highest possible limit is platform-dependent. A user may need to set the limit higher when they have a program that requires deep recursion and a platform that supports a higher limit. This should be done with care, because a too-high limit can lead to a crash.

If you see a crash instead of a `RecursionError` exception when you _don't_ shoot yourself in the foot this way, that might be interesting. But, as is, you're deliberately overriding a mechanism designed to prevent these kinds of crashes.
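The mechanism Tim describes can be sketched in a few lines: with the default limit left alone, deep recursion is cut off with a catchable `RecursionError` rather than a C stack overflow.

```python
import sys

def recurse(n):
    # Plain recursion, one Python frame per level.
    return 0 if n == 0 else recurse(n - 1)

print(sys.getrecursionlimit())  # typically 1000 by default

try:
    recurse(10**6)  # far beyond the default limit
except RecursionError:
    print("RecursionError raised, interpreter intact")
```

Raising the limit past what the platform's C stack can hold is exactly what converts this clean exception into a crash.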
msg377834 - (view) Author: Tom Karzes (karzes) Date: 2020-10-02 20:22
That is a good point, except I don't believe the value needed to expose this bug is a "too-high limit" (as the documentation calls it).  I set it to 100100 for convenience, but in practice even a value of 17000 is more than enough to expose the bug on my system (it occurs at around 16500).  For my friend using Windows, a value as low as 4000 suffices, which I don't think anyone would argue is unreasonably high.

The default value of 1000 is extremely low, and is intended to catch recursion bugs in programs that are not expected to recurse very deeply.  For a program that genuinely needs recursion, I don't think a value of 20000, or even 100000, is unreasonable given today's typical memory sizes (and when I run my failing case, the memory usage is so low as to be inconsequential).  By my interpretation, these limits should be well within the range that Python can handle.

It seems likely to me that in this case, the problem isn't due to any kind of system limit, but is rather the result of a logical error in the implementation which is somehow exposed by this test.  Hopefully a developer will take advantage of this test case to fix what I believe is a serious bug.
msg377836 - (view) Author: Tim Peters (tim.peters) * (Python committer) Date: 2020-10-02 20:48
There is no way in portable ANSI C to deduce a "safe" limit. The limits that exist were picked by hand across platforms, to be conservative guesses at what would "never" break.

You're allowed to increase the limit if you think you know better - and you may! No two platforms are the same. But - again - you do so at your own risk.  There is no bug here unless you can provoke a crash _without_ overriding the default limit.

I was there when the limit was introduced. It was introduced precisely to avoid the vast array of strange system failures that can occur when the C stack overflows - and, again, there is no portable way to detect that "it's getting close" in ANSI C.  Stack sizes provided by platform C implementations are all over the place, from kilobytes to megabytes, and can also be different yet again for threads, and there is no portable way in C to find out what they are.

> For my friend using Windows, a value as low as 4000 suffices, which I don't think anyone would argue is unreasonably high.

For the defaults provided by Microsoft's Visual C compiler (which CPython uses), it is unreasonably high - the C stack overflows.  In fact, under the released Python 3.8.5 on my 64-bit Windows, the largest value for which your program isn't obviously broken is about 2512.  If I don't override the default recursion limit, I cannot provoke a problem with your program on Windows (instead it dies - as designed and intended - with a `RecursionError` exception).
msg377837 - (view) Author: Tom Karzes (karzes) Date: 2020-10-02 21:09
I tested this some more, and one thing became clear that I hadn't realized before:  This bug has nothing to do specifically with generators (as I had thought), but is in fact due purely to the recursion limit.

I created a recursive test program that doesn't use generators at all, and ran into the exact same problem (at only a slightly higher recursion level).  I am attaching that test case.  Here's a sample failing invocation:

    % ./ 20000
    Segmentation fault (core dumped)

On my system, the cutoff point for this one seems to be around 17600, roughly 1100 higher than for the generator example.
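The second attachment was also not preserved here; a plausible reconstruction, assuming it mirrors the generator version with plain recursion, would be:

```python
# Hypothetical reconstruction of the second (unnamed) attachment: the same
# failure mode with ordinary recursion, no generators involved.
import sys

def descend(depth):
    if depth == 0:
        return 0
    return descend(depth - 1)

if __name__ == "__main__":
    try:
        n = int(sys.argv[1])   # e.g. ./script 20000
    except (IndexError, ValueError):
        n = 1000
    sys.setrecursionlimit(n + 100)  # lift the Python-level guard...
    print(descend(n))               # ...so a large n can reach the C stack limit
```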

What surprises me about this is that the Python implementation uses C recursion to implement recursion in Python apps.  I would have thought it would use heap memory for something like this, and as a result require only a fixed amount of stack.  That would clearly have major advantages, since it would decouple recursion limits in Python code from stack limits on the platform.

On the other hand, now that I understand that this is in fact the result of a system limit, I was *very* easily able to work around the problem!  I simply disabled the stack limit.  From csh:

    % unlimit stacksize


    % ./ 100000


    % ./ 100000

So you were right, this was due solely to the default stack limit on my system, and not a bug in the Python implementation.  And it was also very easy to remove that limit.  Hopefully something similar can be done on Windows (which I know very little about).

Thank you for your help!
msg377848 - (view) Author: Terry J. Reedy (terry.reedy) * (Python committer) Date: 2020-10-03 02:05
Another "We are not responsible, proceed at your own risk" operation is importing ctypes, which allows one to overwrite bytes in the running Python.
msg377849 - (view) Author: Tim Peters (tim.peters) * (Python committer) Date: 2020-10-03 03:31
Right, generators played no essential role here. Just one way of piling up a tall tower of C stack frames.

Search the web for "stackless Python" for the history of attempts to divorce the CPython implementation from the platform C stack.

There are ways to increase the main thread's stack size on Windows too using MSVC, but unless someone is extremely knowledgeable and willing to write some assembler to help out, they need to change it via a Windows linker option when the .exe is built, or via the EDITBIN developer tool (to modify the .exe file).
msg377852 - (view) Author: Tom Karzes (karzes) Date: 2020-10-03 05:10
Thanks Tim and Terry.  Stackless Python sounds interesting.  It's nice to know that others had the same idea I did, although I tend to shy away from exotic variants since they tend to be less well-supported.  Any chance that CPython will go stackless at some point in the future?

By the way, I saw that I can increase the process stack limit from within a Python app:

    import resource
    resource.setrlimit(resource.RLIMIT_STACK,
                       (resource.RLIM_INFINITY, resource.RLIM_INFINITY))

I tried it, and it works (on my Linux system), but of course is unavailable on Windows systems.
msg377990 - (view) Author: Tim Peters (tim.peters) * (Python committer) Date: 2020-10-05 02:01
"Stackless" is a large topic with a convoluted history. Do the web search. In short, no, it will never go in the core - too disruptive to too many things. Parts have lived on in other ways, watered down versions. The PyPy project captured most of what remains, as optional features. CPython's generators captured the easiest part: after a yield, the C stack space the generator consumed is released, while the Python VM stack space lives on in the heap awaiting possible resumption (but, if/when it's resumed, C stack space is required again).
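The split Tim describes, with Python frame state on the heap and C stack used only during resumption, can be illustrated with a small sketch (the counts here are arbitrary):

```python
def gen():
    yield 1
    yield 2

# 100,000 paused generators coexist at once: while suspended at a yield,
# each holds only heap-allocated frame state, no C stack space.
gens = [gen() for _ in range(100_000)]

# Each resume briefly uses C stack, then releases it again at the next yield.
assert all(next(g) == 1 for g in gens)
assert all(next(g) == 2 for g in gens)
print("ok")
```

This is why merely *holding* a huge number of generators is cheap; it is the nested resumption of a long `yield from` chain that piles up C frames.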
Date User Action Args
2022-04-11 14:59:36    admin          set       github: 86078
2020-10-05 02:01:18    tim.peters     set       messages: + msg377990
2020-10-03 05:10:32    karzes         set       messages: + msg377852
2020-10-03 03:31:29    tim.peters     set       messages: + msg377849
2020-10-03 02:05:02    terry.reedy    set       status: open -> closed
                                                nosy: + terry.reedy
                                                messages: + msg377848
                                                resolution: not a bug
                                                stage: resolved
2020-10-02 21:09:58    karzes         set       files: +
                                                messages: + msg377837
2020-10-02 20:48:44    tim.peters     set       messages: + msg377836
2020-10-02 20:22:14    karzes         set       messages: + msg377834
2020-10-02 20:01:37    tim.peters     set       nosy: + tim.peters
                                                messages: + msg377832
2020-10-02 18:01:25    karzes         create