input() doesn't catch _PyUnicode_AsString() exception; io.StringIO().encoding is None #52503
Comments
I'm getting a "TypeError: bad argument type for built-in operation" from a print() call with no arguments. This happens in both 3.1 and 3.1.2 (I haven't tried 3.1.1). I've narrowed the problem down to a very small demo program that reproduces the bug: run "python3.1 bug.py" and hit <ENTER> at the "prompt:". Removing the doctest call (and calling "foo" directly) avoids the error; so does removing the "input" call (while leaving the doctest call in). The startup banner on my python3.1 is: I compiled Python 3.1.2 with ./configure, make, make altinstall, without any options. I'm running Ubuntu 9.04 with the 2.6.28-18-generic (32-bit) kernel.
Confirmed. There's something wrong around the doctest._SpoofOut class. Output:

$ ./python issue8256_case.py
prompt:
Traceback (most recent call last):
  File "issue8256_case.py", line 13, in <module>
    foo()
  File "issue8256_case.py", line 7, in foo
    print()
TypeError: bad argument type for built-in operation
The bug is triggered by input, not by print. The exact place is _PyUnicode_AsStringAndSize, where the unicode check happens. print then checks PyErr_Occurred and picks up this stale error. Either the error should not be raised, or it should be cleared when input finishes. I'd love to provide a patch, but I have no idea what should be corrected or how. If someone would tutor me a little, I would be very happy to learn and code this.
Right. It does not involve doctest.

import io, sys
original_stdout = sys.stdout
try:
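The snippet above is truncated in the migrated issue. A minimal sketch of the same pattern, reconstructed from the discussion (assumption: this is not the original issue8256_case.py, just an illustration of the same stdout-swapping idea):

```python
import io
import sys

# Minimal sketch (assumption: reconstructed from this discussion, not the
# original repro file): swap sys.stdout for a StringIO, call input(), then
# print(). StringIO carries no usable .encoding, so on the affected 3.1
# builds input() left a stale error that later surfaced in print() as
# "TypeError: bad argument type for built-in operation". On fixed
# interpreters this runs cleanly.
original_stdout = sys.stdout
original_stdin = sys.stdin
try:
    sys.stdout = io.StringIO()
    sys.stdin = io.StringIO("hello\n")
    answer = input("prompt: ")
    print(answer)
    captured = sys.stdout.getvalue()
finally:
    sys.stdout = original_stdout
    sys.stdin = original_stdin
```

Because the replacement stdin is not a tty, input() takes its non-interactive fallback: the prompt is written to the fake stdout and the line is read back, so `captured` holds both the prompt and the echoed answer.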
The problem occurs in this line in bltinmodule.c:

po = PyUnicode_AsEncodedString(stringpo,
        _PyUnicode_AsString(stdout_encoding), NULL);

Here _PyUnicode_AsString returns NULL, since stdout_encoding is Py_None, which doesn't pass PyUnicode_Check in _PyUnicode_AsStringAndSize. What object could be passed to _PyUnicode_AsStringAndSize instead? Is there some default 'utf-8' encoding object?
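The same failure mode can be mirrored at the Python level; a hedged analogue (illustrative only, not the C path itself), showing that None is simply not a usable codec name:

```python
# Python-level analogue (assumption: illustrative only, not the C path):
# str.encode() needs a string codec name, just as _PyUnicode_AsString()
# needs a unicode object; handing either of them None fails with a
# TypeError of the same family.
try:
    "prompt: ".encode(None)
    raised = False
except TypeError:
    raised = True
```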
Whatever the solution to this issue is, it certainly looks like a bug that the return value of that function isn't being checked for errors.
I have written a small patch that solves the problem, but it is disgusting. Could anyone tell me how I can get some default encoding from the Python internals (I have no idea where to look) and return it inside _PyUnicode_AsStringAndSize? Anyway, now when the error happens inside input, it raises an exception properly. So I only need to know how to correct the bug in an elegant fashion.
Ok, I have found Py_FileSystemDefaultEncoding and use it, though I had to cast it to (char *) because it's a const char *. Maybe I could do this better?
I have read that I shouldn't use Py_FileSystemDefaultEncoding directly and should rather use PyUnicode_GetDefaultEncoding, so I have changed the code a little.
Bump! Is anything happening with this bug? Is my patch any good, or should I try to work on something different?
Victor, you've been dealing with Python's default encoding lately; care to render an opinion on the correct fix for this bug? @filip: the patch will need a unit test, which will also help with assessing the validity of the fix.
I'll try to code a small test this evening.
The patch is wrong: _PyUnicode_AsString(Py_None) should not return "utf8"! Since PyOS_Readline() writes the prompt to stderr, I suggest the conversion use the encoding of stderr.
Amaury, could you elaborate a little more on this? I am pretty new to all this, and I would happily write the patch if you could give me some clue on how to approach it.
This issue is directly related to bpo-6697.

The first problem is that the builtin input() function doesn't check that the _PyUnicode_AsString() result is not NULL. The second problem is that io.StringIO().encoding is None. I don't understand why it is None when it actually uses utf8 (it calls the TextIOWrapper constructor with encoding="utf8" and errors="strict").

It will be difficult to write a unit test because the issue only occurs if stdin and stdout are TTYs: input() calls PyOS_Readline(stdin, stdout, prompt).

@gruszczy: Your patch is just a workaround, not the right fix. The problem should be fixed in input(), not in the PyUnicode methods. _PyUnicode_AsString() expects a unicode argument; it should raise an error if the argument is None (and not return a magical value).
Here is a patch catching the _PyUnicode_AsString() error. input() uses sys.stdout.encoding to encode the prompt to a byte string, but I don't know which encoding should be used if sys.stdout.encoding is None (e.g. StringIO() of the _io module has no encoding because it stores unicode characters).
Since the prompt is written to stderr, why is sys.stdout.encoding used instead of sys.stderr.encoding?
amaury> since the prompt is written to stderr, why is sys.stdout.encoding
amaury> used instead of sys.stderr.encoding?

input() calls PyOS_Readline(), but PyOS_Readline() has multiple implementations; call_readline() calls rl_callback_handler_install() with the prompt. I don't think that it really matters that the prompt is written to stderr.
Confirmed in 3.3. |
A patch similar to input_stdout_encoding.patch has been applied to 3.2 and 3.3 for the issue bpo-6697: see changeset 846866aa0eb6. |
input_stdout_none_encoding.patch uses UTF-8 if sys.stdout.encoding is None. |
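The idea of that patch can be sketched at the Python level (assumption: illustrative only, the real fix lives in C in bltinmodule.c, and `prompt_encoding` is a hypothetical helper, not a CPython API):

```python
import io

def prompt_encoding(stdout):
    # Hypothetical helper (not CPython API) sketching the patch's idea:
    # trust sys.stdout.encoding only when it is actually a string,
    # and fall back to UTF-8 otherwise.
    encoding = getattr(stdout, "encoding", None)
    return encoding if isinstance(encoding, str) else "utf-8"

# StringIO exposes no usable encoding, so the UTF-8 fallback kicks in.
data = "prompt: ".encode(prompt_encoding(io.StringIO()))
```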
Replicated this issue on Python 3.3b2. The cause is the 'encoding' and 'errors' attributes on io.StringIO() being None. Doctest replaces sys.stdout with a StringIO subclass. The exception raised is still a TypeError. At this point I'm unsure what the fix should be:
Agreed with Amaury. |
Uploaded a new patch that uses the encoding and errors from stderr if the stdout values are not valid unicode. It includes a unit test in test_builtin.py. With this patch I am no longer able to replicate the issue.
I would fall back to PyFile_WriteObject(prompt, fout, Py_PRINT_RAW) if stdout has no encoding attribute or it is not a string.
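That suggested control flow can be sketched in Python (assumption: names and structure are illustrative; the real code is C inside builtin_input):

```python
import io

def read_with_prompt(prompt, fin, fout):
    # Hypothetical sketch (not CPython's actual C code) of the suggested
    # control flow: only the tty path needs a string-valued .encoding on
    # fout; anything else falls back to a raw write of the prompt plus a
    # plain readline, the moral equivalent of
    # PyFile_WriteObject(prompt, fout, Py_PRINT_RAW).
    encoding = getattr(fout, "encoding", None)
    if not isinstance(encoding, str):
        fout.write(prompt)
        fout.flush()
        return fin.readline().rstrip("\n")
    raise NotImplementedError("tty/readline path omitted from this sketch")

out = io.StringIO()  # no usable .encoding -> takes the raw fallback
line = read_with_prompt("prompt: ", io.StringIO("answer\n"), out)
```

A stream without a usable encoding never reaches the encoding step at all, which is exactly why the cryptic TypeError disappears.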
Here is a patch.
Serhiy, your patch looks like a worthwhile improvement because it adds proper error checking and handling. However, I suspect this original bug is actually a side effect of bpo-24402. The code in question shouldn't even be running, because sys.stdout is not the original output file descriptor and is not a terminal.
I just recently discovered this myself. In the process of debugging the issue I also noticed the same bug that is now fixed via bpo-24402. While I agree that bpo-24402 mostly mitigates the issue, I think this patch is still worthwhile, as the current behavior still leads to cryptic, hard-to-debug errors. For example (although this is not great code, bear with me...) one could write a stdout wrapper like:

>>> class WrappedStream:
...     encoding = 'utf8'
...     errors = None
...     def __getattr__(self, attr):
...         return getattr(sys.__stdout__, attr)
...
>>> sys.stdout = WrappedStream()
>>> sys.stdout.fileno()
1
>>> sys.stdout.isatty()
True
>>> sys.stdout.errors
>>> input('Prompt: ')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: bad argument type for built-in operation

This still goes down the path for ttys, but because the 'errors' attribute does not defer to the underlying stream it still leads to a hard-to-debug exception. To be clear, I think the above code *should* break, just not as cryptically.
Actually, I see now that Serhiy's patch would allow this example to just pass through to the non-interactive fallback. So I take back my claim that the example should break; I think using the fallback would also be fine.