Use PyUnicode_AsWideCharString() instead of PyUnicode_AsUnicode() #66520
I would like to deprecate PyUnicode_AsUnicode(); see bpo-22271 for the rationale (hint: memory footprint). To deprecate PyUnicode_AsUnicode(), we first have to stop using it internally. The attached patch is a work in progress, tested only on Linux, not yet on Windows. It gives an idea of how many files need to be modified. TODO:
wchar_posixmodule.patch: patch for posixmodule.c. Sorry, the code calling PyUnicode_AsUnicode() was in fact not generated by Argument Clinic.
Won't this cause a performance regression? When we work heavily with the wchar_t-based API, caching the encoded value looks like a good idea.
Yes, it will be slower, but I prefer slower code with a lower memory footprint. On UNIX, I don't think that anyone will notice the difference.

My concern is that the cache is never released: if the conversion is only needed once at startup, the memory stays until Python exits. That is not really efficient.

On Windows, conversion to wchar_t* is common because Python uses the Windows wide character API (the "W" API, as opposed to the "A" ANSI code page API). For example, most filesystem accesses use the wchar_t* type. On Python < 3.3, Python was compiled in narrow mode, so Unicode strings already stored their characters as wchar_t* internally. Since Python 3.3, Python uses a more compact representation: the wchar_t* cache can share the Unicode data only if sizeof(wchar_t) == KIND, where KIND is 1, 2 or 4 bytes per character. Examples: "\u20ac" on Windows (16-bit wchar_t) or "\U0010ffff" on Linux (32-bit wchar_t).
The cache is released when the string is released. While the string exists, its wchar_t representation may be needed again; that is what the cache exists for.
I know. But I don't want to waste memory on this cache; I want to stop using it. IMO the performance overhead will be negligible. In which use case do you think the overhead of not using the cache would matter?
I suppose in the same use cases where the memory overhead of using the cache matters. We need measurements of the performance and memory-consumption effects of these changes across a wide range of programs.