
Python does not provide PyLong_FromIntMax_t() or PyLong_FromUintMax_t() function #62070

Open
ssbr mannequin opened this issue Apr 29, 2013 · 39 comments
Labels
interpreter-core (Objects, Python, Grammar, and Parser dirs) topic-C-API type-feature A feature request or enhancement

Comments

@ssbr
Mannequin

ssbr mannequin commented Apr 29, 2013

BPO 17870
Nosy @mdickinson, @pitrou, @vstinner, @benjaminp, @ssbr, @serhiy-storchaka, @1st1, @MojoVampire
PRs
  • [WIP] bpo-17870: Add support for C intmax_t type #857
  • convert from long long to PyLong losslessly #1106
Files
  • intmax_t.patch
  • use_intmax_t.patch
  • intmax_t-2.patch
  • intmax_t-3.patch
  • intmax_t-4.patch

  Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.


    GitHub fields:

    assignee = None
    closed_at = None
    created_at = <Date 2013-04-29.20:57:04.995>
    labels = ['type-feature']
    title = 'Python does not provide PyLong_FromIntMax_t() or PyLong_FromUintMax_t() function'
    updated_at = <Date 2017-06-15.14:08:50.685>
    user = 'https://github.com/ssbr'

    bugs.python.org fields:

    activity = <Date 2017-06-15.14:08:50.685>
    actor = 'Devin Jeanpierre'
    assignee = 'none'
    closed = False
    closed_date = None
    closer = None
    components = []
    creation = <Date 2013-04-29.20:57:04.995>
    creator = 'Devin Jeanpierre'
    dependencies = []
    files = ['30082', '30085', '30240', '30745', '45641']
    hgrepos = []
    issue_num = 17870
    keywords = ['patch']
    message_count = 39.0
    messages = ['188103', '188105', '188106', '188111', '188112', '188114', '188118', '188126', '188142', '188149', '188168', '188189', '188194', '188196', '188689', '188704', '189051', '189066', '189418', '190693', '190696', '192163', '192182', '229533', '229535', '281706', '281707', '281708', '290684', '296040', '296042', '296043', '296054', '296066', '296089', '296093', '296094', '296095', '296096']
    nosy_count = 8.0
    nosy_names = ['mark.dickinson', 'pitrou', 'vstinner', 'benjamin.peterson', 'Devin Jeanpierre', 'serhiy.storchaka', 'yselivanov', 'josh.r']
    pr_nums = ['857', '1106']
    priority = 'normal'
    resolution = None
    stage = 'resolved'
    status = 'open'
    superseder = None
    type = 'enhancement'
    url = 'https://bugs.python.org/issue17870'
    versions = ['Python 3.5']

    @ssbr
    Mannequin Author

    ssbr mannequin commented Apr 29, 2013

    As far as I can tell, the only safe and correct way to convert a (for example) intptr_t to a python long is something akin to the following:

        size_t repsize = sizeof(intmax_t)*8 + 2;
        char i_buf[repsize]; // enough to fit base 2 with sign and NUL, let alone base 10
        snprintf(i_buf, repsize, "%jd", (intmax_t) myinteger);
        return PyLong_FromString(i_buf, NULL, 10);

    Does this not seem absurd?

    PyLong_FromIntMax_t(myinteger) would be great. Or maybe even better would be PyLong_FromBytes(&myinteger, sizeof(myinteger)) ?

    This is problematic because many things that can interact with the Python C-API do not use the C types like "long long" or "int". Instead they might use the equivalents of intptr_t and int32_t, which are more reliably sized.

    @vstinner
    Member

    Did you try PyLong_FromVoidPtr()?

    @ssbr
    Mannequin Author

    ssbr mannequin commented Apr 29, 2013

    PyLong_FromVoidPtr works for uintptr_t, but not intptr_t.

    @vstinner
    Member

    "PyLong_FromVoidPtr works for uintptr_t, but not intptr_t."

    Ok correct. You should use something like:

    PyObject*
    PyLong_FromIntptr_t(intptr_t value)
    {
        if (sizeof(intptr_t) == sizeof(long))
            return PyLong_FromLong(value);
        else {
            assert(sizeof(intptr_t) <= sizeof(PY_LONG_LONG));
            return PyLong_FromLongLong((PY_LONG_LONG)value);
        }
    }

    The "if (sizeof(intptr_t) == sizeof(long))" should be optimized by your compiler.

    I don't know if Python should provide such a function. What is your use case for intptr_t?

    It looks like intptr_t is only really used in the implementation of os.spawnv() and os.spawnve(). Extract:

    #if SIZEOF_LONG == SIZEOF_VOID_P
            return Py_BuildValue("l", (long) spawnval);
    #else
            return Py_BuildValue("L", (PY_LONG_LONG) spawnval);
    #endif

    @vstinner vstinner changed the title Hard to create python longs from arbitrary C integers Python does not provide a PyLong_FromIntptr_t() function Apr 29, 2013
    @ssbr
    Mannequin Author

    ssbr mannequin commented Apr 29, 2013

    Isn't it possible for a >64-bit architecture to have intptr_t be wider than long long?

    As for my use-case, I am wrapping the C-API for Rust. Rust can call and be called by C (and therefore Python), but a Rust "int" is a C "intptr_t", and a Rust "uint" is a C "uintptr_t".

    @ssbr
    Mannequin Author

    ssbr mannequin commented Apr 29, 2013

    With regards to the title change, I would prefer a FromIntMax_t (and FromUintMax_t) to a FromIntPtr_t. The former covers every use case of the latter, and more.

    @vstinner vstinner changed the title Python does not provide a PyLong_FromIntptr_t() function Python does not provide PyLong_FromIntMax_t() or PyLong_FromUintMax_t() function Apr 29, 2013
    @vstinner
    Member

    The problem is to support platforms not providing an intmax_t type.

    I don't know if the following C code would be a good "approximation" of the maximum integral value (signed/unsigned).

    #ifdef HAVE_INTMAX_T
    typedef intmax_t        Py_intmax_t;
    typedef uintmax_t       Py_uintmax_t;
     
    #elif defined(HAVE_LONG_LONG) && (SIZEOF_VOID_P <= SIZEOF_LONG_LONG)
    typedef PY_LONG_LONG            Py_intmax_t;
    typedef unsigned PY_LONG_LONG   Py_uintmax_t;

    #elif SIZEOF_VOID_P <= SIZEOF_LONG
    typedef long Py_intmax_t;
    typedef unsigned long Py_uintmax_t;

    #elif SIZEOF_VOID_P <= SIZEOF_INT
    typedef int             Py_intmax_t;
    typedef unsigned int    Py_uintmax_t;

    #else
    # error "Python needs a typedef for Py_intmax_t in pyport.h."
    #endif /* HAVE_INTMAX_T */

    If it is not, conversion from/to other types like off_t, time_t, pid_t or uid_t would lose information.

    It is the same question as yours: is there a platform with an integer type wider than a pointer (intptr_t/void*)?

    @serhiy-storchaka
    Member

    It is the same question as yours: is there a platform with an integer type wider than a pointer (intptr_t/void*)?

    It's x86. sizeof(void*) == 4, sizeof(long long) == 8.

    @mdickinson mdickinson added the type-feature A feature request or enhancement label Apr 30, 2013
    @vstinner
    Member

    > It is the same question as yours: is there a platform with an integer type wider than a pointer (intptr_t/void*)?

    It's x86. sizeof(void*) == 4, sizeof(long long) == 8.

    Ah yes. So "SIZEOF_VOID_P <=" is not a good test. The file position
    (off_t) size can be 64 bits on a 32-bit system (like Linux/i386 and
    Win32), whereas the pointer size is 32 bits.

    I don't think we can rely on the availability of PY_LONG_LONG; it
    may depend on the compiler or even on the compiler options.

    IMO, if we decide to add functions for the intmax_t and uintmax_t types,
    the safest option is to only define the functions if the real C types
    (intmax_t and uintmax_t) are available. I mean, we should not
    guess intmax_t ourselves, or we may choose the wrong type (a type
    that is too small).

    @vstinner
    Member

    Or maybe even better would be PyLong_FromBytes(&myinteger, sizeof(myinteger)) ?

    There is a private function:

    obj = _PyLong_FromByteArray((const unsigned char*)&myinteger, sizeof(myinteger), PY_LITTLE_ENDIAN, is_signed);

    Where PY_LITTLE_ENDIAN is 1 on little-endian platforms, and is_signed should be 0 or 1.

    This function is inefficient (slow). It would be better to have functions for the intmax_t and uintmax_t types. Such functions would be enough to support other "recent" C types: intptr_t, uintptr_t, ptrdiff_t, etc. We may also add support for these types in PyArg_Parse* (ex: with "j" and "J" types, intmax_t and uintmax_t).
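
    As an illustrative sketch only (not part of any patch in this issue): a helper wrapping the private _PyLong_FromByteArray(), assuming the platform provides stdint.h, that PY_LITTLE_ENDIAN is available as described above, and that intmax_t has no padding bits.

        #include <Python.h>
        #include <stdint.h>

        static PyObject *
        intmax_to_pylong(intmax_t value)
        {
            /* Hand the raw bytes of the value to the (private) byte-array
               constructor; the final 1 means the input is signed. */
            return _PyLong_FromByteArray((const unsigned char *)&value,
                                         sizeof(value),
                                         PY_LITTLE_ENDIAN,
                                         1 /* is_signed */);
        }

    As noted above, this is slower than a dedicated conversion, but it avoids any assumption about how intmax_t compares to long or long long.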

    @vstinner
    Member

    Some Google searches told me that no CPU supports 128-bit integers and
    that using a 64-bit integer as intmax_t is safe.

    GCC has a __int128 type, but I didn't find on which platforms it is
    supported, nor whether intmax_t is __int128 in that case.

    Microsoft Visual Studio has stdint.h since its version 2010 (which is
    the version required to compile Python 3.4 on Windows according to the
    Python developer guide).

    I propose a safer definition of Py_intmax_t:

    #ifdef HAVE_UINTMAX_T
    typedef uintmax_t Py_uintmax_t;
    typedef intmax_t Py_intmax_t;
    
    #elif SIZEOF_SIZE_T >= 8
    typedef size_t Py_uintmax_t;
    typedef Py_ssize_t Py_intmax_t;

    #elif defined(HAVE_LONG_LONG) && SIZEOF_LONG_LONG >= 8
    typedef unsigned PY_LONG_LONG Py_uintmax_t;
    typedef PY_LONG_LONG Py_intmax_t;

    #else
    # error "Python needs a typedef for Py_uintmax_t in pyport.h."
    #endif

    I don't think that a fallback on the long type is required; testing
    size_t should be enough.

    At least, the compilation fails if the Py_intmax_t type cannot be defined.

    Having generic PyLong_FromUintMax_t() and PyLong_AsUintMax_t()
    functions (and signed version) would simplify the support of other OS
    types with variable size: time_t, clock_t, pid_t, gid_t, uid_t, off_t,
    etc.

    @vstinner
    Member

    Here is a first patch adding the following functions:

        PyObject* PyLong_FromIntMax_t(intmax_t v);
        PyObject* PyLong_FromUintMax_t(uintmax_t v);
        intmax_t PyLong_AsIntMax_t(PyObject *pylong);
        uintmax_t PyLong_AsUintMax_t(PyObject *pylong);

    I used AC_TYPE_INTMAX_T and AC_TYPE_UINTMAX_T in configure.ac, so intmax_t and uintmax_t are defined if the platform does not provide them. (I guess that these types are defined in pyconfig.h.) On Windows, Visual Studio 2010 supports stdint.h, and so these types are available.

    autoheader && autoconf must be called to regenerate configure script and pyconfig.h.in template.
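
    For illustration, a round-trip using the functions proposed in this patch might look as follows (sketch only; these APIs were never merged, and the error convention of returning -1 with an exception set is assumed to match the existing PyLong_AsLongLong()):

        #include <Python.h>
        #include <stdint.h>

        static int
        roundtrip_intmax(intmax_t in, intmax_t *out)
        {
            PyObject *obj = PyLong_FromIntMax_t(in);      /* proposed API */
            if (obj == NULL)
                return -1;
            *out = PyLong_AsIntMax_t(obj);                /* proposed API */
            Py_DECREF(obj);
            if (*out == -1 && PyErr_Occurred())
                return -1;  /* e.g. OverflowError for out-of-range values */
            return 0;
        }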

    @vstinner
    Member

    See also issue bpo-17884.

    @vstinner
    Member

    Another patch to use PyLong_FromIntMax_t().

    @vstinner
    Member

    vstinner commented May 7, 2013

    @mark: any opinion on my patch?

    @mdickinson
    Member

    I'll take a look.

    @mdickinson mdickinson self-assigned this May 8, 2013
    @mdickinson
    Member

    Some comments for the first patch (I haven't really looked at the second):

    • I would much prefer PyLong_AsIntMax_t not to use nb_int; it should work only for instances of 'int' (just as PyLong_AsSsize_t and PyLong_AsSize_t currently do).

    • There's a missing 'versionadded' for PyLong_AsIntMax_t in the docs.

    • Will AC_CHECK_SIZEOF(intmax_t) work on platforms that don't define intmax_t? I don't know whether the #define created by the earlier AC_TYPE_INTMAX_T is available at that point. We'll probably find out from the buildbots.

    • Do we also need an addition to PC/pyconfig.h to define (u)intmax_t and SIZEOF_(U)INTMAX_T on Windows?

    • For the PyLong_As* functions, it may be more efficient to code the conversion directly instead of using _PyLong_AsByteArray.

    • The PyLong_As* functions assume that intmax_t and uintmax_t have no padding bits, no trap representation, and (in the case of intmax_t) use two's complement. I think it's fine to assume all these things, but we should also either document or test those assumptions (a sketch of such a test follows this list).

    • The patch lacks tests.
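
    A sketch of the kind of check the padding/two's-complement bullet asks for (illustrative only, not from any patch here; it assumes only limits.h, stdint.h and string.h):

        #include <limits.h>
        #include <stdint.h>
        #include <string.h>

        /* Returns 1 if uintmax_t has no padding bits and intmax_t is
           two's complement, 0 otherwise. */
        static int
        intmax_assumptions_hold(void)
        {
            uintmax_t u = UINTMAX_MAX;
            unsigned int bits = 0;
            intmax_t v;

            /* Count the value bits of uintmax_t; with no padding bits they
               must fill every byte reported by sizeof. */
            while (u) {
                bits++;
                u >>= 1;
            }
            if (bits != sizeof(uintmax_t) * CHAR_BIT)
                return 0;

            /* Two's complement: an all-ones byte pattern reads back as -1
               (this also exercises the "no trap representation" assumption). */
            memset(&v, 0xFF, sizeof(v));
            return v == -1;
        }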

    @vstinner
    Member

    Version 2 of my patch:

    Mark> - I would much prefer PyLong_AsIntMax_t not to use nb_int;
    Mark> it should work only for instances of 'int' (just as
    Mark> PyLong_AsSsize_t and PyLong_AsSize_t currently do)."

    I copied code from PyLong_AsLongLong(), but doc from PyLong_AsLong() :-/

    Some PyLong_As*() functions call __int__(), but not all of them? It is a little surprising to have different behaviour, but Mark has longer experience with these APIs, so I trust him :-)

    I changed my code to only accept PyLongObject.

    Mark> There's a missing 'versionadded' for PyLong_AsIntMax_t in the docs.

    fixed

    Mark> Will AC_CHECK_SIZEOF(intmax_t) work on platforms that
    Mark> don't define intmax_t? I don't know whether the #define
    Mark> created by the earlier AC_TYPE_INTMAX_T is available at
    Mark> that point. We'll probably find out from the buildbots.

    I tested with a typo in configure.ac:

    AC_CHECK_SIZEOF(uintmax32_t)

    configure result:

    checking size of uintmax32_t... 0

    pyconfig.h:

    #define SIZEOF_UINTMAX32_T 0

    Should we undefine SIZEOF_UINTMAX32_T (in pyport.h) if its value is zero?

    Mark> Do we also need an addition to PC/pyconfig.h to define (u)intmax_t
    Mark> and SIZEOF_(U)INTMAX_T on Windows?

    Ah yes, I forgot Windows, but I don't have access to a Windows box right now. I modified PC/pyconfig.h, but I cannot test my patch.

    I suppose that intmax_t and uintmax_t don't need to be defined (using typedef) with Visual Studio 2010 or later, since stdint.h is available.

    For the SIZEOF, I chose 64 bits and added a new test in _testcapi (for all platforms). It looks like there is no platform with (hardware) 128-bit integers, and 64 bits on Windows should be correct.

    On Linux 64-bit, __int128 is available, but the size of intmax_t is 64 bits.

    Mark> For the PyLong_As* functions, it may be more efficient to code the conversion directly instead of using _PyLong_AsByteArray.

    I copied code from PyLong_AsLongLong and PyLong_AsUnsignedLongLong. If the code is changed, I would prefer to change the 4 PyLong_As*() functions at the same time. Don't you think so?

    The PyLong_As* functions assume that intmax_t and uintmax_t have no padding bits, no trap representation, and (in the case of intmax_t) use two's complement. I think it's fine to assume all these things, but we should also either document or test those assumptions.

    What is a "trap representation"?

    I only know "two's complement". What are the other kinds?

    How should we test those assumptions?

    The patch lacks tests.

    Which kind of test do you see?

    Would you like to help me to implement this new feature?

    @vstinner
    Member

    Oh, the sqlite3 module has an interesting function:

    PyObject *
    _pysqlite_long_from_int64(sqlite_int64 value)
    {
    #ifdef HAVE_LONG_LONG
    # if SIZEOF_LONG_LONG < 8
        if (value > PY_LLONG_MAX || value < PY_LLONG_MIN) {
            return _PyLong_FromByteArray(&value, sizeof(value),
                                         IS_LITTLE_ENDIAN, 1 /* signed */);
        }
    # endif
    # if SIZEOF_LONG < SIZEOF_LONG_LONG
        if (value > LONG_MAX || value < LONG_MIN)
            return PyLong_FromLongLong(value);
    # endif
    #else
    # if SIZEOF_LONG < 8
        if (value > LONG_MAX || value < LONG_MIN) {
            return _PyLong_FromByteArray(&value, sizeof(value),
                                         IS_LITTLE_ENDIAN, 1 /* signed */);
        }
    # endif
    #endif
        return PyLong_FromLong(value);
    }

    If PyLong_FromIntMax_t() is implemented, this function may be simply removed (and replaced with PyLong_FromIntMax_t).
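
    For comparison, a sketch of what the helper could collapse to with the proposed function (illustrative only; it assumes sqlite_int64 always fits in intmax_t, which holds on every platform discussed here):

        static PyObject *
        _pysqlite_long_from_int64(sqlite_int64 value)
        {
            /* PyLong_FromIntMax_t() is the proposed (never merged) API. */
            return PyLong_FromIntMax_t((intmax_t)value);
        }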

    @vstinner
    Member

    vstinner commented Jun 5, 2013

    Another place where PyLong_FromIntMax_t() would help, _elementtree.c:

        id = PyLong_FromSsize_t((Py_uintptr_t) self);

    selectmodule.c defines PyLong_AsUintptr_t() with:

    #if (SIZEOF_UINTPTR_T != SIZEOF_VOID_P)
    #   error uintptr_t does not match void *!
    #elif (SIZEOF_UINTPTR_T == SIZEOF_LONG_LONG)
    #   define T_UINTPTRT         T_ULONGLONG
    #   define T_INTPTRT          T_LONGLONG
    #   define PyLong_AsUintptr_t PyLong_AsUnsignedLongLong
    #   define UINTPTRT_FMT_UNIT  "K"
    #   define INTPTRT_FMT_UNIT   "L"
    #elif (SIZEOF_UINTPTR_T == SIZEOF_LONG)
    #   define T_UINTPTRT         T_ULONG
    #   define T_INTPTRT          T_LONG
    #   define PyLong_AsUintptr_t PyLong_AsUnsignedLong
    #   define UINTPTRT_FMT_UNIT  "k"
    #   define INTPTRT_FMT_UNIT   "l"
    #elif (SIZEOF_UINTPTR_T == SIZEOF_INT)
    #   define T_UINTPTRT         T_UINT
    #   define T_INTPTRT          T_INT
    #   define PyLong_AsUintptr_t PyLong_AsUnsignedLong
    #   define UINTPTRT_FMT_UNIT  "I"
    #   define INTPTRT_FMT_UNIT   "i"
    #else
    #   error uintptr_t does not match int, long, or long long!
    #endif

    @serhiy-storchaka
    Member

    Another place where PyLong_FromIntMax_t() would help, _elementtree.c:

    id = PyLong_FromSsize_t((Py_uintptr_t) self);
    

    I think PyLong_FromVoidPtr() should be used here.
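
    A minimal sketch of that suggestion (illustrative only): since the value is only used as an identity, PyLong_FromVoidPtr() takes the pointer directly and avoids any signedness or width assumption.

        id = PyLong_FromVoidPtr((void *)self);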

    @vstinner
    Member

    vstinner commented Jul 1, 2013

    Updated patch (version 3), addressing last issues of Mark's review:

    • add a simple unit test in _testcapi: check that PyLong_AsIntMax_t(INTMAX_MIN - 1), PyLong_AsIntMax_t(INTMAX_MAX + 1) and PyLong_AsUintMax_t(UINTMAX_MAX + 1) fail with OverflowError
    • rely on stdint.h on Windows, with a fallback on __int64 for Visual Studio 2008
    • PyLong_AsIntMax_t() and PyLong_AsUintMax_t() compute the result directly (rather than calling _PyLong_AsByteArray)

    I give you one or two days for a last review, and then I'm going to commit the new functions.

    @mdickinson
    Member

    Yes, I apologise; I haven't had time for review. I'll unassign so that someone else can pick this up. It would still be good to have an independent review from someone before this goes in, though.

    @mdickinson mdickinson removed their assignment Jul 2, 2013
    @pitrou
    Member

    pitrou commented Oct 16, 2014

    Victor, have you seen https://code.google.com/p/msinttypes/ ?


    @vstinner
    Member

    New attempt to support the C intmax_t type, since Python 3.6 now supports a subset of C99: intmax_t-4.patch. Changes:

    Add support for the C intmax_t type:

    • Add new conversion functions for PyLong: PyLong_FromIntMax(), PyLong_FromUIntMax(), PyLong_AsIntMax(), PyLong_AsIntMaxAndOverflow(), PyLong_AsUIntMax()
    • getargs: add an 'm' format unit (illustrated in the sketch after this comment)
    • New _testcapi constants: INTMAX_MAX, INTMAX_MIN, UINTMAX_MAX, SIZEOF_INTMAX_T
    • Add _testcapi.getargs_m() and _testcapi.test_long_intmax_api()
    • PyLong_FromVoidPtr() uses PyLong_FromUIntMax()
    • Use intmax_t in various modules

    array, struct, ctypes and memoryview are not modified yet to support intmax_t.

    --

    To ease the review of this large patch, I created a GitHub pull request: #46
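
    For illustration, parsing an intmax_t argument with the proposed 'm' format unit might look like this (sketch only; the 'm' unit and PyLong_FromIntMax() come from this unmerged patch, and their semantics are assumed to mirror the existing 'L'/long long unit):

        #include <Python.h>
        #include <stdint.h>

        static PyObject *
        takes_big_int(PyObject *self, PyObject *args)
        {
            intmax_t value;

            /* 'm' is the format unit added by intmax_t-4.patch. */
            if (!PyArg_ParseTuple(args, "m", &value))
                return NULL;
            return PyLong_FromIntMax(value);  /* proposed conversion function */
        }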

    @vstinner
    Member

    intmax_t-4.patch is a single combined patch for Mercurial. To see individual changes, see the GitHub pull request:
    https://github.com/python/cpython/pull/46/commits

    @vstinner
    Member

    I guess the main question is whether all platforms supported by Python support intmax_t.

    @serhiy-storchaka
    Member

    The patch adds too much code, but there is very little need for the new feature. In all cases where PyLong_FromLong() and PyLong_FromLongLong() are used conditionally, PyLong_FromLongLong() can be used unconditionally. PyLong_FromLong() is used only because support for "long long" used to be optional, and for optimisation.

    Actually there is no supported platform that can't manage with long long and needs intmax_t. We can't test the new feature. While it could be useful in the future, I think this is not the time to add it.

    @vstinner
    Member

    Serhiy Storchaka: "The patch adds too much code, but there is very little need of new feature (...)"

    I wrote my first patch in 2013, but I still fail to find a very good example where intmax_t would be an obvious choice. So I have to agree and I will now close the issue.

    If you still want built-in support for intmax_t in CPython, please come with a strong rationale to justify adding "so much code" ;-) I now close the issue and reject my PR.

    @ssbr
    Mannequin Author

    ssbr mannequin commented Jun 14, 2017

    I wrote my first patch in 2013, but I still fail to find a very good example where intmax_t would be an obvious choice. So I have to agree and I will now close the issue.

    Hold on, nobody ever answered the question in the OP. How would you convert an intptr_t (e.g. Rust's int type) to a Python int?

    You can't use FromVoidPtr because of signedness. You can use FromLongLong, but that's implementation-defined.

    If what we should be using is FromLongLong for all "really big ints", why not just rename FromLongLong to FromIntMax and call it a day?

    There is no standard relationship between long long and most other int types -- all we know is that it's at least 64 bits, but an int type can perfectly reasonably be e.g. 80 bits or 128 bits or similar. I think it *is* a worthwhile goal to allow programmers to write C code that has as little implementation-defined or undefined behavior as possible.

    If that isn't considered a worthwhile goal, maybe we should reconsider using such a dangerous and pointy language as C. :)

    @vstinner
    Member

    Hold on, nobody ever answered the question in the OP. How would you convert an intptr_t (e.g. Rust's int type) to a Python int?

    Write your own C extension to do that. Sorry, I don't know the best way to write such a C extension.

    Maybe look at https://github.com/PyO3/PyO3 ? cc Yury

    @ssbr
    Mannequin Author

    ssbr mannequin commented Jun 15, 2017

    Write your own C extension to do that. Sorry, I don't know the best way to write such a C extension.

    If everyone who wants to convert intptr_t to a python int has to write their own function, then why not just include it in the C-API?

    Having support for intmax_t means we never have to have this conversation ever again, because it should work for all int types.

    Reopening since this use-case doesn't sound solved yet.

    @ssbr ssbr mannequin reopened this Jun 15, 2017
    @serhiy-storchaka
    Member

    It may be better to make _PyLong_FromByteArray() and _PyLong_AsByteArray() public.

    @vstinner
    Member

    Me:

    Write your own C extension to do that. Sorry, I don't know the best way to write such a C extension.

    Devin Jeanpierre:

    If everyone who wants to convert intptr_t to a python int has to write their own function, then why not just include it in the C-API?

    Hum, who else needs such a function except you?

    CPython is maintained by volunteers. Maintenance has a high cost, so we need a strong rationale before adding new features. In this case, it was decided by core developers that the added complexity (amount of code) is not worth it since the use case is too rare.

    Devin:

    Reopening since this use-case doesn't sound solved yet.

    Devin, I asked you for a strong rationale to add the feature. I don't see such a rationale, so this issue will be closed again.

    Serhiy:

    It may be better to make _PyLong_FromByteArray() and _PyLong_AsByteArray() public.

    That makes sense. I suggest opening a new issue for that.

    @ssbr
    Mannequin Author

    ssbr mannequin commented Jun 15, 2017

    Devin, I asked you for a strong rationale to add the feature. I don't see such a rationale, so this issue will be closed again.

    I guess we have different definitions of "strong rationale". Clearer criteria would help.

    > It may be better to make _PyLong_FromByteArray() and _PyLong_AsByteArray() public.
    That makes sense. I suggest opening a new issue for that.

    This request was part of the original bug report, so why open a new issue?

    PyLong_FromIntMax_t(myinteger) would be great. Or maybe even better would be PyLong_FromBytes(&myinteger, sizeof(myinteger)) ?

    @vstinner
    Member

    This request was part of the original bug report, so why open a new issue?

    Making two C functions public is very different from supporting intmax_t. I expect a change of a few lines, whereas my intmax_t patch modified a lot of code.

    I wanted to simplify the C code of CPython; I didn't care about public support for intmax_t (but I considered that it would be nice to have).

    The thing is that portability is complex, and CPython wants to support as many platforms as possible. We only moved to C99 with Python 3.6; before that, stdint.h was not required. There are always minor compatibility issues, so it's simpler not to try to support an uncommon C type (intmax_t), and rather to support a very generic function: "here are bytes, give me a Python int object".

    It would solve your use case as well ;-)

    @ssbr
    Mannequin Author

    ssbr mannequin commented Jun 15, 2017

    Oh, to be clear on this last point:

    Hum, who else needs such a function except you?

    Right now there is no way to convert an int that might be > 64 bits into a Python long, except really bizarre shenanigans, unless we want to rely on implementation-defined behavior.

    This would be fine if it were easy to implement, but it isn't -- as we've both agreed, there's no good way to do this, and it is significantly easier to add this to CPython than to implement this from outside of CPython. And I do think there is merit in writing code that doesn't rely on implementation-defined behavior.

    I also think it's simpler -- imagine if we just didn't care about all these int types! Phew.

    Ack that this isn't "strong rationale" per your standards, so do whatever is right for this bug.

    @ssbr
    Mannequin Author

    ssbr mannequin commented Jun 15, 2017

    Making two C functions public is very different from supporting intmax_t. I expect a change of a few lines, whereas my intmax_t patch modified a lot of code.

    I requested either a way to create from intmax_t, or from bytes. We have two existing functions (that I didn't know about) to do the latter, so it would fix this bug report to just make those public, from my POV.

    @ezio-melotti ezio-melotti transferred this issue from another repository Apr 10, 2022
    @iritkatriel iritkatriel added the interpreter-core (Objects, Python, Grammar, and Parser dirs) label Nov 28, 2023