
Author petr.viktorin
Recipients cstratak, petr.viktorin, vstinner
Date 2020-02-27.15:49:10
Message-id <1582818550.79.0.353165468311.issue39689@roundup.psfhosted.org>
Content
The call:
    struct.unpack('>?', b'\xf0')
means to unpack a "native bool", i.e. native size and alignment. Internally, this does:

    static PyObject *
    nu_bool(const char *p, const formatdef *f)
    {
        _Bool x;
        memcpy((char *)&x, p, sizeof x);
        return PyBool_FromLong(x != 0);
    }

i.e., it copies "sizeof x" bytes (1 byte) of memory into a temporary _Bool variable x, and then reads that variable's value as a _Bool.

While I don't have a copy of the C standard at hand, I believe it says that converting any nonzero value to _Bool yields a single canonical "true" value, so a valid _Bool object only ever holds the representations of false and true. It seems that if a byte holds any other bit pattern, reading it as a _Bool is undefined behavior (a trap representation). Is that correct?
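
To make this concrete, here is a minimal standalone sketch of the same pattern (my own example, not CPython code); whether the read of x is well-defined hinges on whether 0xf0 is a valid object representation for _Bool on the target:

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        unsigned char buf[sizeof(_Bool)] = {0xf0};
        _Bool x;
        /* Same pattern as nu_bool: copy raw bytes into a _Bool. */
        memcpy(&x, buf, sizeof x);
        /* If 0xf0 is not a valid representation for _Bool, reading x
           is undefined; a compiler may legitimately inspect only the
           low-order bit and print 0 here. */
        printf("%d\n", x != 0);
        return 0;
    }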

Clang 10 on s390x seems to take advantage of this: it apparently inspects only the low-order bit(s), so a _Bool with the bit pattern 0xf0 comes out false.
But the tests assume that 0xf0 should unpack to True.
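
If that reading of the standard is right, one way to sidestep the problem would be to compare the raw bytes directly instead of reading them through a _Bool lvalue. A sketch only (assuming a zero-valued _Bool is represented as all-zero bytes):

    static PyObject *
    nu_bool(const char *p, const formatdef *f)
    {
        /* Test the bytes themselves; no possibly-trapping _Bool
           value is ever read. */
        const _Bool zero = 0;
        return PyBool_FromLong(memcmp(p, &zero, sizeof zero) != 0);
    }

This keeps the native size of _Bool while never materializing a _Bool with an arbitrary bit pattern.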