
Author tim.peters
Recipients Jeffrey.Kintscher, mark.dickinson, pablogsal, rhettinger, tim.peters, veky
Date 2020-08-09.17:02:05
Message-id <1596992525.72.0.853945798024.issue41458@roundup.psfhosted.org>
Content
More extensive testing convinces me that pairing multiplication is no real help at all - the error distributions appear statistically indistinguishable from left-to-right multiplication.
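
For concreteness, a minimal sketch of the kind of comparison meant here (illustrative only, not the actual test harness; the function names are made up, and the exact product is computed with Fraction as the reference):

from fractions import Fraction
import random

def prod_ltr(xs):
    # plain left-to-right multiplication
    p = 1.0
    for x in xs:
        p *= x
    return p

def prod_paired(xs):
    # multiply adjacent pairs repeatedly until one value remains
    xs = list(xs)
    while len(xs) > 1:
        nxt = [xs[i] * xs[i+1] for i in range(0, len(xs) - 1, 2)]
        if len(xs) % 2:
            nxt.append(xs[-1])
        xs = nxt
    return xs[0] if xs else 1.0

def rel_error(approx, xs):
    # exact product via Fraction; relative error of the float result
    exact = 1
    for x in xs:
        exact *= Fraction(x)
    return float((Fraction(approx) - exact) / exact) if exact else 0.0

xs = [random.uniform(0.5, 2.0) for _ in range(1000)]
print(rel_error(prod_ltr(xs), xs), rel_error(prod_paired(xs), xs))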

I believe this has to do with the "condition numbers" of fp addition and multiplication, which are poor for fp addition and good for fp multiplication.  Intuitively, fp addition systematically loses mounds of information whenever two addends in different binades are added (the low-order bits of the addend in the binade closer to 0 are lost).  But the accuracy of fp mult couldn't care less which binades the inputs are in, provided only that the result doesn't overflow or become subnormal.
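
A quick illustration at the interactive prompt:

>>> (2.0**53 + 1.0) - 2.0**53   # the 1.0 addend is completely absorbed
0.0
>>> 2.0**500 * 3.0 / 2.0**500   # exact despite wildly different binades
3.0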

For "random" vectors, pairing summation tends to keep addends close together in magnitude, which is "the real" reason it helps.  Left-to-right summation tends to make the difference in magnitude increase as it goes along (the running sum keeps getting bigger & bigger).

So, red herring.  The one thing that _can_ be done more-or-less straightforwardly and cheaply for fp mult is to prevent spurious overflow and underflow (including spurious trips into subnormal-land).
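
One cheap way to do that (a sketch only - the helper name is made up, and special values like infinities and NaNs get no special handling): split each factor with math.frexp, carry the binary exponent in a Python int, and reassemble with a single math.ldexp at the end.

import math

def prod_scaled(xs):
    # Keep the running mantissa in [0.5, 1.0) and accumulate the binary
    # exponent in an int, so intermediate products can neither overflow
    # nor wander into subnormal territory.  Only the final ldexp can.
    mant, exp = 1.0, 0
    for x in xs:
        m, e = math.frexp(x)   # x == m * 2**e, with 0.5 <= abs(m) < 1 for nonzero x
        mant *= m              # product of two such mantissas lies in (0.25, 1.0)
        exp += e
        mant, e2 = math.frexp(mant)  # renormalize the running mantissa
        exp += e2
    return math.ldexp(mant, exp)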