
Author: rhettinger
Recipients: mark.dickinson, rhettinger, serhiy.storchaka, tim.peters
Date: 2018-08-10.20:40:32
SpamBayes Score: -1.0
Marked as misclassified: Yes
Message-id: <1533933632.32.0.56676864532.issue34376@psf.upfronthosting.co.za>
In-reply-to:
Content:
> What is the result when all components are equal? hypot(1.0, 1.0, ..., 1.0)

The scaled summation will be exact in that case: every element divides by the max to give exactly 1.0, each square is exactly 1.0, and the compensation term frac stays 0.0, so the answer is just sqrt(n) times the max.
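For illustration, here is a minimal Python sketch of the scaling-plus-compensation idea (it only mirrors the approach being discussed, not the actual C code in the patch; the names csum/frac are just the ones used in this thread):

    import math

    def hypot_sketch(*coords):
        # Scale by the largest magnitude, accumulate the squares of the
        # scaled values with a Neumaier-style correction term, then scale
        # the square root back up.
        maximum = max(abs(x) for x in coords)
        if maximum == 0.0:
            return 0.0
        csum = frac = 0.0
        for x in coords:
            r = abs(x) / maximum       # exactly 1.0 when all inputs are equal
            sq = r * r                 # exactly 1.0, no rounding error
            old = csum
            csum += sq                 # exact while csum is a small integer
            frac += (old - csum) + sq  # correction term stays 0.0 here
        return maximum * math.sqrt(csum + frac)

    # e.g. hypot_sketch(1.0, 1.0, 1.0, 1.0) == 2.0 exactly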

> Would it give a performance benefit to get rid of the multiplication
> and division, and instead scale by a power-of-two approximation of
> the max using ldexp()?

You're right that the multiplication and division are the most expensive part (the adds and subtracts are cheaper and can be done in parallel with subsequent memory fetches).  See the attached Clang and GCC-8 disassemblies.

I've tried a number of variants and couldn't get any performance boost without breaking some of the test cases.  This patch is the best of a dozen attempts.
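For reference, the ldexp() idea from the question could look roughly like the Python sketch below. This is only an illustration of power-of-two scaling (the function name and structure are hypothetical), not one of the C variants that was actually tried:

    import math

    def hypot_pow2_sketch(*coords):
        # Scale by a power of two taken from the exponent of the max, so the
        # scaling steps are exact and could, in C, be done by adjusting
        # exponents instead of performing a full divide per element.
        maximum = max(abs(x) for x in coords)
        if maximum == 0.0:
            return 0.0
        exp = math.frexp(maximum)[1]        # maximum == m * 2**exp, 0.5 <= m < 1
        csum = frac = 0.0
        for x in coords:
            r = math.ldexp(x, -exp)         # exact power-of-two scaling, |r| < 1
            sq = r * r
            old = csum
            csum += sq
            frac += (old - csum) + sq
        return math.ldexp(math.sqrt(csum + frac), exp)

The hoped-for win is that the power-of-two scaling is exact and potentially cheaper than a general divide; as noted above, though, none of the variants actually tried gave a speedup without breaking some of the test cases.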
History
Date                 User        Action          Args
2018-08-10 20:40:32  rhettinger  set recipients  + rhettinger, tim.peters, mark.dickinson, serhiy.storchaka
2018-08-10 20:40:32  rhettinger  set messageid   <1533933632.32.0.56676864532.issue34376@psf.upfronthosting.co.za>
2018-08-10 20:40:32  rhettinger  link            issue34376 messages
2018-08-10 20:40:32  rhettinger  create