Author rhettinger
Recipients mark.dickinson, rhettinger, steven.daprano
Date 2021-11-23.15:55:50
> I'm not sure this is worth worrying about ...

Instead of writing simple, mostly accurate code with math.fsum(), these functions have already applied labor-intensive measures to get an exact mean and an exact sum of squared differences expressed as fractions.  Then, at the final step before the square root, they prematurely round to a float.  This is easy to fix, and the one-time cost per call is comparable to that already paid for each input datum.

In a sports analogy, we've run the ball almost the full length of the field and then failed to put the ball over the goal line.

Part of the module's value proposition is that it strives to be more accurate than obvious implementations.  The mean, variance, and pvariance functions are correctly rounded.  In this regard, the stdev and pstdev functions are deficient, but they could easily be made almost always correctly rounded.
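A minimal sketch of the idea (the helper name and precision choice are mine, not from the patch): keep the exact Fraction all the way to the square root, do the root in high-precision Decimal arithmetic, and round to float only once at the very end.

```python
from decimal import Decimal, localcontext
from fractions import Fraction

def frac_sqrt(frac: Fraction) -> float:
    """Hypothetical sketch: round to float only *after* the square
    root, instead of before it, so the final result is almost always
    correctly rounded."""
    with localcontext() as ctx:
        ctx.prec = 50                      # guard digits well beyond float's 17
        d = Decimal(frac.numerator) / Decimal(frac.denominator)
        return float(d.sqrt())             # single rounding to float at the end
```

For example, `frac_sqrt(Fraction(2, 1))` agrees with `math.sqrt(2.0)`, whereas the premature-rounding pattern `math.sqrt(float(frac))` rounds twice and can land on the wrong float for some inputs.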

> One thought: would the deci_sqrt approach help with value ranges 
> where the values are well within float limits, but the squares
> of the values are not?

Yes, the Emin and Emax for the default context are already almost big enough:

    Context(prec=28, rounding=ROUND_HALF_EVEN, Emin=-999999, 
    Emax=999999, capitals=1, clamp=0, flags=[], 
    traps=[InvalidOperation, DivisionByZero, Overflow])

We could bump that up to fully envelop operations on the sum of squares:

    Context(Emin=-9_999_999, Emax=9_999_999, prec=50)
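As a quick illustration (the bounds are the ones quoted above; the sample value is mine): with the widened context, squaring a value near the top of the float range stays comfortably in bounds even though the square itself overflows float.

```python
from decimal import Context, Decimal

# Widened context from the message: exponent range large enough to
# hold the square of any finite float.
ctx = Context(Emin=-9_999_999, Emax=9_999_999, prec=50)

x = Decimal('1e300')        # near the top of the float range
sq = ctx.power(x, 2)        # 1e600: would overflow float, fine here
root = ctx.sqrt(sq)         # recovers 1e300 exactly
```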
Linked issue: issue45876