This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Author jluscher
Recipients cjwelborn, docs@python, eric.smith, jluscher
Date 2015-05-26.13:24:10
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1432646650.61.0.401745099966.issue24281@psf.upfronthosting.co.za>
In-reply-to
Content
Eric,
I was not familiar with the 'g' format and did not know it was the default, but the documentation, read fully, is correct. It just took Christopher Welborn's response to "wake me up" (it was a LONG day and I was struggling to understand the problem; I would hate to tell you how long I worked on it! {embarrassment!!}).

My programming usually involves text manipulation, data structures, and GUIs. I am familiar with the use of 'precision' as the length of the decimal fraction, but was thrown by the (equally valid) use of 'precision' in the mathematical sense of 'significant digits'. Counting BOTH sides of the decimal point was a difficult switch of word meaning to get my head around.

Just because "I" have never(!) used the 'g' format doesn't mean it isn't the proper choice for the default. Language is often a challenge, and 'precision' with two meanings (# of digits to the right of the decimal AND total # of digits, right and left of the decimal) does present a linguistic challenge to one's ability to comprehend what the word means ;-)

I wish I could give you the solution, but at least I (finally!) understand the mental issue involved.

Thanks to both of you for your help getting me past my 'mental fugue' ;-)
History
Date User Action Args
2015-05-26 13:24:10  jluscher  set  recipients: + jluscher, eric.smith, docs@python, cjwelborn
2015-05-26 13:24:10  jluscher  set  messageid: <1432646650.61.0.401745099966.issue24281@psf.upfronthosting.co.za>
2015-05-26 13:24:10  jluscher  link  issue24281 messages
2015-05-26 13:24:10  jluscher  create