Message91304
On the specific point of:
> 2.1 some languages/alphabets use other chars (e.g. a comma or other
> symbols) instead of the decimal point.
I think it's not the job of the float() constructor to support it.
Depending on the country, the comma has different meanings when put in a
number (thousands separator or decimal separator). Ditto for the point,
but using a point as decimal separator is the accepted standard for
non-localized computer I/O.
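A short sketch of the behavior described above: float() treats the ASCII period as the decimal separator regardless of the user's locale, and rejects a comma. The parse() helper below is just an illustration, not an existing API.

```python
# Hypothetical helper illustrating that float() only accepts "." as the
# decimal separator, independent of locale.
def parse(s):
    try:
        return float(s)
    except ValueError:
        return None

print(parse("3.14"))   # 3.14
print(parse("3,14"))   # None: the comma is not a valid decimal separator

# Locale-aware parsing belongs in the locale module instead, e.g.
# locale.atof("3,14") after locale.setlocale(locale.LC_NUMERIC, "de_DE.UTF-8"),
# assuming that locale is installed on the system.
```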
More generally, I think the fact that int(), float() et al. support
non-ASCII decimal digits should be seen as a convenience rather than a
willingness to accommodate the broadest possible set of inputs. Which
means we should add support for new formats only if it's sensible,
safe and non-ambiguous to do so.
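For illustration, the convenience mentioned above: int() and float() already accept any Unicode decimal digits (category Nd), while the separator rules stay strict.

```python
# int() and float() accept non-ASCII decimal digits (Unicode category Nd)...
print(int("١٢٣"))      # 123  (Arabic-Indic digits)
print(float("４２"))    # 42.0 (fullwidth digits)

# ...but only the ASCII period marks the decimal point; the Arabic decimal
# separator U+066B is rejected.
try:
    float("١٢٣٫٤")
except ValueError:
    print("rejected")
```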
I also agree with Marc-André's argument that the Unicode spec should be
a good guide here.
History:
Date                 User    Action  Args
2009-08-05 10:14:18  pitrou  set     recipients: + pitrou, lemburg, loewis, mark.dickinson, ggenellina, eric.smith, ezio.melotti
2009-08-05 10:14:18  pitrou  set     messageid: <1249467258.52.0.434004376706.issue6632@psf.upfronthosting.co.za>
2009-08-05 10:14:17  pitrou  link    issue6632 messages
2009-08-05 10:14:16  pitrou  create