Recently I had to make my own because strtod was too large for my embedded system. I just wanted simple xx.yy-type strings, so it was as easy as parsing two integers: convert, divide, add, done. I often wish these functions had more variants for stuff like that.
I hate the anti-responsive design of this page: I zoom in (ctrl-plus in Firefox), and the text gets smaller.
Would be cool to see a histogram of all of the web's textual representations of floats based on character length.
In other words: what percentage of outstanding publicly-accessible data sets requires an implementation of strtod which can allocate memory on the heap?
Even this article, which talks about millions of digits, could be parsed just fine with a strtod that's limited to 64 characters.
To be clear, David Gay's dtoa is not a modern implementation of decimal-to-float conversion. There are several much simpler and more performant alternatives, including:
- Google's double-conversion [1], which is best known for introducing the Grisu family of new float-to-decimal algorithms, but which AFAIK also has a much less documented decimal-to-float algorithm via successive approximations.
- The Eisel-Lemire algorithm [2], which is a Grisu3-like algorithm that returns either a correct result or a much rarer fallback signal, and which is currently used in the standard libraries of Go and Rust.
- I believe Microsoft's own C runtime (msvcrt, later ucrt) also has completely separate code whose algorithm is roughly similar to one of the above.
These implementations also clearly demonstrate that such conversion only needs bigint support of bounded size (~3 KB) and can be done in much less code than dtoa.
[1] https://github.com/google/double-conversion
[2] https://lemire.me/blog/2020/03/10/fast-float-parsing-in-prac...