“The development of new product lines for use in service of critical infrastructure or [national critical functions] NCFs in a memory-unsafe language (e.g., C or C++) where there are readily available alternative memory-safe languages that could be used is dangerous and significantly elevates risk to national security, national economic security, and national public health and safety.”
Now that's a strong statement.
But it's real. There are so many state actors doing cyberattacks now that everything needs to be much tougher. Otherwise, someday soon much of the world stops working.
Cool. Is this going to require phasing out systems written in C/C++ with horrible security track records like Linux and Windows? Or are they going to get a "too critical to be improved" exemption?
> "Companies have until January 1, 2026, to create memory safety roadmaps."
This doesn't bode well for open source software not backed by a "company" that can write these roadmaps and deliver on them.
aka, sounds like Microsoft's, Oracle's, and others' lobbying has been effective.
FOSS, en masse, probably doesn't do MISRA or consistent testing, relies on random people, may not sign code or artifacts, and can take a hobby/complacency attitude. For software deemed "critical", the feds are free to donate money and assistance to help critical projects formalize and improve themselves, rather than imposing unfunded mandates on volunteers.
Surely there is going to be an enormous list of exemptions submitted and approved immediately.
My quick skim did not make this clear: is this for software only or would hardware appliances also count? Routers, modems, PLCs used in gas centrifuges, etc. are just as attractive for exploitation.
CISA guidance: https://news.ycombinator.com/item?id=41863640
How would any software be FIPS compliant? Is there a "memory-safe" implementation of TLS that is also FIPS certified?
This misses the point. C/C++ are unsafe because that’s how implementations happen to work today.
C/C++ can be memory safe. Fil-C/C++ is a good example. It’s not a new language, just a different way of implementing it.
Here’s more info about Fil-C: https://github.com/pizlonator/llvm-project-deluge/blob/delug...
While security is a legitimate concern, this article leaves the impression of a paid-for piece meant to scare us into paying more money to security consultants and/or bending over to big vendors.
So that means you have to use Rust for system-level programming then? There's really no other alternative for systems programming, as far as memory safety is concerned, that uses no GC or VM.
There's got to be billions of lines of critical C/C++ code left. By 2026? Doesn't sound realistic.
This seems somewhat incoherent and is too focused on shallow claims about languages instead of trying to understand why the memory bugs happened in the first place.
Are unsafe code blocks in Rust or C# okay? Presumably yes, when there are good reasons; sometimes they're necessary. But then, as a matter of policy, why is Rust meaningfully different from something like using Valgrind with C++? Of course there are substantive differences from a developer's perspective. But just as a stressed or cynical C++ developer might give up on solving the Valgrind error, a similar Rust developer might give up fighting the borrow checker and add "unsafe" to their buggy code (a sketch of what that can look like is below). A federal impetus to switch from C++ to Rust would seem to incentivize this laziness further.
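As a rough illustration (not from the linked guidance; the function and names are hypothetical), this is the kind of shortcut a frustrated developer can take: a raw-pointer cast plus an `unsafe` block silences the borrow checker while reintroducing exactly the use-after-free the checker was complaining about. rustc accepts it; only a runtime tool like Miri, or a careful review, would catch it.

```rust
// Hypothetical sketch: "fixing" a borrow-checker error with `unsafe`.
fn first_then_push(v: &mut Vec<i32>) -> &i32 {
    // A plain `let first = &v[0];` would be rejected once we call
    // `v.push(...)` below, so the lazy escape hatch is a raw pointer.
    let p = &v[0] as *const i32;
    v.push(42);            // may reallocate the buffer, invalidating `p`
    unsafe { &*p }         // compiles fine, but may now point at freed memory
}

fn main() {
    let mut v = vec![1, 2, 3];
    let r = first_then_push(&mut v);
    println!("{r}");       // undefined behavior: possible read of freed memory
}
```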
To be clear, this isn't a criticism of Rust's design or implementation - demarcated blocks of unsafe code are pragmatic and sensible. The problem is how humans build software. In this sense I don't think we've really settled whether "rewrite the code in Rust" is actually safer than "redo our technical management to include automated memcheck testing and paired code reviews." At the very least, I don't think the latter is insufficient, and the feds are being too heavy-handed by making this about language recommendations.
[If it were up to me I would rewrite it in Rust! Saying "the feds made me" is an excellent excuse :) But I don't like the feds making such strong recommendations/demands when I feel the facts are still quite murky. There simply haven't been enough case studies.]
I also think the feds here (along with techies in general) are undervaluing formal specifications and underestimating the risk of undefined behavior.[1] Rust is very stable but it's not formally specified and until recently had known bugs in its very design, not merely in the rustc implementation. (I think those bugs finally got fixed this year.) Considering how cutting-edge Rust is I am sure there are other "theory bugs" somewhere. The point is that critical software also needs stability, and it is unwise to chase memory safety without considering the risks of being tied to an old version of a compiler, especially with unsafe code.
Again: not saying that Rust is automatically bad because it isn't formally specified. But these issues should at least get lip service.
[1] E.g. this fairly detailed document doesn't discuss this at all: https://www.cisa.gov/sites/default/files/2023-12/The-Case-fo...
YAY!!! are we gonna have more formal verification???? woohoo!!
oh, it's about memory safety.
So... F35?
I mean... the last three critical nationwide software failures had nothing to do with memory safety... but okay. Shouldn't we base recommendations on actual experience?
All the memory safety in the world can't save you from a dumb vendor just screwing millions of computers at once.
Just wait five days and this will all go away.
CISA is stupid. Logic bugs don't go away with Rust.
C++ is only "memory-unsafe" if you are hiring bottom of the barrel talent.
Likely the same kind of folks for whom we had to change car manuals from including schematics and repair instructions to including warnings about not drinking the coolant...
Leave it to politicians to pit bull a language. Model checked C/C++ is memory safe. Had they reached out to a wider set of people for guidance, they'd have a more balanced report.
I will agree that software safety -- not just memory safety -- is critical. Trying to attack this at the language level instead of the development process and assurance level is daft. FIPS certification and aerospace certification both require auditing already. It's not much of a stretch to require an audit of critical infrastructure to verify that safety processes are in place.
Simply adopting a different language won't make software safe. It will make it safer, perhaps, but we can do better. Model-checked code -- be it written in Rust, C, or C++ -- is on the same level, and tools exist for each (see the sketch below). That is what CISA should focus on, not trying to force organizations to migrate their code bases to some new and shiny language.
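For a concrete (and hedged) picture of what "model checked" means in practice, here is a minimal harness written against the Kani model checker for Rust; the `clamp_index` function and its property are made up for illustration, but the `#[kani::proof]` / `kani::any()` / `kani::assume()` harness style and the `cargo kani` invocation are how that tool is driven. CBMC plays a comparable role for C and C++.

```rust
// Hypothetical example: proving a small property for *all* inputs with a
// model checker (Kani), rather than sampling a few inputs with unit tests.
fn clamp_index(i: usize, len: usize) -> usize {
    if i >= len { len - 1 } else { i }
}

#[cfg(kani)]
#[kani::proof]
fn clamp_index_stays_in_bounds() {
    let i: usize = kani::any();    // symbolic: stands for every possible usize
    let len: usize = kani::any();
    kani::assume(len > 0);         // precondition: the collection is non-empty
    assert!(clamp_index(i, len) < len);  // checked exhaustively, not sampled
}
// Run with `cargo kani`; a plain `cargo build` ignores the #[cfg(kani)] harness.
```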