That is rich coming from a former NSA Tailored Access Operations agent. She had no problems paying companies to release insecure software, including some that have signed the "secure by design" pledge.
Pretty sure I'm going to burn some karma on this one but what the hell.
To the best of my knowledge there is no evidence in over four decades of commercial software development that supports the assertion that software can be truly secure. So to my mind this suggests the primary villains are the individuals and organizations that have pushed software into increasingly sensitive areas of our lives and vital institutions.
I think sometimes cyber is still seen as an unnecessary cost. Plenty of places do the bare minimum for security, and most of the time it's only after an incident that budgets suddenly get raised.
Software, hardware, policy, and employee training are all things one must focus on. You can't just start making RDX or fireworks without the proper paperwork, permits, licenses, fees, and a lawyer around to navigate everything. If you run a business without investing anything into IT and cybersecurity, you just make it easier for an incident to occur. And remember, just because your product isn't IT or cybersecurity doesn't mean that spending on it is losing money; it's a cost of doing business in our regulated market. If you mishandle HIPAA, PII, or other sensitive info, and customers realize you didn't take basic steps to stop it, you open yourself up to a lawsuit. Think about it like this: investing in it every day, at whatever level you think is reasonable to pay for, means you're lowering that risk, and every day it's paying for itself.
Dan Geer on prioritizing MTTR over MTBF (2022):
Metrics as Policy Driver: Do we steer by Mean Time Between Failure (MTBF) or Mean Time To Repair (MTTR) in Cybersecurity?
Choosing Mean Time Between Failure (MTBF) as the core driver of cybersecurity assumes that vulnerabilities are sparse, not dense. If they are sparse, then the treasure spent finding them is well-spent so long as we are not deploying new vulnerabilities faster than we are eliminating old ones. If they are dense, then any treasure spent finding them is more than wasted; it is disinformation.
Suppose we cannot answer whether vulnerabilities are sparse or dense. In that case, a Mean Time To Repair (MTTR) of zero (instant recovery) is more consistent with planning for maximal damage scenarios. The lesson under these circumstances is that the paramount security engineering design goal becomes no silent failure – not no failure but no silent failure – one cannot mitigate what one does not recognize is happening.
I don't quite agree, but I do somewhat agree.
We need to professionalize and actually accept liability for our work. [1]
[1]: https://gavinhoward.com/2024/06/a-plan-for-professionalism/
Using the same logic, one can argue "SOFTWARE IS PROVIDED AS IS". It should be up to the user to choose the correct software based on their security policy.
I write software for fun and skillz, making computers do extraordinary things. If I start following regulation, then there is no fun for me and no software that does extraordinary things.
No ma'am, Doom or Second Reality would not have been possible with this attitude.
Why does software require so many urgent patches?
Conspiracy theory: creating new bugs they can always fix later is a good source of continued employment.
Of course there's also the counterargument that insecurity is freedom: if it weren't for some insecurity, the population would be digitally enslaved even more by companies who prioritise their own interests. Stallman's infamous "Right to Read" is a good reminder of that dystopia. This also ties in with right-to-repair.
The optimum amount of cybercrime is nonzero.
I used to be an IT guy at a structural and civil engineering firm. Those were real professional engineers with stamps and liability.
As long as "SWEs" do not have stamps and legal liability, they are not real (professional) engineers, IMHO.
My point is that I believe to earn the title of "Software Engineer," you should have a stamp and legal liability.
We done effed up. This breach of standards might be the great filter.
edit: Thanks to the conversation down-thread, the possibly obvious solution is a Software Professional Engineer, with a stamp. This means full-stack is actually full effing stack, not any bullshit. This means that ~1% to ~5% of SWE would be SWPE, as it is in other engineering domains. A SWPE would need to sign off on anything actually important. What is important? Well we figured that out in other engineering domains. It's time for software to catch the f up.
Where does CISA/NIST recommend (for software developers) or require (for government agencies integrating software) specific software/operating system hardening controls?
* Where do they require software developers to provide and enforce seccomp-bpf rules to ensure software is sandboxed from making syscalls it doesn't need? For example, where is the standard that says software should be restricted from using the 'ptrace' syscall on Linux if the software is not in the category of [debugging tool, reverse engineering tool, ...]? (A rough sketch of what such a rule could look like follows this list.)
* Where do they require government agencies using Kubernetes to use a "restricted" pod security standard? Or what configuration do they require or recommend for systemd units to sandbox services? Better yet, how much government funding is spent on sharing improved application hardening configuration upstream to open source projects that the government then relies upon (either directly or indirectly via their SaaS/PaaS suppliers)?
* Where do they provide a recommended Kconfig for compiling a Linux kernel with recommended hardening configuration applied?
* Where do they require reproducible software builds and what distributed ledger (or even central database) do they point people to for cryptographic checksums from multiple independent parties confirming they all reproduced the build exactly?
* Where do they require source code repositories being built to have 100% inspectable, explainable and reproducible data? As xz-utils showed, a software developer would need to show how test images, test archives, magic constants and other binary data in a source code repository came to be, and that they are not hiding something nefarious up the sleeve.
* Where do they require proprietary software suppliers to have source code repositories kept in escrow with another company/organisation which can reproduce software builds, making supply chain hacks harder to accomplish?
* ... (similar for SaaS, PaaS, proprietary software, Android, iOS, Windows, etc)
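To make the seccomp point concrete, here is a minimal sketch in C, using libseccomp, of the kind of rule such a standard could mandate. This is my own illustration, not anything CISA or NIST actually publishes; the allow-by-default policy and the choice of EPERM as the error code are assumptions for the example. Build with gcc demo.c -lseccomp.

    /* Illustrative only: deny ptrace(2) for this process via a seccomp
     * filter, allowing everything else. Error handling kept minimal. */
    #include <errno.h>
    #include <seccomp.h>
    #include <stdio.h>

    int main(void) {
        /* Default action: allow every syscall not matched by a rule. */
        scmp_filter_ctx ctx = seccomp_init(SCMP_ACT_ALLOW);
        if (ctx == NULL)
            return 1;

        /* Make ptrace() fail with EPERM instead of succeeding. */
        if (seccomp_rule_add(ctx, SCMP_ACT_ERRNO(EPERM), SCMP_SYS(ptrace), 0) < 0 ||
            seccomp_load(ctx) < 0) {
            seccomp_release(ctx);
            return 1;
        }
        seccomp_release(ctx);

        /* From here on, any ptrace() call in this process returns EPERM. */
        puts("seccomp filter installed: ptrace is denied");
        return 0;
    }

A real standard would presumably go further (default-deny allowlists, filters installed by the service manager rather than the application itself), but even a one-rule filter like this is more specific than "application hardening should be considered".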
All that the Application Security and Development STIG Ver 6 Rel 1[1] and NIST SP 800-53 Rev 5[2] offer up are vague statements like "Application hardening should be considered", which results in approximately nothing being done.
[1] https://dl.dod.cyber.mil/wp-content/uploads/stigs/zip/U_ASD_...
[2] https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.S...
I dunno. "Evil Ferret" and "Scrawny Nuisance" sound pretty good in our irony-filled world.
So in a hypothetical world where the paranoid reign supreme and all software is safe and unusable, because usage is not a protection goal, do they declare a revolution in the name of usability and economic speed to overthrow the evil protectors?
I hope she starts the crackdown with easily the biggest impact offender here, Microsoft
Was about halfway through before I realised that the article was not, in fact, satirical. Half-expected to see harddrive in the URL.
What a joke. This role deserves way better, but I understand it's only been around since 2018.
I strongly disagree.
If someone puts cyanide in the coffee pot, we don't blame the engineer that designed the coffee pot for not making it cyanide proof.
Criminals are the criminals, not a developer that didn't code defensively enough. The fact that a government official is blaming developers for crimes they don't commit is fascist-level rhetoric.
A previous head of cyber security was fired when he said something like that.
You won't earn a prize for most secure, fewest bugs or longest uptime in our industry.
Days without incident is not a metric the software industry cares about, because it doesn't matter.
Our customers are vendor-locked in, and because we have a market monopoly they can't do anything but accept our conditions.
If only the state would regulate our industry, but that won't happen, because we will call the regulator a communist and then every regulation will be deleted from the agenda.
>"We don't have a cyber security problem – we have a software quality problem. We don't need more security products – we need more secure products."
Uhmmm. The foundation of a lot of the modern economy is built on Windows, and as the CrowdStrike fiasco has shown, Windows requires security software running at the kernel level to save it from itself. If we truly want secure products, should we shut down all Windows machines?
What she is asking for is a radical economic restructuring of a free market.
It's attitudes like hers that lead to the federal government having the worst software imaginable. It just doesn't work unless everyone agrees to do it... so good luck.
"Technology vendors are the characters who are building problems..."
there are vendors building problems into their products? isn't that a crime?
What a silly take.
"House developers that build weak doors into houses are the real problem, not the burglars"
Security is an age-old problem, it is not a new concept. What is different with information security is the complexities and power dynamics changed drastically.
I mean, really! She should know better: the #1 attack vector for initial access is still phishing or social engineering of some kind, not a specific vulnerability in some software.
At this point, I have to wonder what is even the point of missives like this. There are only two things that will solve the software quality problem:
1. Economic incentives. It's all just mindless blather unless you're actually talking about ways that software vendors will be held liable for bugs in their products. If you're not talking about that, what you're saying is basically a useless "ok, pretty please".
2. Reducing the complexity of making products secure in the first place. Making truly secure software products is incredibly hard in this day and age, which is one reason why demanding software product liability is so scary. Professional structural engineers, for example, are used to taking liability for their designs and buildings. But with software security the complexity is nearly infinitely higher, and making it secure is much harder to guarantee.
The other thing that people often ignore, or at least don't want to admit, is that the "move fast and break things" ethos has been phenomenally successful from a business perspective. The US software industry grew exponentially faster than anyplace else in the world, even places like India that doubled down on things like the "Software Capability Maturity Model" in the early 00s, and honestly have little to show for it.