Mark Zuckerberg Announces the "Beginning of the End" for Programmers

  • I don't understand why so many people subscribe to this "prediction". It seems like unsubstantiated hyperbole to me.

    There are a few reasons why I don't believe AI will replace programmers anytime soon:

    1. The job of a developer/engineer entails so much more than writing code. Figuring out what the business wants, turning that into a good (system) design, etc. takes up more time than the actual coding itself. Unless of course you take "programmer" literally, but I have not seen many companies that still hire programmers in the narrowest sense, who only focus on writing code.

    2. Support and maintenance are a huge part of the job that I don't see AI doing. Theoretically you could let humans focus on that part, but I believe support and maintenance will become much more costly if the people doing the job have no familiarity with the code because they didn't write it.

    3. As evidenced by many comments in the thread elsewhere on HN about the announcement of Claude Sonnet 3.7, AI still routinely makes mistakes that are super easy to spot and verify. As long as that remains the case, it's going to be detrimental to the success of your company if you give AI too much autonomy.

    I know people will argue that AI is evolving so fast that the above will be solved soon. But I think all three aspects I mentioned are such fundamental roadblocks that they won't be solved soon.

    What I do believe in is engineers becoming so much more productive as AI evolves.

  • If everyone is a programmer/coder because they have an AI software engineer on hand, I'm hoping they would be comfortable with long-term maintenance.

    As entropy marches on, with more AI-generated lines of code in the codebase and with software, APIs, and tooling introducing breaking changes, will this new class of "vibe coder" / "creator coder" have the means and time to maintain their massive codebases?

    I think AI is good for MVPs, but if we're talking 10-30M lines of code, it might not be the best tool for the job.

  • The hard part is the customer, not the technology. Unless you are working on something very unusual, it should be straightforward to implement anything given perfect requirements.

    Much (most?) of my time as a software engineer has been spent poking absurd holes in customer stories such that they are compelled to provide the actual requirements. This edge case probing is what LLMs are infamously bad at. They are too eager to please. There's not an inner asshole with an aggressive aesthetic preference that was built up over months of interchange with the client.

    The constant here is "agency". LLMs inherently lack it. So, it has to come from somewhere. How many layers of abstraction do we need to put in between the will of the customer and the product they paid for?

    I think a viable solution could be to use the LLM as a direct bridge between your product and the customer. Tool calling with these new reasoning models is a hell of a drug. It's not that difficult to just write this code. 99% of it is string interpolation. You don't need copilot for this.
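
    A rough sketch of what I mean, with a stubbed-out call_llm standing in for whatever chat-completion API you actually use (the tool name, schema, and helpers are made up for illustration):

      import json

      def create_ticket(title: str, priority: str) -> str:
          # Hypothetical product action the model is allowed to trigger.
          return f"ticket created: {title!r} (priority={priority})"

      TOOLS = {"create_ticket": create_ticket}

      def call_llm(prompt: str) -> str:
          # Stub for a real model call; here it just returns a canned tool invocation.
          return json.dumps({"tool": "create_ticket",
                             "args": {"title": "Export button broken", "priority": "high"}})

      def handle_customer_message(message: str) -> str:
          # The "99% string interpolation" part: build the prompt, ask the model
          # which tool to call, then dispatch to plain old code.
          prompt = (f"You can call these tools: {list(TOOLS)}.\n"
                    f"Customer says: {message}\n"
                    'Reply with JSON: {"tool": ..., "args": ...}')
          decision = json.loads(call_llm(prompt))
          return TOOLS[decision["tool"]](**decision["args"])

      print(handle_customer_message("The export button has been broken since Tuesday."))

    The model only ever picks from the tools you expose, so the "agency" still lives in code you control.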

  • Maximum AI bubble hype.

    This is all about suppressing wages, laying off American engineers, and rationalizing many tens of billions wasted on building AI infrastructure no one needed and no one will use.

  • This guy should focus more on fixing the AI-generated plague that is currently sweeping his social media network, but instead he seems "not to care too much" as long as it keeps users busy.

  • The solution to every problem in programming is another layer of abstraction.

    For me programming was always about expressing my intent.

    I don’t think about the instructions the compiler generates. I also rarely think about the expanded form of a template expression.

    If AI just acts as an intermediate between me and the compiler by adding yet another abstraction between me and the generated instructions, why should I care?

    I will still have to somehow explain to the machine what it is that I want.

  • What should I focus on from now on? If I want to change career path, what will pay me as well as a Software Engineer, given that I'm 34 years old? Let's say I can take a break of 4 years to get another degree, what would be the wisest choice?

    I'm at a loss, honestly. If not 2025, it would be 2030 or 2040. I fucking love software engineering.

  • It feels like the only thing AI doesn’t have on us (yet) is the ability to drill into legacy code bases. Of course those code bases were written by humans during a time when coding was more expensive because we didn’t have AI to do it for us.

    Because of that, I wonder if legacy code bases will be less common in the future.

    The only prediction I’m confident in is that it’s a bleak future for devs whose skillset consists of languages rather than interests. I’m one of those devs.

  • Seeing the progress in LLMs... I do believe it. One software engineer will do in the future what would take an entire team in the past.

    Now what to do? I have just finished my undergrad in software engineering and got admitted to a Master's program, but I feel that's a mistake. At the same time, I've never known what else to do with my life but programming.

  • Market yourself as a developer that untangles the mess AI generates.

  • > leaving only a small number of highly specialized positions available.

    Stupid question: how do you become a high-level programmer if entry- and mid-level roles disappear?

  • Well, AI could create machine code and not bother with languages. Then we could say programming has ended. Can't see that on any horizon.

  • From what I've experienced I have to agree.

    "However, this transition presents a paradox: who will oversee and correct AI-generated code? Even the most advanced AI models are prone to errors, necessitating human oversight to ensure reliability and security."

    I see a new role for programmers. The ex-coders will oversee quality control and step in as needed in the future.

    Programmers will probably have a few more years (less than 10), but in the long term their role will change radically.

  • Larry, Elon, and Bill are all on cool aura quests. Can we petition to get Zuck to clean up bots before anything else?

  • Facebook is the F in FAANG, paying at the top of the range, and he is carrying a grudge about the salaries.

  • Another metaversish prediction.

    Someone proposed this nice project AI will fail at:

    https://news.ycombinator.com/item?id=43120035

  • The programmers can become copyright lawyers and sue Meta into oblivion. In the EU they can block Meta.

    But how would Zuckerberg know? He has never written anything special.

  • He probably meant enshittification industry employed programmers.