What “working” means in the era of AI apps

  • Misleading title. Has nothing to say about working, i.e., paid employment, with AI apps.

    The main claim in the post: Their portfolio companies have shown an improved rate of accumulating revenue ever since LLMs took off.

    Weakest part of the post: No attempt at explaining how or why an LLM affects these numbers. They allude to 'shipping speed' and 'product iteration', but how an LLM helps these functions is left unexplored.

    There's an implied deductive argument that an LLM can write some code, so obviously shipping speed is faster, so obviously revenue comes faster. But the argument is never explored for magnitude of effect or defended against examples where shipping faster or using LLMs doesn't translate into faster revenue.

    Also, nothing about sampling bias, sample size, or spread.

    Overall: Probably meant as a confidence boost to the sleep-deprived founders out there. But teaches nothing.

  • I can’t speak to the world of startups or venture capital (I’m way too far from that ecosystem), but I’d like to add a perspective from the sidelines.

    What stands out to me right now is just how loud the expectations around AI have become, especially among non-technical folks. It’s not just “Bitcoin hype” loud, it’s bordering on “AI will solve everything” levels of noise. For those of us who’ve been around a bit longer (sorry, younger HN crowd), the current buzz feels reminiscent of Y2K or the first dot-com wave.

    Back then, I was early in my career, but I vividly remember the headlines, the overpromises, and the sheer volume of attention. The difference now is, there’s a lot more substance under the surface. The tools are genuinely useful, and the adoption curve feels more practical, even inevitable. That’s what makes me think AI might become to this era what the smartphone was to the last, not just a novelty, but an everyday dependency.

    That said, I’ve also learned a lot from voices here on HN, especially when it comes to the financial realities behind the tech. If there’s one throughline in many of these discussions, it’s that financial viability, not just hype or innovation, is what ultimately determines whether this all collapses or truly transforms the world.

    Just my 2 cents.

  • The article lists the many ways in which the bar for success - the minimum "table stakes" that you have to achieve in order to be considered a success - has drastically risen, and then concludes with:

    > we believe there’s never been a better time to build an application-layer software company.

    Nothing could be a clearer indication that the primary desirable quality in a founder is the conviction that, against all odds, you are better than everyone else.

  • I don't take anything that comes from a16z seriously, especially since their crypto craze.

  • This hides one major caveat of AI, which is that, in contrast to "old" tech, your OpEx on compute doesn't scale the way it did for a traditionally engineered app.

    In other words, as your revenue scales, your OpEx scales with it. That breaks the idea that you only need to grow revenue to "break off", since your margin is set by compute (see the rough sketch at the end of this comment).

    The other issue is that I've been burning compute across Perplexity, Grok, Gemini, Claude, and DeepSeek. I pay nothing for these and they are good enough. It is easy to grow revenue to $1m when you are burning $2m of compute.
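
    A rough back-of-the-envelope sketch of that scaling point, with entirely made-up numbers (the 60% compute share and $100k fixed hosting cost are illustrative assumptions, not figures from the article):

      # Made-up numbers: a traditional SaaS pays roughly fixed hosting,
      # while an AI app pays for inference on every request, so its OpEx
      # tracks usage and its gross margin stays capped.
      def saas_gross_margin(revenue, fixed_hosting=100_000):
          # Hosting is roughly flat, so margin improves as revenue grows.
          return (revenue - fixed_hosting) / revenue

      def ai_app_gross_margin(revenue, compute_share=0.60):
          # Inference spend scales with usage, so margin stays roughly constant.
          return (revenue - revenue * compute_share) / revenue

      for revenue in (250_000, 1_000_000, 10_000_000):
          print(f"${revenue:>10,}  SaaS {saas_gross_margin(revenue):4.0%}"
                f"  AI app {ai_app_gross_margin(revenue):4.0%}")

    Under those assumed numbers the SaaS margin climbs toward 99% at $10m while the AI app stays pinned at 40%, which is the "margin set by compute" point.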

  • Anyone else a bit confused by the use of the word "working", given the content of the post? I thought this was going to be about how white collar work is changing, not about fundraising and growth strategies.

  • This seems like a case of selection bias, where they are looking at all the Gen AI startups and seeing that they are making revenue faster than previous startups. But Gen AI startups have mostly only started very recently, so it's obvious that all the successes must have grown fast, as they haven't been around long enough to grow slowly (the toy simulation at the end of this comment illustrates the effect). Maybe in 5 years, we'll see a lot of cases of successful startups that took a slower growth trajectory instead.

    But whether it's short-sighted for the investors or not, I think the takeaway for founders is "investors now expect you to make more revenue faster, and B2C applications are more interesting than before".
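
    A toy simulation of that truncation effect, with a made-up distribution (the uniform 6-months-to-8-years time to reach $1m ARR is purely illustrative):

      # Startups need a random number of years to reach $1m ARR, but a cohort
      # founded only 2 years ago can, by construction, only contain fast growers.
      import random

      random.seed(0)
      years_to_1m = [random.uniform(0.5, 8.0) for _ in range(10_000)]

      # Only startups that already hit $1m are visible in a 2-year-old cohort.
      observed = [t for t in years_to_1m if t <= 2.0]

      print(f"all startups, mean years to $1m ARR: {sum(years_to_1m) / len(years_to_1m):.1f}")
      print(f"visible in a 2-year-old cohort:      {sum(observed) / len(observed):.1f}")

    The observed cohort looks dramatically faster than the full population simply because slow growers haven't had time to show up yet.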

  • Come back to this comment in 5 years. Everyone that's fully bought into the AI hype is on serious crack. This is not my first (or 2nd) rodeo.

  • I think this has an interesting consequence for founders and employees at startups in this day and age, and I'm not quite sure how I feel about it.

    On one hand, it means you can "fail faster". That is, if you're a startup employee and you don't see "hockey stick" growth that looks crazy impressive at the end of year 1, you should know that the chances of your equity being worth more than a token amount are basically zero.

    Starting around the dot-com boom, I worked at numerous startups, and at some of them we were still chugging along in years 3-4 hoping that our "semi-OK, decent growth" would turn vertical any day now. I've also seen numerous startups founded in the 2015-2020 timeframe (so around for 5-10 years) that didn't outright fail but whose common stock got wiped in an acquisition. That's more a consequence of the rise in interest rates and a difficult fundraising environment, but it's really rough to plug along at a company for 5-10 years, think you're doing OK, and then have your stock be worth nothing. So from a startup founder/employee perspective, you get signal faster and don't have to waste time.

    Simultaneously, though, it seems like any idea that would take a decent amount of upfront investment and time would be hella difficult to get funded, and I think that's unfortunate.

  • When tulips suddenly became fashionable a few years back, articles like this were rife.

    This article even smells ... generative.

  • I read TFA. Quickly but I read it. It's short but I have no idea if they were saying "wow AI products are popular" or "AI helps startups reach higher levels of profitability faster" or simply "A company that says they are making an AI driven product receives more initial users and funding".

    Each of those is a wildly different conclusion and requires wildly different data to support it.

  • The metric the article focuses on is “revenue,” but it seems like the foundation many of these startups build on (other people’s LLM APIs) is much more expensive than what the last generation of startups built on.

    Given the cost of training a SOTA model, it’s not clear these companies have sustainable businesses. If your primary expense is AWS you can always shift to your own hardware once you hit sufficient scale. If you’re Cursor, how big do you need to get to eliminate your 3rd party API dependency?

  • It means doing your normal work and then staying late to finish your mandatory use of AI to meet management checkboxes.

  • How does Cursor make 100 million in revenue? Do they add that much markup?

  • Difficult to know the size of the pool of companies they're talking about.

  • I tried Claude Code, was utterly disappointed with it, and it cost me a lot of money very quickly. It just goes to show these tools can augment and improve your workflow and knowledge, but in their current state they cannot replace you - any CEO who says otherwise has no clue and is riding the hype train.

  • Nowadays, I don't think work should be defined just by clocking in and clocking out. It’s about the ability to complete tasks and hit goals efficiently. This is how AI will redefine work.