Ask HN: Confused about how DeepSeek hurts Nvidia

  • It's a bit like "the Cisco moment" (and lots of people have been observing this). The company was building hardware needed for building out networks. The web looked like it was going to be the next big thing, and people couldn't get enough of CSCO. The web didn't pan out the way people hoped (or as quickly), and CSCO fell quickly.

    Cisco kept making and selling network hardware, and probably (citation needed) sold more from 2000-2006 than 1994-2000, but the stock trade was over. The web did become a serious thing, but only once people got broadband at home.

    The case for Nvidia's valuation was getting pretty weak. Lots of FAANGs with deep pockets started investing in their own hardware, and it got good enough to start beating Nvidia. Intel and AMD are still out there and under pressure to capture at least some of the market. Then this came along and potentially upended the game, bringing costs down by orders of magnitude. It might not be true, and it might even drive up sales long-term, but the NVDA trade was always a short-term thing.

  • It doesn't. Inference is still expensive, and demand for it is high, as evidenced by Anthropic's frequent "we're out of quota" messages and DeepSeek's crapping out under load last night.

    On the training side, right now only the top-flight labs can conduct serious, ambitious research, and even they don't do as much research as they'd like. Witness Meta effectively training the exact same architecture on similar data mixtures for the past couple of years. More or less the same situation holds across the board: compute bandwidth (and therefore the ability to experiment) is scarce. This means inference will remain quite expensive for the foreseeable future, especially multimodal and long-context inference. Believe it or not, even Google is compute-constrained. When I was there, some days I couldn't even get a handful of TPUs to do my job; everything was allocated to training Gemini.

    Even if it gets a lot cheaper to train models, you could just train larger, more capable models, do more architectural/efficiency research, and iterate faster, with tremendous payback in the long run. Nvidia is the only viable seller of shovels for this gold rush for everyone but Google and Anthropic. Bypassing the gatekeepers and making capable AI models available to more people makes Nvidia's product more valuable.

  • > DeepSeek won’t be top dog forever.

    I agree with this in the sense that no model will be top dog forever. However, it's important to note their contributions to open source. They're raising the bottom bar, and that is important.

  • I can't understand it either. Won't a cheaper model make people use AI more? For example, the current $200 ChatGPT plan is too expensive for me, but at $4 I would become a customer.

    Many small companies, which would never have considered training models in-house, could now do it.

    I think this will only boost the AI hardware market.

  • Analysing the market and competitive situation:

    DeepSeek's cheaper LLM services + its open models, which other hosts can serve

    => overall prices for using LLM services will fall due to competition (lower prices + more hosts entering the market); AI users won't pay so much for LLM services

    => LLM hosts/providers won't be able to project such high revenues and won't purchase as many GPUs (they'll also receive less capital investment to buy GPUs, since revenue per dollar invested is lower)

    => demand for and prices of Nvidia cards will fall

    On the basis of this possible logic, portfolio managers and algorithms project lower growth/revenue for Nvidia and sell off its stock, setting off the usual chain reaction as other managers notice the downward price action and follow suit in order to stop further losses.

  • Market reactions are usually about immediacy. You don't see people stonking in or out of something for an outcome five years out: it's immediacy that makes a wave happen.

    So, granting that your long-term investment ideas seem plausible, what do you think is the immediate short-term impact on this kind of spend? Do you think Nvidia will sell more or fewer units in the next reporting interval? Because that's what most people are reacting to.

    It would not surprise me if there are plenty of willing buyers, looking to buy in a dip and sell on the inevitable upward swing.

    I am not a direct investor. I have no idea what my pension fund did, if anything.

  • There are people willing to say that DeepSeek has such a great team -- so great, in fact -- that it could keep crushing the competition indefinitely, despite having open-sourced its models to some extent, if not entirely.

    I'm genuinely interested to see those who hold this view explain their logic further. It's an interesting, if unorthodox, take, because the conventional wisdom is that algorithmic brilliance is not a moat, even when the code is not open source.

  • If the work can be done with fewer GPUs, then people will buy fewer GPUs. That is what's driving Nvidia's fall.

  • DeepSeek is not hardware; how could it hurt Nvidia? It found a way to use training data efficiently, and it seems DeepSeek actually brings Nvidia more potential customers: it opens a path for small enterprises and individuals. I have lost faith that AMD will fight back against Nvidia. If Apple or another RISC-V-based LLM accelerator manufacturer doesn't respond, Nvidia will drive hardware prices sky-high, as any monopoly does. Apple looks like the one with the most potential to deliver an answer that hurts Nvidia.

  • The market doesn't care about rationality; it had needed a reason for a correction for a very long time. The more fear it can stoke, the lower it can drive those valuations.

  • > DeepSeek won’t be top dog forever.

    Which is... worse for Nvidia? If someone else disrupts DeepSeek, do they train a similarly performing model for $600k?

  • I'm no stock expert (but I bought 80 shares of Nvidia yesterday). My rationale is that if DeepSeek lowered the barrier to entry for AI model development, more companies can actually afford to join in. Yes, the big dogs may need fewer cards thanks to efficient algorithms, but to me, more companies = more cards needed.

  • It doesn't hurt Nvidia, just as a cheaper light bulb wouldn't have hurt electricity investing in 1900. On the contrary, it means you get more intelligence for your buck. Consequently, we will reach much higher levels of intelligence faster, and inference (using the intelligence) will become a more common good.

  • Nobody knows and investors are guessing. Their confidence has been rocked and uncertainty has been introduced.

  • Nvidia is at a premium because it's so dominant. But the sentiment now is that China can build cheap, quality AI too, and presumably it can make top-dog or second-dog GPUs as well.

    The US's anti-China policies further force China to develop its own GPUs, breaking Nvidia's dominance within ten years or so.

  • It doesn't.

    The market is upset because monopolies are basically busted by open-source AI.