Data manipulations alleged in study that paved way for Microsoft's quantum chip

  • This is such a beautiful theoretical idea (a type of "natural" error correction which protects the qubits without having to deal with the exorbitant overhead of error correcting codes). It is very disheartening and discouraging and just plain exhausting that there has been so much "data manipulation" in this subfield (see all the other retracted papers from the last 5 years mentioned in the article). I can only imagine how hard this must be on the junior scientists on the team who have been swept into it without much control.

  • Looking at the paper, cherry picking 5 out of 21 devices is in itself not a deal breaker IMO, but it's certainly something they should have disclosed. I bet this happens all the time with these kinds of exotic devices that take almost a year to manufacture only for a single misplaced atom to ruin the whole measurement.
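
    Just to put a number on why disclosure matters anyway, here's a toy simulation (made-up numbers, nothing to do with the actual measurements): take 21 devices whose "signal" is pure noise, report only the 5 best-looking ones, and the average jumps.

      import random
      import statistics

      # 21 hypothetical devices measuring pure noise (true mean = 0)
      random.seed(0)
      devices = [random.gauss(0.0, 1.0) for _ in range(21)]

      all_mean = statistics.mean(devices)
      best5_mean = statistics.mean(sorted(devices, reverse=True)[:5])

      print(f"mean over all 21 devices: {all_mean:+.2f}")
      print(f"mean over the 5 'best':   {best5_mean:+.2f}")

    Selection alone manufactures an apparent effect, which is exactly why the cut has to be stated up front.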

    Averaging the positive and negative Vbias data and many of the other manipulations are hard to justify; this reeks of "desperate PhD needs to publish at all costs". At the same time I wouldn't fully disqualify the findings, but I would weaken the conclusion a lot, to "there might be something here".

    All in all, it's in Microsoft's interests that the data is not cooked. They can only ride on vaporware for so long. Sooner or later the truth will come out; and if Microsoft is burning a lot of cash to lie to everyone, the only loser will be Microsoft.

  • As far as I can tell, the only thing >25 years of development of quantum computing implementations has resulted in is the prodigious consumption of helium-3.

    At least with fusion we've gotten some cool lasers, magnets, and test and measurement gear.

  • When non-tech people ask me whether they should invest in quantum computing I tell them nobody alive today will live to see a return on their investment in quantum computing. Not because it's impossible but because the engineering challenges to make it truly useful outside the laboratory are too great, despite what the press releases would have you believe.

    I tell them the same thing about fusion energy.

  • Unpopular opinion I'm sure, but I very much see quantum computing today as smoke and mirrors. I've tried to dive down that rabbit hole, and I keep finding myself in a sea of theoretical mathematics that seems to fall into the "give me one miracle" category.

    I expect this won't be the last time we hear that quantum research which has been foundational to a lot of work turns out to have been manipulated, or poorly designed and never verified by other research labs.

  • Sabine was already skeptical in February [0]. Although to be fair, she usually is :) But in this field, I think it is warranted.

    [0]: https://backreaction.blogspot.com/2025/02/microsoft-exaggera...

  • Based on the comments in this thread... Guys, Microsoft fuckery doesn't invalidate an entire field.

    I think certain VCs are a little too optimistic about quantum computing timelines, but that doesn't mean the field isn't steadily progressing. I saw a comment pointing at prime factorization from 2001 and implying that nobody has been working on pure quantum computing since then?

    It's really hard. It's still firmly academic, with the peculiar factor that much of it is industry-backed. Google quantum was a UCSB research lab turned into a Google branch, while still being powered by grad students. You can begin to see how there's going to be some culture clash and unfortunate pressure to make claims and take research paths atypical of academia (not excusing any fraud; edit: also, to be very clear, not accusing Google quantum of anything). It's a hard problem in a funky environment.

    1) It's a really hard problem. Anything truly quantum is hard to deal with, especially if you require long coherence times. Consider the entire field of condensed matter (+ some AMO). Many of the experiments to measure special quantum properties/confirm theories do so in a destructive manner - I'm not talking only about the quantum measurement problem, I'm talking about the probes themselves physically altering the system such that you can only get one or maybe a few good measurements before the sample is useless. In quantum computing, things need to be cold, isolated, yet still read/write accessible over many, many cycles in order to be useful.

    2) Given the difficulty, there have been many proposals for how to meet the "practical quantum computer" requirement. These range from giving up on a true general-purpose quantum computer (quantum annealers) to NV centers, neutral-atom/ion lattices, SQUID/Josephson-junction-based, photonic, hybrid systems with mechanical resonators, and yeah, topological/anyon shit.

    3) It's hard to predict what will actually work, so every approach is a gamble and different groups take different gambles. Some take bigger gambles than others. I'd say topological quantum was a pretty damn big gamble given how new the theory was.

    4) Then you need to gradually build up the actual system + infrastructure, validating each subsystem, then subsystem interactions, and finally full systems. Think system preparation, system readout, system manipulation, isolation, gate design... Each piece of this could be multiple PhDs' + postdocs' worth of work, spread across physics, ECE/CSE, ME, and CS. This is deep expertise and specialization.

    5) Then if one approach seems to work, however poorly*, you need to improve it and scale it. Scaling is not guaranteed. This will mean many more PhDs' worth of work trying to improve subsystems.

    6) Again, this is really hard. Truly, purely quantum systems are very difficult to work with. Classical computing is built on transistors, which operate just fine at room temperature** (plenty of noise, no need for cold isolation) with macroscopic classical observables/manipulations like current and voltage. Yes, transistors work because of quantum effects, and more recent transistors use quantum effects (tunneling) more directly, but the "atomic" units of memory are still effectively macroscopic. The systems as a whole are very well described classically, with only practical engineering concerns related to putting things too close together, impurities, and heat dissipation. Not to say that any of that is easy at all, but there's no question of principle like "will this even work?"

    * With a bunch of people on HN shitting on how poorly it works + a bunch of other people saying it's a full-blown quantum computer + probably higher-ups trying to make you say it is a real quantum computer or something about quantum supremacy.

    ** Even in this classical regime, think how much effort went into redundancy and encoding/decoding schemes to deal with very rare bit flips. Now think of what's needed to build a functioning quantum computer at a similar scale.
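
    To make the comparison concrete, here's a minimal sketch of the classical side, just a 3-bit repetition code with majority-vote decoding (real ECC like Hamming/Reed-Solomon/LDPC is far more elaborate, and quantum error correction needs vastly more overhead on top of much more fragile hardware):

      import random

      def encode(bit):
          return [bit] * 3                   # triplicate the bit

      def noisy_channel(bits, p):
          return [b ^ (random.random() < p) for b in bits]

      def decode(bits):
          return 1 if sum(bits) >= 2 else 0  # majority vote

      random.seed(1)
      p = 0.01                               # per-bit flip probability
      trials = 100_000
      errors = sum(decode(noisy_channel(encode(0), p)) for _ in range(trials))
      print(f"raw flip rate ~{p}, after repetition ~{errors / trials:.5f}")

    Three physical bits per logical bit already cuts the error rate by orders of magnitude, and that's the easy, room-temperature version of the problem.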

    No, I don't work in quantum computing, don't invest in it, have no stake in it.

  • that's going to be a banger bobbybroccoli video

  • So the chip is a paperweight?

  • Sadly, I get the feeling some people are just "playing" at being scientists/engineers and not actually doing the real work anymore.

  • Intellectual property is the entire point of modern tech. It doesn't matter if it doesn't work. They want the IP and sit on it. That way, if someone else actually does the work, they can claim they own it.

    Repeal IP laws and regulate tech.

  • Looks like the end of the world has been delayed.

  • Microsoft's finest, ladies and gentlemen.

  • I bet quantum computing will go the way of Newtonian physics - wrong, and only proven so once our ability to measure things got good enough.

    It's as if Newton had insisted he was right despite the orbit of Mercury being weird, and blamed his telescope.

    Physics is just a description of reality. If your business depends on reality being the same as the model, then you're going to have an expensive time.