
Thursday, May 20, 2021

Quantum Computing is 99% Bullshit

 

In this post, just before beginning a class on quantum computing at NYU, I predicted that scalable quantum computing ("SQC") is in fact impossible in the physical world.

I was right.

And I can finally articulate why.  The full explanation (“Scalable Quantum Computing is Impossible”) is posted here and in the following YouTube video.

Here is the general idea.  Let me make a few assumptions:

·       A system is not “scalable” in T (where T might represent, for example, total time, number of computation steps, number of qubits, or number of gates) if its probability of success decays exponentially with T.  Indeed, the whole point of the Threshold Theorem (and fault-tolerant quantum error correction (“FTQEC”) in general) is to show that the probability of success of a quantum circuit can be made arbitrarily close to 100% with “only” a polynomial increase in resources.  (See the numerical sketch just after this list.)

·       Quantum computing is useless without at least a million (or perhaps a billion) controllably entangled physical qubits, and even that figure is among the more optimistic estimates for useful fault-tolerant quantum circuits.  (Even "useful" QC isn’t all that useful, being limited to a tiny set of very specific problems.  Shor’s Algorithm, perhaps the most famous of all algorithms that are provably faster on a quantum computer than on a classical computer, won’t even be useful if and when it can be implemented, because encryption technology will simply stop relying on prime factorization!)

o   There are lots of counterarguments, but they’re all desperate attempts to save QC.  “Quantum annealing” is certainly useful, but it’s not QC.  Noisy Intermediate-Scale Quantum (“NISQ”) is merely the hope that we can do something useful with the 50-100 shitty, noisy qubits that we already have.  For example, Google’s “quantum supremacy” demonstration did absolutely nothing useful, whether or not it would take a classical computer exponential time to do a similarly useless computation.  (See the “Teapot Problem.”)
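
To make that definition of “scalable” concrete, here is a minimal numerical sketch in Python.  The error rates, threshold value, and code distances are illustrative numbers I have picked for this post, not measurements from any real device.  The first function shows what happens without error correction: if each of T steps independently avoids a fatal event with probability (1 − ε), overall success decays exponentially in T.  The second function shows the kind of scaling the Threshold Theorem promises: below a threshold error rate, increasing the code distance (a polynomial increase in qubits) is claimed to suppress the logical error rate rapidly.

import math

def unprotected_success(eps_per_step: float, T: int) -> float:
    # Probability that all T independent steps avoid a fatal (uncorrectable) event.
    return (1.0 - eps_per_step) ** T

def logical_error_estimate(p_phys: float, p_threshold: float, distance: int) -> float:
    # Textbook-style scaling claimed by fault-tolerance arguments:
    # logical error ~ (p / p_th)^((d + 1) / 2).  Constants omitted; trend only.
    return (p_phys / p_threshold) ** ((distance + 1) // 2)

# Without correction, even a one-in-a-million failure rate per step is fatal at scale:
for T in (10**3, 10**6, 10**9):
    print(f"T = {T:>10}: success ~ {unprotected_success(1e-6, T):.3g}")
# roughly: 0.999, then 0.368, then ~0 (about e^-1000)

# What the Threshold Theorem claims instead, if its assumptions hold:
for d in (3, 11, 25):
    print(f"d = {d:>2}: logical error ~ {logical_error_estimate(1e-3, 1e-2, d):.1g}")
# roughly: 1e-2, then 1e-6, then 1e-13

Whether that second kind of scaling is physically achievable is, of course, exactly what this post disputes.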

Given these assumptions, what do I actually think about the possibility of SQC?

First of all, what reasons do we have to believe that SQC is possible at all?  Certainly the thousands of peer-reviewed publications, spanning the fields of theoretical physics, experimental physics, computer science, and mathematics, that endorse SQC, right?  Wrong.  As I pointed out in my last post, there is an unholy marriage between SQC and the Cult of U, and the heavily one-sided financial interest propping them up is an inherent intellectual conflict of interest.  Neither SQC nor FTQEC has ever been experimentally confirmed, and even some of their most vocal advocates are scaling back their enthusiasm.  The academic literature is literally full of falsehoods, my favorite one being that Shor’s Algorithm has been implemented on a quantum computer to factor the numbers 15 and 21.  (See, e.g., p. 175 of Bernhardt’s book.)  It hasn’t. 

Second, SQC depends heavily on whether U (the assumption that quantum wave states always evolve linearly or unitarily… i.e., that wave states do not “collapse”) is true.  It is not true, a point that I have made many, many times (here, here, here, here, here, here, etc.).  Technically, useful QC might still be possible even if U is false, as long as we can controllably and reversibly entangle, say, a billion qubits before irreversible collapse happens.  But here’s the problem.  The largest double-slit interference experiment (“DSIE”) ever done was on an 810-atom molecule.  I’ll discuss this more in a moment, but this provides very good reason to think that collapse would happen long before we reached a billion controllably entangled qubits.

Third, the Threshold Theorem and the theories of QEC, FTQEC, etc., all depend on a set of assumptions, many of which have been heavily criticized (e.g., by Dyakonov).  But not only are some of these assumptions problematic, they may actually be logically inconsistent… i.e., they can’t all be true.  Alicki shows that the noise models underlying the Threshold Theorem assume infinitely fast quantum gates, which are of course physically impossible.  And Hagar shows that three of the assumptions inherent in TT/FTQEC result in a logical contradiction.  Given that FTQEC has never been empirically demonstrated, and that its success depends on theoretical assumptions whose logical consistency is taken for granted by people who are generally bad at logic (which I’ve discussed in various papers (e.g., here and here) and in various blog entries (e.g., here and here)), I’d say their conclusions are likely false.

But here’s the main problem – and why I think that SQC is in fact impossible in the real world:

Noise sometimes measures, but QC theory assumes it doesn't.

In QC/QEC theory, noise is modeled as reversible, which means it is assumed not to make permanent measurements.  (Fundamentally, a QC needs to be a reversible system.  The whole point of QEC is to “move” the entropy of the noise to a heat bath so that the evolution of the original superposition can be reversed.  I pointed out here and here that scientifically demonstrating the reversibility of large systems is a logical impossibility.)  This assumption is problematic for two huge reasons.

First, measurements are intentionally treated with a double standard in QC/QEC theory.  The theory assumes (and needs) measurement at the end of computation but ignores it during the computation.  The theory's noise models literally assume that interactions with the environment that occur during the computation are reversible (i.e., not measurements), while interactions with the environment that occur at the end of the computation are irreversible measurements, with no logical, mathematical, or scientific justification for the distinction.  This is not an oversight: QEC cannot correct irreversible measurements, so proponents of QEC are forced to assume that unintended interactions are reversible but intended interactions are irreversible.  Can Mother Nature really distinguish our intentions?  
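
To see, in the simplest possible terms, what “reversible noise” can and cannot mean, here is a bare-bones sketch in plain Python/numpy (no quantum-computing library; I skip the syndrome-measurement step of a real QEC cycle and just apply the correction directly, and the 3-qubit repetition code shown here is of course only designed to catch bit-flips).  A stray unitary bit-flip on one qubit of the encoded state can be undone exactly.  A projective measurement of that same qubit in the middle of the computation collapses the encoded superposition, and nothing applied afterward can restore it.

import numpy as np

def basis_state(bits: str) -> np.ndarray:
    # Computational basis state, e.g. basis_state("010") -> |010>.
    v = np.zeros(2 ** len(bits))
    v[int(bits, 2)] = 1.0
    return v

# Encode an arbitrary logical qubit alpha|0> + beta|1> into the 3-qubit
# repetition code: alpha|000> + beta|111>.
alpha, beta = 0.6, 0.8
logical = alpha * basis_state("000") + beta * basis_state("111")

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])      # Pauli bit-flip
X_mid = np.kron(np.kron(I2, X), I2)         # bit-flip on the middle qubit

# Case 1: "noise" that really is reversible (a stray unitary bit-flip).
noisy = X_mid @ logical
recovered = X_mid @ noisy                   # the correction the syndrome would dictate
print(np.allclose(recovered, logical))      # True: the encoded state is fully restored

# Case 2: "noise" that measures.  Project the middle qubit onto |0>
# (outcome probability alpha^2) and renormalize, as a measurement does.
P0_mid = np.kron(np.kron(I2, np.diag([1.0, 0.0])), I2)
prob_0 = logical @ P0_mid @ logical
collapsed = (P0_mid @ logical) / np.sqrt(prob_0)
print(np.allclose(collapsed, basis_state("000")))  # True: the beta|111> part is simply gone

The second case is exactly the kind of event that QC/QEC noise models rule out by assumption during the computation.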

Second, and more importantly, the history and theory of DSIEs indicate that noise sometimes measures!  All DSIEs have in fact depended on dispersion of an object’s wave packet both to produce a superposition (e.g., a “cat” state) and to demonstrate interference effects.  However, the larger the object, the more time it takes to produce that superposition and the larger the cross section for a decohering interaction with the particles and fields permeating the universe.  As a result, the probability of success of a DSIE decays exponentially with the square of the object’s mass (p ~ e^(-m^2)), which helps to explain why, despite exponential technological progress, we can't yet do a DSIE on an object having 1,000 atoms, let alone a million or a billion.  What this means is that DSIEs are not scalable, and the fundamental reason for this unscalability – a reason which seems equally applicable to SQC – is that noise at least sometimes causes irreversible projective measurements.

This is fatal to the prospect of scalable quantum computing.  If a single irreversible measurement (even if such an event is rare) irreparably kills a quantum calculation, then the probability of success decays exponentially with T, which by itself implies that quantum computing is not scalable.  But DSIEs demonstrate not only that noise sometimes causes irreversible measurement, but that such measurements happen frequently enough that, despite the very best technology developed over the past century, it is practically impossible to create controllably, highly entangled reversible systems larger than a few thousand particles.
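
For concreteness, here is a back-of-the-envelope sketch of that decay in Python.  The constant in the exponent is purely an illustrative assumption of mine, calibrated so that the record 810-atom experiment succeeds about half the time; the point of the toy calculation is only the trend under a p ~ e^(-m^2) law, namely that no amount of incremental engineering gets you from hundreds of controllably entangled particles to millions.

import math

# Illustrative only: choose k in p ~ exp(-k * m^2) so that the record
# 810-atom interference experiment has a 50% success probability, then extrapolate.
m_record = 810
k = math.log(2) / m_record ** 2

for m in (810, 1_000, 10_000, 1_000_000):
    p = math.exp(-k * m ** 2)
    print(f"{m:>9} atoms: p ~ {p:.3g}")
# roughly: 0.5, then 0.35, then ~1e-46, then 0 (the exponent is about -1e6)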

Quantum computing is neither useful nor scalable.  

9 comments:

  1. Just stumbled across your article. Thanks, you have confirmed what I have thought for many years: QC (as we are currently pursuing it) will never work. Maybe in another 100 years, when we have invented the technologies and physical mechanisms to actually construct them. In the meantime I will continue trusting good old 1's and 0's.

    Replies
    1. Thanks for the note! Actually what I've tried to show is that another 100 years of technology will do no good, as scalable quantum computing is fundamentally impossible.

  2. Finally! Thank you for confirming what I've thought all along. There's no possible way I could have explained it in the terms you have. So thank you!

  3. Andrew, I’ve thought for a long time that QC was BS, but more because of the superposition necessary to make things work. It seems to me that using a 3-valued bit would be only slightly different from using 2 normal bits, one representing 0 or 1, and the other representing the “superposition” value (or not being in superposition), i.e. the 3rd value of the ternary bit system. Certainly there would be nominal housekeeping overhead, but this could be implemented as an abstraction layer. Am I missing something here? And since we’re linking 2 bits together anyway, why not just treat it as a quaternary system with the same scheme outlined above - certainly this would crack RSA 102,400,000 in 10^-99999999 seconds flat on my 486DX2.

    Replies
    1. It's because of the exponential increase in possible values: changing from 2 possible values to 3 doesn't seem like much, but when you multiply across a billion transistors, the number of possible configurations (or, say, computations) goes up massively.

      Something like 2^billion vs 3^billion

  4. Thanks for publishing this. As a 30-year-old computer engineer looking for a technical field to dedicate the rest of my working life to, I'm interested in quantum computing as a possibility. However, I can't convince myself that it isn't bullshit.

    I'm curious about your experience with the research literature surrounding Shor's. I've only read the original paper. He doesn't seem to acknowledge that the QFT would have to be a series of measurements of separate qubit-register superpositions. (In a short video he suggests his algorithm should be thought of as a computational interferometer or diffraction grating, which just seems inaccurate to me.) The last section apparently suggests the benefit is that you have a "high" probability of getting an answer from which you can deduce the periodicity without having to build the probability function experimentally.

    Notably, he also doesn't mention needing any extra qubits in the register, which you show in your demonstration factoring 15. Of course his excuse is that he's a mathematician, not a physicist or an engineer. Can you recommend any mainstream, subsequently published papers which suggest a "practical" implementation I could follow?

    Replies
    1. Glad you enjoyed. There is no feasible or practical implementation of Shor's algorithm. A few years ago Preskill published a paper on "NISQ," which basically admits that the best we'll do in the near term is noisy quantum crap. Here's the paper, but please remember that Preskill is bullish on quantum computing, so be very skeptical: https://arxiv.org/abs/1801.00862

  5. Thanks for writing this; the reversibility aspect is something little mentioned after the 90s era of QC wankery: it is tremendously important and it seems people have collectively forgotten about it. If we could build classically reversible computers (a much easier task; no messy entanglement required), we'd also be able to make some interesting calculations. Toffoli and those guys wrote about this back in the day. Kind of inconvenient keeping all those bits around though.

