Quantum Computing?



  • I have been seeing more and more mention of quantum computing in the tech news sector lately.
    Today's example: http://gizmodo.com/physicists-smash-quantum-teleportation-record-with-60-m-1732525396

    I have a degree in Computer Science, but, unsurprisingly, it didn't cover quantum computing.

    I would like to get up to speed on this topic, as it seems to be gaining relevance. Can anyone recommend a good book or website that will walk me through it tutorial-style, like an introductory 101 class?

    I am not looking to become an expert, or to be able to work in the field. I would just like to be able to hold my own when discussing it. A casual search on Amazon for quantum computing brought up this book that seems promising: http://www.amazon.com/Quantum-Computing-Introduction-Engineering-Computation/dp/0262526670/ Has anyone read it?



  • I keep trying to get a feel for the concepts. And then I always hit qubits, and get confused again. Quantum is weird.



  • Simplified --

    A traditional computer works linearly - it transitions from one state to another and uses the state transitions (in the form of algorithms) to compute solutions to problems. Twice the number of state transitions takes twice the amount of time.

    A quantum computer relies on the fact that, at the quantum level, a particle is simply the summation of its probabilities. It is everywhere it could be, until forced to a state by being observed. A quantum computer is essentially an experiment where the state being selected is the state that solves the problem. In theory, the particles should have some probability of occupying the solution configuration, so they will configure themselves thusly instantaneously.

    The major feature of quantum computing is that problems with more steps don't take longer to solve than problems with fewer steps. This breaks the fundamental assumption that modern cryptography is based on.
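
    A rough way to picture the "summation of its probabilities" part: the state of n qubits is just a list of 2^n amplitudes, and measuring picks one basis state with probability |amplitude|^2. Here's a toy classical simulation of that bookkeeping in Python (purely my own illustration; real hardware obviously doesn't work by calling a random number generator):

    ```python
    import random

    # The state of n qubits is a vector of 2**n complex amplitudes;
    # each basis state is observed with probability |amplitude|**2.
    n = 2
    dim = 2 ** n

    # Uniform superposition: every basis state is equally likely until observed.
    amplitudes = [1 / dim ** 0.5] * dim

    def measure(amps):
        """'Observe' the register: pick one basis state weighted by |amp|**2."""
        probs = [abs(a) ** 2 for a in amps]
        return random.choices(range(len(amps)), weights=probs)[0]

    outcome = measure(amplitudes)
    print(f"Measured basis state |{outcome:0{n}b}>")  # e.g. |01>
    ```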



  • @Jaime said:

    Simplified --

    A traditional computer works linearly - it transitions from one state to another and uses the state transitions (in the form of algorithms) to compute solutions to problems. Twice the number of state transitions takes twice the amount of time.

    A quantum computer relies on the fact that, at the quantum level, a particle is simply the summation of its probabilities. It is everywhere it could be, until forced to a state by being observed. A quantum computer is essentially an experiment where the state being selected is the state that solves the problem. In theory, the particles should have some probability of occupying the solution configuration, so they will configure themselves thusly instantaneously.

    The major feature of quantum computing is that problems with more steps don't take longer to solve than problems with fewer steps. This breaks the fundamental assumption that modern cryptography is based on.

    Yes, that is more or less my understanding of it as well. I would like to dig a little bit deeper, though :)


  • BINNED

    @Magus said:

    Quantum is weird.

    That's the most important thing you need to know about quantum physics.



  • I think the most interesting implications of quantum computing are in the field of cryptography (although that may be my personal bias) - if you can stomach it, I'd start there. They often explain various facets of quantum computing in relation to how it interacts with (cryptographic) math, which gives some nice real-world examples to base your learning off of.



  • @rc4 said:

    I think the most interesting implications of quantum computing are in the field of cryptography (although that may be my personal bias) - if you can stomach it, I'd start there. They often explain various facets of quantum computing in relation to how it interacts with (cryptographic) math, which gives some nice real-world examples to base your learning off of.

    That's all well and good, but what book/website/learning-resource do I use? That is the underlying problem I am trying to solve.


  • Considered Harmful

    Start with Turing. You get to consider an NTM rather than a DTM as your model.
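
    (A toy way to see the DTM/NTM difference, as a Python sketch of my own: a deterministic machine has exactly one next move, while a nondeterministic one gets to "guess"; a classical simulation of the guessing has to explore every branch, as in this naive subset-sum search.)

    ```python
    def subset_sum(items, target):
        """Deterministic simulation of an NTM's guesses: try both branches
        (include / exclude each item), up to 2**len(items) paths in total.
        The NTM itself would 'guess' an accepting branch in len(items) steps."""
        if target == 0:
            return True
        if not items:
            return False
        first, rest = items[0], items[1:]
        return subset_sum(rest, target - first) or subset_sum(rest, target)

    print(subset_sum([3, 9, 8, 4], 12))  # True: 3 + 9 (or 8 + 4) hits the target
    ```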



  • @Gribnit said:

    Start with Turing

    ...or... start by looking into mechanical computers. That will get you out of thinking of a solution in terms of a program and into thinking in terms of modeling. A quantum computer is a lot like a mechanical computer, except the parts can never be viewed directly.



  • @Jaime said:

    This breaks the fundamental assumption that modern cryptography is based on.

    IIRC that should be: "that some modern cryptography is based on". Mostly it's the public-key asymmetric stuff that's actually at risk.
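
    (To illustrate why it's the asymmetric stuff that's at risk: an RSA private key falls straight out of the factorization of the public modulus, and factoring is exactly what Shor's algorithm speeds up. A toy Python example with deliberately tiny, made-up numbers:)

    ```python
    # Toy RSA with tiny primes, just to show that knowing the factors of n
    # is all an attacker needs to rebuild the private key.
    p, q = 61, 53              # secret primes (what Shor's algorithm would recover from n)
    n = p * q                  # public modulus: 3233
    e = 17                     # public exponent
    phi = (p - 1) * (q - 1)    # only computable if you know p and q
    d = pow(e, -1, phi)        # private exponent, rebuilt from the factorization

    message = 42
    ciphertext = pow(message, e, n)
    print(pow(ciphertext, d, n))   # 42 -- decrypted with the recovered key
    ```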



  • Don't expect too much. Quantum computers can't magically ignore the laws of physics either.

    As a quantum computer works with wave functions, and waves have to form, the system needs a certain time to relax to sufficiently distinguishable states. Lower limit: t_r = ℏ / (2ΔE), where ΔE is the minimal energy difference between any two states we want to be able to tell apart.
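
    Plugging in a made-up energy splitting of 1 µeV just to get a feel for the scale (a quick Python sketch; the numbers are purely illustrative):

    ```python
    # Rough scale estimate for the relaxation time t_r = hbar / (2 * delta_E).
    hbar = 1.054_571_817e-34    # reduced Planck constant, J*s
    eV = 1.602_176_634e-19      # joules per electronvolt

    delta_E = 1e-6 * eV         # assumed energy splitting of 1 microelectronvolt
    t_r = hbar / (2 * delta_E)
    print(f"t_r >= {t_r:.2e} s")  # about 3.3e-10 s for this splitting
    ```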


  • 🚽 Regular

    I see what happened there.

    Quantum Discourse: see ℏ in preview, see ℏ in baked post.



  • Maybe they have a whitelist of allowed html entities and the reduced Planck constant was somewhat suspect for introducing uncertainty into such a perfect product?



  • Quantum WYSIWTF: You can never tell what a post will look like before you actually render it.



  • @Rhywden said:

    ~~introducing~~disclosing uncertainty into such a ~~per~~f~~e~~ucke~~t~~d product

    FTFY


  • Discourse touched me in a no-no place

    @PWolff said:

    WYSIWTF

    👍



  • @PWolff said:

    Quantum WYSIWTF: You can never tell what a post will look like before you actually render it.

    Sounds like WYSMBWYGIYL.



  • @abarker said:

    Sounds like WYSMBWYGIYL.

    Yes, but it is shorter and more memorable, at least to me.

    Filed under: matter of taste



  • @PWolff said:

    @abarker said:
    Sounds like WYSMBWYGIYL.

    Yes, but it is shorter and more memorable, at least to me.

    Filed under: matter of taste

    :doing_it_wrong:



  • Okay. I'm DIW™ and proud of it.



  • @Magus said:

    then I always hit qubits, and get confused again. Quantum is weird.

    Think of quantum phenomena as temporary suspensions of causality. Set up some preconditions and you will eventually get a postcondition that is consistent with the previous existence of those preconditions; but you can never actually pin down the preconditions as the cause of the postconditions.

    If I understand the idea behind quantum computing correctly, it involves setting up relationships between qubits in such a way that the only postconditions after a QC run that could possibly be consistent with its preconditions amount to the result of a successful search across the entire input space; essentially equivalent to testing N possible solution candidates on N traditional cores in parallel. So a quantum computer with 32 qubits should be able to do things that, given conventional hardware, would require four billion parallel cores.
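
    (Just to put numbers on the "four billion cores" picture -- the candidate space over n bits doubles with every extra qubit, which a quick Python sanity check makes obvious:)

    ```python
    # How many candidates a classical brute-force search over n bits has to cover,
    # versus the single run described above for an n-qubit machine.
    for n in (8, 16, 32, 64):
        print(f"{n:2d} qubits -> {2 ** n:,} basis states to check classically")
    # 32 qubits -> 4,294,967,296 basis states, i.e. the "four billion cores" figure
    ```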

    I suspect, but am in no way equipped to prove, that QC will run into noise-related difficulties that get exponentially worse as the number of interdependent qubits rises linearly; I don't expect the technology to live up to its crypto-destroying hype.



  • In case anyone was wondering, I ordered the book in my original post -- it wasn't good. I should have read through the sample first; the book was all theory and very little application. So this time I read through the sample and decided this book was better: Quantum Computing for Computer Scientists

    I have ordered it, I will report back once I have dug into it a bit.


  • Discourse touched me in a no-no place

    @flabdablet said:

    noise-related difficulties

    That's the core of the decoherence problem. The longer you need to keep the system isolated and the more operations you need to apply to it, the harder it is to keep the outside world from interacting in some unexpected way and filling your answers with the quantum equivalent of “I am a fish.”



  • As an update to this thread, since people are resurrecting it. I have the new book, and it is MUCH better. Much less theory, and much more -- "this is how it would work, and here is the analogous way it works now in modern computer science." 👍

