This paper is a gentle but rigorous introduction to quantum computing intended for computer scientists. Starting from a small set of assumptions on the behavior of quantum computing devices, we analyze their main characteristics, stressing the differences with classical computers, and finally describe two well-known algorithms (Simon’s algorithm and Grover’s algorithm) using the formalism developed in previous sections. This paper does not touch on the physics of the devices, and therefore does not require any notion of quantum mechanics.
Some light reading before bedtime.
True quantum computation is religious dogma at best, outright nonsense at worst.
Hi,
This paper is written for mathematicians who have no clue about computers or programming.
Either show me code (preferably in assembly language) that does something that real software might actually want to do (e.g. something simple, like waiting for the user to type in two numbers and displaying the result of adding them, or a silly bubble sort, or a function to do matrix multiplication, or code to send a query to an SQL server, or even just an industry-standard “Hello World”); or admit that “quantum computing” (if it ever exists) is a worthless joke (completely unusable for all practical purposes).
– Brendan
Why assembly?
Hi,
Because higher level languages don’t tell you anything about the CPU itself (e.g. “Hello World” in C would look exactly the same for all CPUs).
– Brendan
I wouldn’t say it’s worthless, nevertheless it is somewhat confusing for an average developer: “[this paper] assumes the reader is at ease with linear algebra, and
with basic concepts in classical computing such as Turing machines, and algorithm complexity.” and then it starts right away with complex numbers and tensor products. Quite a challenging read.
Hi,
It’s worthless. They assume useful software consists of one algorithm (that happens to suit quantum computing); and ignore the fact that actual software consists of many algorithms, and input/output (including user interfaces, file IO, networking, etc), and temporary storage (in buffers, arrays, caches), and control flow, and “glue” to hold it all together.
It’s like showing people a wheel and saying “it rolls, therefore you can use this instead of a normal car” while completely ignoring everything else that is necessary for a useful vehicle.
– Brendan
This is just idiotic crap. There have been many important computers that never wait for users to type in numbers. The idea that it is somehow a practical requirement just shows how thoroughly stupid and arrogant you are.
Learn something about computing before writing anything else please – this is embarrassing!
Hi,
If you don’t understand the difference between one of many possible examples of “something that real software might actually want to do” and an absolute fundamental requirement, then you lack the minimum amount of intelligence needed to form a valid comment.
– Brendan
Edited 2017-08-16 09:34 UTC
WHERE THE F*CK DO YOU GET THE IDIOTIC IDEA THAT QUANTUM COMPUTERS ARE IN ANY WAY OR FORM INTENDED TO REPLACE NORMAL COMPUTERS?!
They aren’t. For traditional algorithms (many of which will _not_ be accelerated by quantum computers) they don’t make sense. They would be more expensive, they would be too big (they require near-absolute-zero temperatures in any practical system), they would be “clocked” much lower, and they can’t scale in memory or processing width. These are the basics.
You are talking about assembly as if it were something that defines a computer. It isn’t. The ISA may be seen as something that defines a computer.
Many computers have no assembly language. I have one standing ~5m from me right now.
And to continue, there are many computers with no (human) I/O at all. That has no bearing on how practical the computer is, as anybody who has done even basic reading about computers and computation would know.
And there have been many systems where the main computer has no I/O capabilities itself, instead using secondary I/O subsystems for all external communication.
So I repeat: learn about computers and computer architecture before thinking a quantum computer has to be a replacement for a crap design like IBM PC compatibles.
When all you have is a hammer, every problem looks like a nail.
Virtually everyone in the world is only familiar with ‘regular’ computers. It’s to be expected most of us are completely and utterly ignorant when an entirely different form of computing comes along.
Just like imperative vs. functional programming.
both of which run on normal imperative hardware …
There were functional processors and computers (Symbolics, Genera, …) that could also run imperative programs through some kind of emulation, thanks to monads.
You can run quantum programs on imperative hardware too.
a good example is this:
Task – brute-force a password
Normal computing – try each candidate password one after another; this takes a long time for strong passwords/crypto
Quantum computing – put every possible password into a superposition and amplify the right answer (Grover’s algorithm); roughly √N steps instead of N, with no need to try each and every password one after another
This is why quantum computing scares security people ..
The core of quantum computation is steering many possible answers toward the right set of answers far faster than checking them one by one.
Proper parallelism.
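The search picture above can even be run on ordinary imperative hardware with a toy statevector simulation. This is a minimal Python sketch of Grover-style amplitude amplification, not real quantum code: the search-space size, the target index, and the iteration count are illustrative assumptions.

```python
# Toy classical simulation of Grover search over N = 8 "passwords" (3 qubits).
# Hypothetical example: the correct password is index 5.
import math

N = 8
target = 5  # assumed "correct password" for illustration

# Uniform superposition: every candidate starts with equal amplitude.
amps = [1 / math.sqrt(N)] * N

iterations = round(math.pi / 4 * math.sqrt(N))  # ~sqrt(N) Grover iterations
for _ in range(iterations):
    # Oracle: flip the sign of the target's amplitude.
    amps[target] = -amps[target]
    # Diffusion: reflect every amplitude about the mean.
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]

probs = [a * a for a in amps]
print(max(range(N), key=lambda i: probs[i]))  # → 5 (the target)
print(round(probs[target], 3))                # → 0.945
```

Note the loop: the correct answer is amplified over about √N rounds, which is why Grover gives a quadratic speedup rather than a one-step collapse.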
Hi,
Anyone with more intelligence than a mouldy potato would have realised that a computer with no IO capabilities of its own would have no way to communicate with a secondary IO subsystem.
– Brendan
You see term “quantum computing” and because it includes the word “computing”, you then equate it to any form of computing? LOL!
The use-cases you initially described have nothing to do with quantum computing. Waiting for user input or reading from a block device isn’t bottlenecked by computational power.
Bubble sort is an algorithm that likely won’t be meaningful on quantum processors, just like it isn’t meaningful on modern GPUs. At best it can be optimized to run partially on a CPU and partially on a GPU… Now try replacing ‘GPU’ with ‘Quantum Processing Unit’ in the previous sentence and take a moment to think about it.
Edited 2017-08-16 16:56 UTC
‘Something that real software might actually want to do’ is the issue here. Quantum computing is only a win if you’re doing something that benefits from the massive inherent parallelization it provides. Cryptography and purely computational code are about the only things that we currently have quantum-capable algorithms for, and just about everything else is going to run slower on a quantum computer because of the required error correction and other factors.
Put slightly differently, there’s no point doing most of the stuff you mention on a quantum system, just like there’s no point in trying to write networking code to run on a graphics processor or a block cipher for a touch controller. Just like those particular cases, quantum computing is and will (for the foreseeable future at least) continue to be a co-processor technology, not a CPU technology.
You might be interested in this Quantum Supersampling algorithm by Eric Johnston
http://machinelevel.com/qc/siggraph/hello_siggraph_2016.html
Whenever a new paradigm happens, people who grew up in and were shaped by the current/old one usually try to understand the new idea in terms of the old, and in doing so, in many cases, they miss the point altogether.
When the internal combustion engine was being developed, many people who had grown up with horses could not see its utility. After all, where do you feed the hay, or put the saddle, on those things?
I think the big problem is many people (even scientists, but ESPECIALLY reporters) confuse quantum computing with quantum computers. Quantum computing should REALLY be called quantum algorithms, and serve as a VERY SPECIALIZED form of hardware accelerator for certain specific classes of algorithms. They DO NOT serve as any form of a general computational device. They require normal computers to feed them data and make sense of the results.
Quantum computers use specific hardware that takes advantage of quantum properties to make generic processing structures. For example, one of the most basic structures in computer logic is the gate/switch. A nanometer scale structure that relies on tunneling (a quantum effect) can be used to form a gate or switch. This allows for all the normal general purpose logic operations people associate with computers. In this case, you are making a quantum computer. It in no way does “quantum computing” – it does bog-standard logic using quantum structures.
Waiting for Quantum Leap by Dr. Sam Beckett and Admiral Al Calavicci.
Every time I think I am quite the intelligent guy, an article like this comes along to remind me that I am dumber than 2 boxes of rocks.
I didn’t read the article but I do have a degree in computer science and I have been studying quantum computing.
Quantum computing for the next several years will not be general purpose computing with programming languages. Quantum computing will only compute quantum algorithms using specific quantum operations that correspond to modifying the spin of entangled particles.
IBM currently has a quantum computer chip with 4 qubits which you can simulate and run programs on from the web. They are also introducing 15-qubit and 16-qubit chips.
Writing programs on an IBM quantum computer looks like scoring music where each note corresponds to a quantum instruction. This would be the equivalent of coding in assembly language. But it is not general purpose and programming it requires advanced math and/or quantum physics.
https://www.research.ibm.com/ibm-q/learn/what-is-quantum-computing/
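To make the “scoring music” analogy concrete, here is a small Python sketch of what such a gate-level program boils down to: a sequence of instructions (a Hadamard, then a CNOT) applied to a two-qubit statevector, producing a Bell state. The instruction format and gate names here are invented for illustration; this is not IBM’s actual interface, just the idea of a program as a score of quantum operations.

```python
import math

# A "program" is a sequence of gate instructions, like notes on a score.
# (Hypothetical instruction format, for illustration only.)
program = [("h", 0), ("cx", 0, 1)]  # Hadamard on q0, then CNOT q0 -> q1

# Two-qubit statevector: amplitudes for |00>, |01>, |10>, |11> (bits q1 q0).
state = [1.0, 0.0, 0.0, 0.0]

def apply_h(state, q):
    """Hadamard on qubit q: mixes each pair of basis states differing in bit q."""
    s = math.sqrt(0.5)
    out = state[:]
    for i in range(len(state)):
        if not (i >> q) & 1:
            j = i | (1 << q)
            out[i] = s * (state[i] + state[j])
            out[j] = s * (state[i] - state[j])
    return out

def apply_cx(state, control, target):
    """CNOT: flip the target bit wherever the control bit is 1."""
    out = state[:]
    for i in range(len(state)):
        if (i >> control) & 1:
            out[i ^ (1 << target)] = state[i]
    return out

for instr in program:
    if instr[0] == "h":
        state = apply_h(state, instr[1])
    elif instr[0] == "cx":
        state = apply_cx(state, instr[1], instr[2])

# Result: the Bell state (|00> + |11>)/sqrt(2) -- a measurement yields
# correlated 00 or 11, each with probability 1/2.
print([round(a, 4) for a in state])  # → [0.7071, 0.0, 0.0, 0.7071]
```

Each tuple in `program` plays the role of one “note” on the score; the qubit index is the line it sits on, and the gate name is the note itself.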