Scientists at Hewlett-Packard said that they had developed a new strategy for designing a quantum computer composed of switches of light beams that could be vastly more powerful than today’s digital electronic computers, which are constructed from transistors.
It's difficult to talk about a true quantum computer in terms of switches and transistors, because that's not really what they're about.
As Nii said, it's all statistics.
Yes, there are bits ("qubits"), but they only seem to matter in that they embody the statistical superposition of states that the computation tries to reduce. The size of a qubit, or of the equipment to manipulate one, is irrelevant and won't affect the computational power of a quantum computer.
Quantum computers aren't likely to ever replace current computers, merely supplement them to solve problems that are exponentially complex, such as prime number factoring or the traveling salesman problem.
A lot of quantum computation might sound "hand-wavy" at the moment, but there are already complete quantum-computer algorithms (e.g. Shor's algorithm) worked out to solve problems like prime number factoring, and in a comparatively small amount of time.
In quantum computing, units of information called “qubits” can hold both 0 and 1 simultaneously. That capacity is the heart of the vast potential power of quantum computers.
More important is less friction, and thus less heat produced as a byproduct. Speed also matters, since nothing travels faster than light, as far as we currently know.
Aren't current optical sensors too slow for this to be practical?
A basic optical sensor is a transistor with its head shaved down, sorta like a basic photovoltaic cell.
Also, for these to work, don't you need SPACE between the sensor and the light emitter? How big are these things going to be? A foot square?
You create a light source and have it run through a length of fiber. Now you have distance. You can now create many virtual computers by sending multiple pulses. The number of virtual computers is limited by the number of pulses you can detect. But wait, we can add more VCs by adding more fiber, thus allowing our sensor more time between pulses. Virtual computers would allow both horizontal and vertical scaling. Simply awesome when you think of the possibilities.
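A rough back-of-the-envelope sketch of the idea, with made-up numbers (the refractive index and pulse rate below are my illustrative assumptions, not from the post): how many pulses can be "in flight" inside a length of fiber at once.

```python
C_VACUUM = 3.0e8          # speed of light in vacuum, m/s
FIBER_INDEX = 1.5         # typical refractive index of silica fiber (assumed)

def pulses_in_flight(fiber_length_m, pulse_rate_hz):
    """Number of pulses travelling through the fiber at any instant."""
    transit_time = fiber_length_m * FIBER_INDEX / C_VACUUM  # seconds spent in the fiber
    return int(transit_time * pulse_rate_hz)

# 1 km of fiber carrying a 10 GHz pulse train:
print(pulses_in_flight(1000, 10e9))  # 50000 pulses "stored" in the fiber at once
```

So adding fiber does buy you more time slots, exactly as described, but only linearly with length.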
You've described multi-mode fiber. Note that in order to send multiple signals they have to operate at different frequencies, and in order to operate with longer wavelengths (at slower speeds) the distance between the light emitter and sensor has to increase.
All of this can be done today, but I don't yet see the "computer" portion. I still see it only as a transmitter/receiver with a regular computer interpreting the data; the light is just a medium for info transfer.
In order to have a "light" computer you'd need that setup shrunk down greatly, and without flash-over, where one light sensor picks up light not intended for it and gives wrong results.
To me this doesn’t make sense as a computer yet.
Um, plenty of interesting stories pass by with few comments; see those posted earlier.
As for quantum computing, I always think of Schrödinger's cat at the same time. I guess I will remain sceptical till I see some computer-type logic actually working.
Optical computing I could actually get my head around, but this quantum stuff sounds like one of Wesley Crusher's experiments.
You have to keep in mind that the basic unit of a CPU or of RAM is a switch, a transistor (in most cases now some super modified MOSFET hybrid.)
Currently, Intel is shrinking down to 0.65um per transistor. That is a TINY switch.
In order to be compared, the light emitter + medium (like fiber-optic cable) + receiver must all be within one package to make up one switch. Transistors are at 0.65um; I can't imagine these light switches catching up before I'm long dead, about the same time Duke Nukem Forever is released and nanotechnology produces anything but theories and investments that don't pay off.
I mean 0.065um, big difference.
65 nanometres (nm) would have been correct as well, and would have saved you from the fact that it's difficult to type the lowercase Greek letter "mu" for "micro".
Speaking of nanotech, and back on topic: if you could tunnel photons to a nanoscale optical sensor, or use MEMS (Micro-Electro-Mechanical Systems) with mirrors if you want to get it done THIS decade, maybe you could create switches small enough to make a functional computer.
Not that it would be easy….
–JM
The ultimate output of quantum computers will be whatever can be analysed statistically, which is rather a lot (mathematicians can do more with statistics than you can imagine), but a computer still boils down to having to make classical decisions.
One of the interesting fields that is beginning to be in demand, and that quantum mechanics may enable, is pattern recognition, including the recognition of patterns in seemingly random data (e.g. encrypted data).
A computer (as we know it) will still need a classical decision making machine such as a classical CPU or something, then the quantum computer may be a co-processor for statistical analysis.
A pure quantum computer may also exist, but it would require such complex setup that it might only be used in academia, giving a statistical answer.
There are already clearly foreseen benefits of quantum computers which are certain to exist, but the future is never what it seems; I am interested in the unexpected advances that no one can yet imagine.
Conventional sense computers?
Unless you would like to re-teach computer programming, computers consist of an accumulator, a source memory space and a destination memory space.
Computations consist of “adding” one mem location with another.
This is how the most basic computer block works. This is how a computer based on light must work to be usable. Statistics are completely meaningless to computers whose role is to calculate the absolute: calculators have to contain SPECIAL instructions to understand what a fraction is, and that 3 1/3 x 3 = 10 and not 9.9999999999999999999…
Computers are and always will be absolute at the most basic level. Solving prime number factoring is a high-level computation that is broken down by the CPU into blocks of "add this variable to that variable, result goes here", repeatedly and at extremely high speed.
That is where a light computer needs to go to be of any use. Otherwise it's pretty useless.
People focus on the high-level things way too much. It's the basics whose properties you exploit or expand upon. The basics multiply out and give you unexpected and amazing results.
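The fraction point can be sketched in a couple of lines of Python (my illustration, not from the post): binary floating point can only approximate 1/3, while exact rational arithmetic really does give 3 1/3 x 3 = 10.

```python
from fractions import Fraction

approx = (10 / 3) * 3            # float arithmetic: built on a rounded 3.333...
exact = Fraction(10, 3) * 3      # rational arithmetic: exactly 10

print(approx)   # close to 10, but computed from an approximation of 1/3
print(exact)    # 10
assert exact == 10
```

This is exactly the kind of "SPECIAL instruction" support described above: the `Fraction` type carries numerator and denominator separately instead of collapsing to a rounded binary value.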
The odd thing is that there is hardly any news on optical/photonic computing.
You might see one article every 6 months.
Optical/quantum/spintronic nanoswitches are going to replace the 'hot' silicon and copper of today.
Unless you would like to re-teach computer programming, computers consist of an accumulator, a source memory space and a destination memory space.
Computations consist of “adding” one mem location with another.
People, please don't talk about memory locations, optical computers, or parallel processing in the context of an article about quantum computing (QC).
QC is fundamentally different from all current approaches/architectures. QC doesn't involve any computations with memory locations. QC doesn't involve programming in today's sense.
This is hard to explain actually. Well, let’s try.
Take for example number factoring. (Assume we need to find the factors of X.) Any "normal" computation algorithm amounts to trial and error: there's no formula to factor numbers, nor are the factors known in advance. Usually you need to iterate over all numbers from 2 to sqrt(X). (Of course this is simplified, but the exponential nature of the problem is kept.)
But in QC the factors are known in advance. How? Because a sufficiently long quantum register (QR), containing the same number of qubits as the length of X in bits, holds all numbers from 0 to X at once, including the factors we're looking for. To find them, you just need to create the right conditions to get the information out of a QR or two.
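The classical trial-and-error baseline can be written down directly (a minimal sketch, mine, not from the post): the loop runs up to sqrt(X), and sqrt(X) roughly doubles for every two extra bits of X, which is why the cost is exponential in the digit count rather than in X's value alone.

```python
import math

def trial_factor(x):
    """Return the smallest nontrivial factor of x, or None if x is prime."""
    for d in range(2, math.isqrt(x) + 1):
        if x % d == 0:
            return d
    return None

print(trial_factor(15))    # 3
print(trial_factor(3233))  # 53  (3233 = 53 * 61)
print(trial_factor(101))   # None (prime)
```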
What makes this extremely difficult, is
a) creating stable and universal QR,
b) creating right conditions.
a is mostly a technological problem; b is both theoretical (algorithms, error correction, etc.; usually QR readings are somewhat statistical, so the right answers must have a somewhat bigger probability than the wrong ones) and technological.
Technological here means that operating with single particles (where quantum effects emerge) is practically impossible, so to create quantum bits we need to solve "some" problems with creating coherent and stable clusters of particles.
Tell me then, does your quantum computer calculate 456 x 789 any faster than a standard transistor based CPU architecture?
People keep using "factoring" as an example. How about adding? Division? Decimals?
Break it down to its basic components; you can't tell me that you're going to use factoring and guessing to add 2 numbers and the result will be faster.
Tell me then, does your quantum computer calculate 456 x 789 any faster than a standard transistor based CPU architecture?
Absolutely not. Honestly, QC doesn’t calculate such things at all – and it isn’t meant for it.
Of course, there can be algorithms to do "normal" calculations on a QC as well; this kind of multiplication should probably take one step. But it would cost thousands or millions of dollars today, if it's possible at all.
Why is factoring given as the example? Because factoring is (most probably) an NP-class problem. This means that the time to find the factors of a given number depends exponentially on the number's length (digit count). All of today's public-key cryptography is based on this fact. If big numbers could be factored in a few steps (as a QC should be able to do), then all MD5/SHA/RSA/whatever keys and algorithms would be useless. (This is one of the reasons why millions of dollars are invested into QC research.)
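A toy illustration of why factoring breaks the crypto (textbook-sized numbers, my sketch, not anything from the thread): in RSA, the modulus n = p * q is public, and the private exponent d is easy to compute for anyone who knows p and q. So "factor n" and "recover the private key" are the same job.

```python
# Key generation (the secret part is knowing p and q).
p, q = 61, 53
n = p * q                            # 3233 -- published
e = 17                               # published
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent; needs p and q

msg = 42
cipher = pow(msg, e, n)              # anyone can encrypt with (n, e)

# An attacker who factors n = 61 * 53 can recompute d and decrypt:
d_attacker = pow(e, -1, (61 - 1) * (53 - 1))
print(pow(cipher, d_attacker, n))    # 42 -- the secret message recovered
```

With 61 and 53 the factoring step is trivial; with a 2048-bit n it is the eons-of-computer-time problem discussed below, which is exactly the gap a quantum factoring algorithm would close.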
Break it down to its basic components; you can't tell me that you're going to use factoring and guessing to add 2 numbers and the result will be faster.
Look, the QC approach to factoring doesn't break down directly into classic operations. Well, there are simulator programs which serialize all the states of the qubits; of course those programs run in exponential time. A QC doesn't guess numbers; it just lets you filter the right numbers out of all the possibilities in a few steps.
Factoring is used as an example (another example would be the traveling salesman problem), because it is exponentially complex.
To break a modern crypto key would require eons and eons of conventional computer time, just to factor the key numbers. This is where the quantum computer shines, because it can “shake out” the right answer in less-than-exponential time.
It wouldn’t make sense to use a quantum computer for addition or division (though I believe it could be done), because neither of those tasks are complex enough to require it.
Stop thinking of a quantum computer as a replacement for your “leet” AMD/Intel boxes that’ll get “uber” FPS in Doom3, and start thinking of it as a direct application of quantum physics and advanced statistics meant to solve problems that cannot be solved today.
Amen.
To break a modern crypto key would require eons and eons of conventional computer time,
It would take a supercomputer an hour or so.
If that were really true, then crypto would already be useless.
Notice how long the RSA key-breaking competitions take, and that's with a seti@home-like supercomputer working on a reduced keyspace, and on sub-standard-sized keys.
http://www.rsasecurity.com/rsalabs/node.asp?id=2093
For an 8k key (not unheard of), it would still take IBM’s “Blue Gene” ages.
This is what Q-comps are for.
You could model a brain, a storm, an entire atmosphere, galaxy collisions, maybe even thoughts with this kind of technology.
And here, no, you don't go from small bits to complex.
Here you have tools for solutions.
You feed in a set of complex input data and get out incredibly complex output data, without needing exponential time to compute (it's not even about computation, it's about analysis).
DonQ's replies above are very good and show why QC has always been "right around the corner" for the last decade. The various QC proposals all aim at exploiting the more esoteric features of quantum theory and don't lend themselves well to traditional computing.
The problem is that most reporters don’t understand this. If you read the article, you’d get two impressions: first, the author has no background in physics at all; second, the author expects this QC to replace regular computers. These devices (which probably are closer to being built now than ever before) will have very limited uses, but will really kick some butt in those areas.
If you wish to use quantum devices for a "normal" computer, you really need to look at devices using the quantum effect known as "tunneling." Superconductive junctions (known as Josephson junctions) use tunneling to make small current switches. These are very fast circuits that can be used in the traditional sense of computer science. Their main problem is that it takes special material and extremely cold temperatures to make them work. Also, the material wears out from repeated heating and cooling cycles.
A newer and better form of tunneling circuit is the PRESFET (planar resonant tunneling field-effect transistor). This device acts as a switch, is not made in layers like other QC devices, uses AlGaAs/GaAs, and doesn’t require anywhere near as much cooling as most quantum devices. What it DOES require is the ability to make circuit features in the range of 10 to 30 nanometers. Given that modern silicon processes are getting ready to move to 65 nm, we are almost to the point of being able to mass-produce PRESFET ICs. Once IC fabs can handle 10 to 30 nm feature sizes, look for a major improvement in traditional computing due to QC devices based on tunneling and electron gas effects.
The one thing I can't figure out is this: if a qubit is a superposition of both states that collapses randomly to 1 or 0 when you look at it, how do you get results other than noise out of the machine?
I assume there’s some cleverness in controlling the quantum superposition of the particles, but won’t you still end up with a computer telling you, say, ‘A’ 30% of the time, ‘B’ 60% of the time and ‘C’ 10% of the time? (Averaged out to a 30-60-10 spread over tons of calculations, naturally)
In which case, it seems fairly obvious that this wouldn’t replace normal computing where the same situations are expected to produce the same result every time.
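One standard answer, sketched below under my own assumptions (the `noisy_quantum_factor` stand-in and its 60% success probability are entirely made up): for problems like factoring, the machine's output may be random, but a candidate answer is cheap to verify classically, so you just sample until a candidate checks out.

```python
import random

def noisy_quantum_factor(n):
    """Stand-in for a quantum subroutine: returns a correct factor only
    some of the time, and junk otherwise (probabilities invented)."""
    return 53 if random.random() < 0.6 else random.randrange(2, n)

def factor_with_verification(n, tries=1000):
    """Sample the noisy oracle, keep only candidates that pass a
    cheap, deterministic classical check."""
    for _ in range(tries):
        candidate = noisy_quantum_factor(n)
        if n % candidate == 0:        # classical check: cheap and certain
            return candidate
    return None

print(factor_with_verification(3233))   # a true factor of 3233 (53 or 61)
```

So a 30-60-10 spread is fine whenever wrong answers are easy to reject; it only rules out uses where every single run must be deterministic.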
Quantum computers are decades away.
They are still interesting: calculations on a quantum computer could be infinitely parallel.
In fact they would be even more powerful than an infinitely parallel processor, as all possible branches of your code can be executed at once, automatically, at each step. Of course we are far from being able to master this power. But just a bunch of qubits would be more powerful than converting all the silicon in the universe into classical computers. Even if we might never need that amount of power, it's still interesting to understand how nature works at the quantum mechanical level.
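A hint of why "just a bunch of qubits" overwhelms classical hardware (my illustrative sketch, not from the post): describing an n-qubit register classically takes 2**n complex amplitudes, so the classical description doubles with every qubit added.

```python
def uniform_superposition(n_qubits):
    """State vector after putting each of n qubits into (|0> + |1>)/sqrt(2):
    2**n equal amplitudes, one per classical bit pattern."""
    dim = 2 ** n_qubits
    amp = (1 / dim) ** 0.5
    return [amp] * dim

state = uniform_superposition(10)
print(len(state))                    # 1024 amplitudes for just 10 qubits
print(sum(a * a for a in state))     # squared amplitudes sum to 1.0
```

At 300 qubits the list would need more entries than there are atoms in the observable universe, which is the sense in which the parallelism claim above is meant.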
Anyone interested in quantum mechanics and computing might find this link interesting:
http://www.quantumconsciousness.org/
Give it 50 years... maybe a little more, maybe a little less, but those of us in our 20's will probably see it in our lifetime. Can you imagine what our parents all thought when they were younger... computers were nothing more than fiction, or huge ridiculous mainframes with punchcards.
Knowledge increases exponentially... the more we learn, the faster we learn, the more we learn, and so on.
Computers will be the downfall of society... but yet, I'm still a computer geek lol