“A computer chip that performs calculations using probabilities, instead of binary logic, could accelerate everything from online banking systems to the flash memory in smart phones and other gadgets. Rewriting some fundamental features of computer chips, Lyric Semiconductor has unveiled its first “probability processor,” a silicon chip that computes with electrical signals that represent chances, not digital 1s and 0s.”
A field they haven’t touched on, but where I could see this chip being used, is developing “human-like” artificial intelligence.
We haven’t even figured out a way to get programs to behave as intended with a CPU that does exactly what you tell it to, and they want us to write code for a CPU with probabilistic behavior?
The only use I can currently think of is generating random numbers, and even then only in the form of a few special instructions in an otherwise ordinary CPU.
First step towards the Infinite Improbability Drive
Usually 1’s and 0’s are just different voltages on a line. I think that is a waste. Instead, we should be able to store thousands/millions/billions of voltages, and thus allow a tremendous boost in storage capacity – and totally new kinds of circuits/logic design. At the infinite level (as n -> infinity), we get back to analog computers!
The more voltage steps there are, the harder it is to robustly distinguish them. I remember reading something about 5V TTL, and someone did a calculation as to how many voltage levels would be reliable, and it came out between 2 and 3.
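Roughly, the back-of-the-envelope version of that calculation looks like the sketch below (the guard-band figure is my assumption to make the arithmetic concrete, not the number from whatever I originally read):

# How many voltage levels can a 5V TTL-style line reliably distinguish?
supply_swing = 5.0     # volts of total usable swing
min_separation = 2.0   # rough spacing needed between adjacent levels so that
                       # receiver thresholds plus noise margins still resolve them (assumed)
levels = supply_swing / min_separation
print(f"~{levels:.1f} distinguishable levels")   # ~2.5, i.e. somewhere between 2 and 3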
An example of the difficulty of using more than two voltage levels is MLC flash memory. SLC uses two levels per cell (one bit), so to store data you just fully drain or fully charge the cell; ECC takes care of any bits that don’t hold their charge properly. With MLC, you have to have four distinct voltage levels, requiring that the charge be tuned carefully. To write a cell, you drain it, and then start pumping it up gradually until it reaches the charge level that corresponds to the two bits being stored. The storage is much less robust, requiring much more expensive (in terms of redundant bits and decode time) ECC codes. This is why MLCs are so much slower to write and have so much higher read latency.
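As a rough sketch of that write sequence (the target voltages and step size below are made-up illustrative numbers, not real flash parameters):

# Toy model of MLC programming: drain, then pump charge in small steps
# until the cell reaches the level that encodes the two bits.
TARGET_LEVELS = {0b11: 0.0, 0b10: 1.2, 0b01: 2.4, 0b00: 3.6}  # 2 bits -> target cell voltage

def write_cell(bits, step=0.1):
    voltage = 0.0                  # start from a fully drained cell
    pulses = 0
    while voltage < TARGET_LEVELS[bits]:
        voltage += step            # one program pulse (plus a verify read in real hardware)
        pulses += 1
    return voltage, pulses

print(write_cell(0b01))            # (2.4000..., 24): far more work than a single SLC pulse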
Dial-up modems try to use 128 of the 256 dynamic levels available on digitally encoded POTS lines. When you make a land-line call, your voice gets digitally encoded (mu-law or A-law) into 8-bit samples at 8 kHz. The voice you hear from the other end has been converted back to analog. Thus, the maximum theoretical bandwidth is 64000 bits/sec. The levels aren’t linearly spaced, so modem encodings use only 128 of them, hence the 56 kbps modems. Unfortunately, even that 56K is not achievable, in part because you can’t rely on good line fidelity. Thus, encoding methods like trellis-coded modulation are used, which combine the modulation with convolutional forward-error-correcting codes so as to try to get a clean transmission at a lower bit rate.
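For anyone who wants to check the arithmetic behind those figures:

import math

sample_rate = 8000           # samples per second on a digital POTS trunk
bits_per_sample = 8          # mu-law / A-law codewords
print(sample_rate * bits_per_sample)            # 64000 -> theoretical ceiling in bits/sec
usable_levels = 128                             # only half of the 256 codes are usable in practice
print(sample_rate * math.log2(usable_levels))   # 56000.0 -> the familiar "56K" modem rate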
Also, consider what it would take in terms of circuitry to handle thousands of voltage levels in “multi-level digital logic”. CMOS transistors aren’t linear when operating in saturation. What you’d probably need to do is throw a lot of extra transistors at the circuitry. You’re probably much better off just going with binary. The circuits would be smaller, faster, more reliable, and use less energy.
Did anyone else think of Skynet while reading this?
Hi,
We’ve been there, we’ve done that, we realised it sucked, we switched to digital. Of course that was about 50 years ago and suckers with money will fall for anything…
http://en.wikipedia.org/wiki/Analog_computer
– Brendan
Certainly, that was then…
But now we have much better fabrication methods and signal processing that can control noise. Either way, this chip is designed for specialized purposes, replacing use of the FPU, which requires far more power to do the same job as this chip.
It is like saying that putting the GPU back in with the CPU is so new it’s old.
Yes, manufacturing processes are improving. But signal/(noise+variability) is decreasing even faster. That’s one of the main reasons why supply voltages no longer scale down proportionally to the process node.
The result is that the gap between analog and digital circuits keeps widening. Analog circuits sometimes need only a few transistors to implement a given function, but their size and power consumption are often larger than those of a digital block implementing an equivalent algorithm.
Analog circuits are actually very common in modern ICs, but they are used either when they have to be used (e.g. for interfacing to high-speed analog signals, or in A/D and D/A converters) or as a very specific optimization technique (high-speed logic, memory readout, etc.) that sacrifices other performance figures, usability, process scalability and so on.
I’m a bit skeptical about the chip in question. Not that they can’t do it; there is just not enough information about their technique to evaluate its performance characteristics, robustness, yield, scalability and so on.
Besides, if there is really such a demand for statistical processors, where are all the dedicated digital ASICs filling this niche? Why wait several years for a 10,000x calculation speedup from a potentially unreliable analog processor if you could get a 1,000x faster dedicated digital chip today for a fraction of the cost?
So they want to build a fuzzy logic processor in hardware. The application to vision systems seems interesting, though writing code for it is going to be a pain.
While finances have stopped most of my crunching for now, my first thought was how well this would work for distributed computing, particularly something like protein folding simulation. Folding@Home has been one of the better projects for supporting multiple platforms and looking into new hardware (PS3 and GPU worked well). From my (admittedly limited) understanding, it seems protein folding and energy mapping would benefit, helping projects like F@H, POEM, Docking, and some of the World Community Grid projects. Even if this is only a niche-market processor, some niches provide a lot of scientific value.
…We should just clone brain cells and create bio-chips…
Or a mentat from Dune.
If I remember correctly, I heard about something like this ages ago. It was called fuzzy logic or something like that…
It is indeed called that, and it’s extensively used in expert systems, in all kinds of decision systems, in AI, etc. Moving it to hardware may be a good idea, although, not unlike quantum computing, you’ll have to rethink every algorithm in a “fuzzy” way, and it won’t be usable or interesting for every task. Probably as a coprocessor of some kind.
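A tiny illustration of what that “fuzzy” rethink looks like (standard min/max fuzzy operators; nothing here is specific to Lyric’s chip):

def fuzzy_and(a, b):   # both memberships must be high
    return min(a, b)

def fuzzy_or(a, b):    # either membership may be high
    return max(a, b)

def fuzzy_not(a):
    return 1.0 - a

hot, humid = 0.8, 0.4                    # degrees of membership, not booleans
print(fuzzy_and(hot, humid))             # 0.4 -> "hot AND humid"
print(fuzzy_or(hot, fuzzy_not(humid)))   # 0.8 -> "hot OR not humid"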
Part of the logic here is that silicon-based transistors can’t shrink much more, one of the problems being more and more errors in the system. So the conclusion they drew was: if a deterministic system can’t be guaranteed, then assume the entire system can’t be trusted and design from there.
That’s an active field of research, and I agree that variability and plain reliability are a big problem. But it doesn’t look like that’s what the company is doing. They advertise a kind of specialized analog(?) statistical coprocessor (currently just a single cell) for applications in statistical computing.
It doesn’t really say how they’re doing this. Some commenters seem to assume that it is based on analog systems, but I wonder whether it may be based on pulse stream arithmetic.
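For what it’s worth, here’s a toy sketch of pulse-stream (stochastic) arithmetic; this is only my guess at the kind of thing such a chip might do, not anything Lyric has confirmed:

import random

def encode(p, n=10000):
    # a probability p becomes a bit stream where each bit is 1 with probability p
    return [1 if random.random() < p else 0 for _ in range(n)]

def decode(stream):
    return sum(stream) / len(stream)

a, b = encode(0.3), encode(0.5)
product = [x & y for x, y in zip(a, b)]   # a single AND gate per bit multiplies the probabilities
print(decode(product))                    # ~0.15 = 0.3 * 0.5, up to sampling noise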
The company says that the cell function is derived from the characteristics of single devices. That implies analog.
If you look inside a modern CPU or memory chip you’ll find that they often use non-rail-to-rail or “pulsed” logic signals. High-performance ALUs are generally implemented using dynamic circuits, which operate in a sort of “pulsed” way; memories use a reduced voltage swing to increase readout speed and lower power, etc. That’s also a form of analog circuitry, although used in otherwise digital ICs. These techniques don’t scale well, so general-purpose digital functions are standard-cell based.
This must be some sort of a scam, because we have been hearing about similar concepts for years to no avail. What does this new attempt bring to the table that was not available before?
If you were wondering what other statistical algorithm I was thinking about, I am talking about quantum computing. It is essentially the same concept, just implemented differently.
Analogue computing went out for a reason: precision and discreteness are important.