Discussion about this post

Suman Suhag:

A simple binary computer sequentially fetches command data from memory and then performs the logical operations encoded in that data. The [no-cloning theorem](https://en.wikipedia.org/wiki/No-cloning_theorem) rules out this approach for quantum information.

The standard approach to quantum computing instead sends multiple qubits through a sequence of quantum gates.

In this model, a binary computer is used to control the selection of quantum gates and the initial input to the quantum pipeline.

There would be no performance improvement in doing this with only traditional computing elements: it would amount to providing an auxiliary box to which the programmer sends binary words, and the box processes those words to produce logical outputs.

While this model would make no sense with traditional binary bits, the logic of qubits makes it attractive for quantum computing because quantum logic has a unique talent for processing entangled information. Using quantum logic is like finding needles in a haystack without checking every straw.
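The gate-pipeline model described above can be sketched with a plain state-vector simulation. This is a minimal illustration, not any particular quantum SDK: the "binary computer" picks a fixed list of gates and the initial input, and the qubits flow through the sequence. The two-gate circuit below (a Hadamard followed by a CNOT) produces an entangled Bell state, the kind of correlated information the comment refers to.

```python
import numpy as np

# Gates as unitary matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard on one qubit
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # controlled-NOT on two qubits

# Classical control: the "program" is just a pre-chosen sequence of gates.
circuit = [np.kron(H, I),   # Hadamard on qubit 0, identity on qubit 1
           CNOT]

# Initial input |00>, prepared by the controlling binary computer.
state = np.array([1, 0, 0, 0], dtype=complex)

# The qubits flow through the gate pipeline.
for gate in circuit:
    state = gate @ state

# Result: the Bell state (|00> + |11>)/sqrt(2) -- maximally entangled.
print(np.round(state.real, 3))
```

Note that the classical controller only selects gates and inputs; it never reads or copies the intermediate quantum state, which is exactly the constraint the no-cloning theorem imposes.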

Suman Suhag:

A bit of history. First, "computers" were people trained to solve complex problems (see the movie "Hidden Figures"). Later, computers were big and expensive machines that needed specially trained people to operate them. Then came minicomputers, which took up less space but had limited function and were used to control machines (look up the Naked Mini).

Then Intel decided, rather than build a complex custom circuit, to use a programmable system to (if my memory serves) run a mainframe disk drive. A couple of hobbyists wrote an article for Popular Electronics using that processor and a bunch of 100-pin connectors (because they found those on sale for the 50 or so kits they thought they would sell). They got a thousand orders within a week, and the hobby computer craze started. A couple of kids in California thought they could sell pre-built hobby computers for those who didn't want to solder their own, and allowed others to write programs for them (you may have heard of that company… Apple).

Up until this point, if you wanted more than one person to have access to a computer (or didn't want to wait in line for hours for the mainframe), you had to connect using a simple keyboard-and-monitor terminal.

While we can now hold in our hands computers far more capable than the old mainframes, there are some things your desktop can't provide: reliability and scale. A computer center (whether company-owned, shared, or a cloud center) has redundant communications links, 24/7 power with regularly tested backups, and reliable hot-swappable components. On a mainframe you can replace processors, memory, and mass storage without shutting anything down. On a smaller scale, what looks like a tower computer, if it's a server, will have redundant power supplies, error-correcting memory, and mass storage that can survive a disk failure, with a replacement disk swapped in and the array rebuilt in the background. And what happens when even the redundancy fails? Major airlines have had to stand down operations for several days. If a bank's data center fails, all its ATMs stop issuing money.

