Why do computers understand only 0 & 1 logic?

6 Responses

This answer has received 3 appreciations.

Here's a little bit of history for you. There was a time when electronics were just resistors, vacuum tubes, capacitors, etc. Clever people built different things, like light bulbs and electric motors. The problem, however, was that logic, like math, was really difficult to map onto circuits, especially since signals weakened as they travelled. Materials at the time weren't as pure and refined as we can make them today, so some rough way to tell defined signals apart had to be developed. The idea: establish voltage levels to transmit different information, for example 0-1V: 0, 1-3V: 1, 3-5V: 2, etc. Systems like that were built (see ternary logic, among others), but soon (very soon) afterwards, people found out that they were difficult to work with (logic-wise, and because the voltage is not stable at all). So they decided to simplify the whole idea: there should only be two levels - ~0V and anything else.
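
(If you want to see why the multi-level idea is fragile, here's a minimal Python sketch. The thresholds are the ones from the example above, and the 1.2V "sag" is just an arbitrary illustration of a weakened signal, not a real measurement.)

```python
# A minimal sketch of the multi-level scheme described above:
# 0-1V -> 0, 1-3V -> 1, 3-5V -> 2. With a weakened signal, a reading
# near a boundary decodes to the wrong symbol.

def decode_multilevel(voltage):
    """Map a measured voltage to one of three symbols."""
    if voltage < 1.0:
        return 0
    elif voltage < 3.0:
        return 1
    else:
        return 2

def decode_binary(voltage):
    """Two levels only: ~0V is 0, anything else is 1."""
    return 0 if voltage < 0.5 else 1

# A symbol sent as 2.0V that sags by 1.2V is misread under the
# multi-level scheme, but a binary 1 sent at 5V survives the same sag.
print(decode_multilevel(2.0 - 1.2))  # -> 0, but a 1 was sent
print(decode_binary(5.0 - 1.2))      # -> 1, still correct
```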

We could, in theory, go back to more than those two levels, but more levels means nonlinearly increasing complexity. Companies are trying to tackle that problem. SLOWLY. For example, Samsung is experimenting with TLC NAND SSDs, which store three bits per cell (8 levels); the first consumer SSDs with this technology were the 840 series. Another example of multi-state information handling is the qubit, which (via superdense coding) can carry up to two classical bits of information. As you know, however, quantum computing is also still in its early stages.
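
(As a rough illustration of what "three bits per cell" means, here's a sketch that packs three bits into one of eight levels and back. Real NAND uses Gray-coded, page-oriented mappings; this only shows the basic counting.)

```python
# A simplified sketch of how a TLC cell holds three bits: the eight
# possible 3-bit patterns map onto eight charge levels (0..7).

def pack_bits(b2, b1, b0):
    """Three bits -> one of eight cell levels."""
    return (b2 << 2) | (b1 << 1) | b0

def unpack_level(level):
    """One of eight cell levels -> three bits."""
    return (level >> 2) & 1, (level >> 1) & 1, level & 1

level = pack_bits(1, 0, 1)
print(level)                # 5
print(unpack_level(level))  # (1, 0, 1)
```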

This answer has received 1 appreciation.

Hipkiss said that we are humans, and he's right. In fact, the first computers we tried to build were based on the human-friendly decimal system - ENIAC, for example, computed with decimal digits. Wikipedia knows more.

So why didn't we keep on using 10 digits, which are so easy to understand? Because it didn't work well.

Let's say you were to build one of these computers. You have a copper wire, and you have to transmit, using a single current "shot", a signal that can represent 10 values.

You might map zero voltage to 0, up to 10V to 1, up to 20V to 2, up to 30V to 3, and so forth.

The problem with this scheme is that keeping the voltage stable is very hard; it fluctuates all the time. You would end up making wrong readings so frequently that your computer would be useless. Now, there are ways to improve signal quality, but you risk adding more complexity to keeping the signal stable than to the processing itself.

So we fall back to binary. Approximately 0V is read as binary 0 (there's tolerance up to 0.8V, IIRC) and approximately 5V is read as 1 (anything more than 2V, actually). There is a no-fly zone between 0.8V and 2V: any signal in that area cannot be interpreted reliably. What happens next is implementation-dependent. Reboot? Retransmit? Halt?
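
(Here's a small Python sketch of those thresholds. The 0.8V and 2V figures are the TTL input levels mentioned above; returning None for the no-fly zone is just one way of modelling "cannot be interpreted reliably".)

```python
# TTL-style input thresholds: below 0.8V reads as 0, above 2.0V reads
# as 1, and anything in between is undefined.

V_IL_MAX = 0.8  # highest voltage still read as logic 0
V_IH_MIN = 2.0  # lowest voltage read as logic 1

def read_ttl(voltage):
    if voltage <= V_IL_MAX:
        return 0
    if voltage >= V_IH_MIN:
        return 1
    return None  # the "no-fly zone": what happens is implementation-dependent

for v in (0.3, 1.4, 3.7):
    print(v, "->", read_ttl(v))
# 0.3 -> 0, 1.4 -> None, 3.7 -> 1
```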

This answer has received 1 appreciation.

Because we humans built them. It basically represents a switch: on/off. Two states.

Quantum computing will offer multiple states.

I like the simple but effective answer.

Primarily because they are made of electronic components.

It's not that they can't be designed to understand non-binary digits; it's just that signals are easier to transmit if there are only 2 levels.

The signals work in such a manner that one voltage level is taken to mean 1 and another voltage level means 0.

There are significant drawbacks if the number of voltage levels used for transmitting signals is more than 2, hence the binary system is used in practically all electronic components.

We have designed & dictated them to work that way.

We have designed a system built on discrete values; the simplest scenario is ON (1) and OFF (0) - either something exists / is true, or it doesn't / is false.

Every computer is made up of billions of transistors, each switched on or off according to the voltage/signal it receives. There is an internal continuous-to-discrete conversion process here: assume the voltage ranges from ~0 V to ~5 V. Every voltage in a small neighbourhood of 0 V is considered 0 = off = false, and voltage values around 5 V are considered 1 = on = true.

So, with the help of transistors, we have converted the continuous flow of electricity into a discrete system that is much simpler for mathematical computation.
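
(A toy sketch of that conversion, assuming an idealized inverter with a hard 2.5V switching point - real transistor transfer curves are smooth, but the effect is the same: every gate snaps a degraded level back to a clean rail.)

```python
# Toy model of continuous-to-discrete conversion: an idealized inverter
# outputs the opposite rail from whichever side of the threshold the
# input is on, so a noisy, degraded level is restored at every stage.

VDD = 5.0        # supply rail, the "1" level
THRESHOLD = 2.5  # switching point of the idealized inverter

def inverter(v_in):
    """Return ~0V for a 'high' input and ~5V for a 'low' input."""
    return 0.0 if v_in > THRESHOLD else VDD

# A badly degraded "1" (3.1V) passes through two inverters and
# comes out as a clean 5.0V again.
noisy_one = 3.1
print(inverter(inverter(noisy_one)))  # 5.0
```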

This is the mechanism, and I'd guess it's common sense in every computer engineer's underlying knowledge why a range consisting of only 2 digits (instead of other bases, e.g. decimal or hexadecimal) costs less and works better.

It's the easiest way to use electricity: on and off.

The CPU is basically a large array of transistors - electronic components (like gates) that can be turned on and off by giving them power or not. You can imagine that a string of 0s and 1s can switch multiple gates in any order you choose, and it's this massive puzzle of electron flow that eventually creates the characters on the screen as I type them. Marvelous!
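
(To make that concrete, here's a tiny sketch that models gates as functions of 0s and 1s and wires two of them into a half adder - the simplest piece of the arithmetic a CPU does.)

```python
# Model each gate as a tiny function of 0/1 inputs, then wire gates
# together. Two gates already give a half adder.

def AND(a, b):
    return a & b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Add two 1-bit values: returns (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} = carry {c}, sum {s}")
```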
