How does a computer understand binary code?
Can anyone explain, or share a link?
The computer does not "understand" it. Binary code was invented by people purely for their own convenience. Suppose a computer fell into the hands of an alien who knows nothing about computers, but who has an ammeter, a voltmeter, and a clock. At any moment he can measure the current and the potential at various points inside the computer and plot graphs, but he cannot draw any intelligible conclusion until he assumes that, say, a voltage from 4.8 to 5.2 volts represents a logical "one" and a voltage from -5.2 to -4.8 volts a logical "zero". Only from that assumption can he start to infer what the computer is actually doing. Without it, without imposing an interpretation on the signals, he cannot.
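The idea in the answer above can be sketched in a few lines of code. This is only an illustration: the specific voltage bands (4.8 to 5.2 V for "one", -5.2 to -4.8 V for "zero") are the hypothetical thresholds from the answer, not a real logic-family specification.

```python
def decode(voltage):
    """Map a measured voltage to a logical bit, or None if it falls
    outside both agreed-upon bands (i.e. no meaning was assigned)."""
    if 4.8 <= voltage <= 5.2:
        return 1   # within the band we chose to call "logical one"
    if -5.2 <= voltage <= -4.8:
        return 0   # within the band we chose to call "logical zero"
    return None    # signal exists, but our convention says nothing about it

# The same physical samples only become "bits" once the convention is applied:
samples = [5.0, -5.0, 4.9, -4.85, 0.3]
print([decode(v) for v in samples])  # [1, 0, 1, 0, None]
```

The point is that `decode` is pure convention: the electronics produce voltages either way, and "binary code" only appears once people agree on which ranges mean what.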