How does a computer (processor) distinguish the bit sequence 0000 from 00000?
I would also appreciate pointers to literature where this question is covered in detail.
It doesn't distinguish them, because both the first and the second are zero. When you write 0, what is actually stored is 8/16/32/64 zero bits (the exact number depends on the platform and context), so leading zeros carry no information.
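To illustrate the point, here is a minimal C sketch (the specific fixed-width types are just an example): the number of bits a value occupies is determined by its type, not by how many zeros you write in the source.

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint8_t  a = 0;   /* stored as exactly 8 zero bits  */
    uint32_t b = 0;   /* stored as exactly 32 zero bits */

    /* The width is a property of the type, not of the written literal. */
    printf("a occupies %zu bits\n", sizeof a * 8);  /* prints 8  */
    printf("b occupies %zu bits\n", sizeof b * 8);  /* prints 32 */

    /* Numerically the values are identical: 0000 == 00000 == 0. */
    printf("a == b: %d\n", a == b);  /* prints 1 */
    return 0;
}
```

Whether you write the literal as 0, 0000, or 00000, the compiler produces the same stored value; the processor only ever sees the fixed-width bit pattern of the type.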