Electronics
ulilu1372, 2017-03-01 09:32:14

How does a computer (processor) distinguish the bit sequence 0000 from 00000?

I would also like a pointer to literature where this issue is described in detail.


2 answers
VoidVolker, 2017-03-01
@ulilu1372

It doesn't, because both the first and the second are just zero. When you write 0, what is actually stored is 8/16/32/64 zero bits (the exact number depends on the platform and context).
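A minimal C sketch of this point (the fixed-width stdint.h types and 8-bit bytes are assumptions about a typical platform): the literal 0 carries no length of its own; the variable's type decides how many zero bits are actually stored.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* The literal 0 has no inherent length; the storage type does. */
        printf("uint8_t  zero: %zu bits\n", sizeof(uint8_t)  * 8);
        printf("uint16_t zero: %zu bits\n", sizeof(uint16_t) * 8);
        printf("uint32_t zero: %zu bits\n", sizeof(uint32_t) * 8);
        printf("uint64_t zero: %zu bits\n", sizeof(uint64_t) * 8);
        return 0;
    }

In memory, both 0000 and 00000 would be padded out to the same fixed word width, so inside the processor the distinction has to come from the type, not from the bits themselves.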

aol-nnov, 2017-03-01
@aol-nnov

Read up on information coding.

  • not all of your zeros are necessarily encoded as low signal levels (self-synchronizing codes; see the sketch after this list)
  • there are also dedicated synchronization signals
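To make the first point concrete, here is a minimal C sketch of Manchester (biphase) coding, one common self-synchronizing code. The answer names no specific code, so the 0 -> high-low, 1 -> low-high convention (the IEEE 802.3 one) is an assumption here. Each bit is sent as a pair of opposite line levels, so even a long run of zeros produces a transition in every bit cell, and the receiver can count exactly how many bits arrived.

    #include <stdio.h>

    /* Encode one data bit as two half-bit line levels
     * (IEEE 802.3 convention assumed: 0 -> "10", 1 -> "01"). */
    static void manchester_bit(int bit) {
        if (bit)
            printf("01");  /* 1: low-to-high transition mid-bit */
        else
            printf("10");  /* 0: high-to-low transition mid-bit */
    }

    int main(void) {
        /* Four zeros vs five zeros: on the wire the sequences differ
         * in length, and every bit cell contains a countable transition. */
        const char *four = "0000", *five = "00000";

        printf("0000  -> ");
        for (const char *p = four; *p; ++p) manchester_bit(*p - '0');
        printf("\n00000 -> ");
        for (const char *p = five; *p; ++p) manchester_bit(*p - '0');
        printf("\n");
        return 0;
    }

Running it prints 10101010 for 0000 and 1010101010 for 00000: the two sequences are distinguishable on the wire because every zero bit still carries a transition the receiver can count.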
