How does the processor recognize the instruction length?
Hello. Please help. I want to understand how a processor works. In Petzold's book "Code" there is this passage:
In the computer from chapter 17, all commands (except Stop) occupied 3 bytes: the operation code and a two-byte address. In the 8080 processor, commands are 1, 2, and 3 bytes long. Some commands tell the processor to read a byte from a specific memory location, others to write a byte to a specific memory cell, and others to perform some internal operation without accessing RAM. Having completed the first command, the processor reads the next one from memory, and so on. Taken together, these commands form a computer program whose purpose is to get the processor to do something.
Questions:
1. As I understand it, commands have different lengths. Then how does the processor determine the instruction length (1, 2, or 3 bytes) so it knows where the next instruction begins?
2. Why do all commands, except the Stop command, take 3 bytes?
3. How is a command fetched? Where can I find the command fetch logic?
P.S. My Russian is not very good, so the text may contain spelling errors. I apologize in advance))
The easiest scheme to understand: suppose, for example, that the two most significant bits of the command code determine its length. Then:
from 00 000000 to 00 111111 - single-byte commands
from 01 000000 to 01 111111 - two-byte commands
from 10 000000 to 10 111111 - three-byte commands
from 11 000000 to 11 111111 - four-byte commands
But this is not how the 8080 actually works; it is just to illustrate the general idea.
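A minimal sketch of that idea in C (a hypothetical encoding for illustration, not the real 8080 instruction set):

#include <stdint.h>

/* Hypothetical encoding (not the real 8080): the two most significant
 * bits of the command code give the total instruction length in bytes. */
static int insn_length(uint8_t opcode)
{
    switch (opcode >> 6) {      /* top two bits: 00, 01, 10 or 11 */
    case 0:  return 1;          /* 00xxxxxx -> single-byte command */
    case 1:  return 2;          /* 01xxxxxx -> two-byte command    */
    case 2:  return 3;          /* 10xxxxxx -> three-byte command  */
    default: return 4;          /* 11xxxxxx -> four-byte command   */
    }
}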
Also, in descriptions of processor architecture you have probably come across a block called the "instruction decoder"; its job is precisely to work out, from the command code, what the following bytes mean.
2. The 8080 has more than one single-byte command. Offhand: NOP, HLT, RST, the RET* family, rotates, register-to-register moves.
3. If we are talking about the 8080 specifically, it is easier and clearer to look at this not at the logic level but at the circuit level... so, here is one of the sources.
The 8080 is an 8-bit processor.
That is, it reads a command one byte at a time; once the first byte has been processed, it already knows how to handle the next one or two bytes.
For such things there is an internal decoder.
In earlier processors there was simply a special register (the opcode register) into which the first byte of the instruction was placed; the processor decoded it and then processed the following bytes according to that instruction.
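A rough sketch of that fetch/decode step in C (a toy model with a made-up insn_length decoder, not real 8080 circuitry): the first byte goes into the opcode register, the decoder reports the total length, any operand bytes are fetched, and the program counter moves past the whole instruction.

#include <stdint.h>

static uint8_t memory[65536];   /* 64 KB address space, like the 8080 */
static uint16_t pc = 0;         /* program counter */

/* Hypothetical decoder stub: total length implied by the first byte. */
static int insn_length(uint8_t opcode)
{
    if (opcode < 0x40) return 1;
    if (opcode < 0x80) return 2;
    return 3;
}

static void step(void)
{
    uint8_t opcode = memory[pc];     /* first byte -> "opcode register" */
    int len = insn_length(opcode);   /* decoder: how many bytes in total */
    uint8_t operand[2] = {0, 0};
    for (int i = 1; i < len; i++)    /* fetch any operand bytes that follow */
        operand[i - 1] = memory[(uint16_t)(pc + i)];
    pc += len;                       /* PC now points at the next instruction */
    (void)operand;                   /* execute(opcode, operand) would go here */
}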
How can the processor determine the instruction length (1, 2, or 3 bytes) to execute the next instruction?
Why do all commands, except the Stop command, take 3 bytes?
How is a command fetched? Where can I find the command fetch logic?
The first byte indicates which command it is, or at least the type of command. Next comes logic that either handles this particular one-byte command or reads the following bytes to refine the command further. Thus, as soon as the processor (with the help of the instruction decoder) understands what the command is, it executes it: it routes current to the necessary elements in the right direction.
One of the options described above is to encode the length in dedicated bits. You can also use a prefix code.
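In other words, for a processor like the 8080 the decoder effectively contains a table: every possible first byte has a fixed, known total length. A small sketch with a few genuine 8080 opcodes (abridged; the real table covers all 256 first-byte values):

#include <stdint.h>

/* Length of an 8080 instruction as implied by its first byte (abridged). */
static int i8080_length(uint8_t opcode)
{
    switch (opcode) {
    case 0x00: return 1;   /* NOP       - no operands        */
    case 0x76: return 1;   /* HLT       - no operands        */
    case 0x3E: return 2;   /* MVI A, d8 - one immediate byte */
    case 0x3A: return 3;   /* LDA a16   - two-byte address   */
    case 0xC3: return 3;   /* JMP a16   - two-byte address   */
    default:   return 1;   /* remaining opcodes omitted here */
    }
}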