If a computer used 3 voltage levels, would there be a formula for the amount of information?
The textbook gives the formula D = log2(N) bits, where D is the amount of information and N is the number of different states. I don't understand at all where the base 2 of the logarithm comes from. Searching online, I found a derivation through entropy: for a symbol to become information, you have to ask a question about it whose answer gives 1 bit of information, and that question has two possible outcomes: yes and no.

But if we had 3 states, then for one symbol the formula would give 2 bits (rounding up, because encoding 3 states with binary values takes at least 2 bits: 00, 01, 10), even though our question now has not two possible answers but three, so recording one symbol should likewise give us 1 unit of information. Isn't this formula really just a special case of representing information through a binary system, even though on the Internet it is presented as the general formula for information?
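For illustration only (my own sketch, not part of the question): evaluating the Hartley formula D = log2(N) directly shows that one symbol with 3 equally likely states carries log2(3) ≈ 1.585 bits, not 2 bits; rounding up to 2 describes the cost of a fixed-length binary encoding, not the amount of information itself.

```python
import math

def hartley_bits(n_states: int) -> float:
    # Hartley measure: information carried by one symbol
    # with n_states equally likely states, expressed in bits
    return math.log2(n_states)

print(hartley_bits(2))              # 1.0   -> a binary digit carries exactly 1 bit
print(hartley_bits(3))              # ~1.585 -> less than 2 bits
print(math.ceil(hartley_bits(3)))   # 2     -> bits needed for a fixed-length binary code
```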
https://medium.com/nuances-of-programming/introduction...
Everything here rests on the fact that a bit has 2 states; that is why the logarithm is base 2. If the elementary unit had 3 states, it would not equal 2 binary bits - you can't round like that (let's round your rent up to 100,000, is that okay with you?). The logarithm would be base 3, because each new trit multiplies the number of possible states by 3.
As a measure, this formula does not prescribe one fixed value; it describes the order of growth, so you can evaluate how the amount of information depends on the number of states.
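A rough sketch of this point (my own illustration, not part of the original reply): if the elementary unit has 3 states (a trit), the natural measure is a base-3 logarithm, and k trits cover 3^k states, just as k bits cover 2^k.

```python
import math

def info_units(n_states: int, base: int) -> float:
    # Number of base-`base` units needed to distinguish
    # n_states equally likely states
    return math.log(n_states, base)

print(info_units(3, 3))   # 1.0    -> one trit distinguishes 3 states
print(info_units(9, 3))   # 2.0    -> two trits distinguish 9 states (3 * 3)
print(info_units(3, 2))   # ~1.585 -> the same amount expressed in bits
```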