What does extremely low information entropy mean?

Information entropy measures the uncertainty in information; it can also be understood as the degree of disorder in the information. Extremely low information entropy therefore means that the uncertainty is very small, that is, the information is highly ordered and regular. In real life we often encounter such highly regular information, such as musical melodies or mathematical formulas, whose entropy values are very low.
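The idea above can be made concrete with Shannon's formula, H = -Σ p(x) log₂ p(x). Here is a minimal, illustrative sketch in Python (the function name `shannon_entropy` is ours, not from any particular library):

```python
from collections import Counter
import math

def shannon_entropy(data: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A perfectly regular string has zero entropy (no uncertainty at all),
# while eight equally likely symbols carry 3 bits each.
print(shannon_entropy("aaaaaaaa"))  # 0.0
print(shannon_entropy("abcdefgh"))  # 3.0
```

The repeated-character string illustrates "extremely low entropy": since the next symbol is always certain, its entropy is exactly zero.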

In the field of information technology, information entropy is also often used to estimate how compressible data is. Generally speaking, if the information entropy of a data set is very low, shorter codes can be used to compress it into a smaller space, and vice versa. For example, text data has relatively low entropy because the rules and grammar of a language make it predictable and redundant, so shorter codes can be used to store and transmit it.
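This link between entropy and compressibility is easy to demonstrate with a real compressor. The sketch below (using Python's standard `zlib` module; the sizes in the comments are approximate) compares highly regular data against random bytes:

```python
import os
import zlib

# Low-entropy data: a two-character pattern repeated 5,000 times (10,000 bytes).
low_entropy = b"ab" * 5000
# High-entropy data: 10,000 random bytes, which are essentially incompressible.
high_entropy = os.urandom(10000)

# The regular data shrinks to a few dozen bytes; the random data does not
# shrink at all (the compressed form is even slightly larger due to overhead).
print(len(zlib.compress(low_entropy)))   # a few dozen bytes
print(len(zlib.compress(high_entropy)))  # roughly 10,000 bytes
```

The lower the entropy of the input, the more redundancy the compressor can exploit, which matches the text's point that low-entropy data fits into a smaller space.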

Low information entropy does not necessarily mean that the information itself is of high quality. Sometimes information is restricted or filtered precisely in order to achieve extremely low entropy. For example, an authoritarian regime may restrict and censor speech so that the information the public receives is very orderly and uniform, but that does not make the information authentic or credible. We therefore need to keep improving our ability to evaluate information so that we are not misled by low-entropy information.