
Claude Shannon (Father of Information Theory)

Information theory is one of the basic theories of modern communication, and its founder is the American mathematician Claude Shannon. Shannon's contributions lie not only in information theory but also in cryptography and computer science. This article introduces Shannon's life and his contributions to information theory.

Claude Shannon's life

Claude Shannon, born in 1916, was an American mathematician and electrical engineer. He put forward the basic concepts and principles of information theory in the 1940s, laying the foundation for the development of modern communication technology. Shannon's achievement lies not only in his research results but also in his way of thinking and methodology. He introduced the concept of information entropy, a measure of information and one of the core concepts of information theory. He also proposed Shannon coding and Shannon's channel-capacity theorem, both widely used in the communication field.

Claude Shannon's contributions

Information entropy

Information entropy is one of the core concepts of information theory and a measure of the quantity of information. Shannon introduced the concept in 1948 as a way to measure the uncertainty of information: the greater the entropy, the greater the uncertainty. The formula for information entropy is:

H(X) = -Σ p(x_i) log2 p(x_i)

where p(x_i) is the probability that event x_i occurs, and the sum runs over all possible events.
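The formula above can be sketched directly in code. The function below (a minimal illustration, not from Shannon's paper) computes the entropy of a discrete probability distribution in bits:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.
    Terms with p = 0 contribute nothing (the limit of p*log p as p -> 0 is 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # ~0.469
```

Note how the biased coin's entropy is below 1 bit: less uncertainty means less information per observation.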

Shannon coding

Shannon coding is a data compression method that represents data in fewer bits. Shannon introduced it in 1948. Its core idea is to assign shorter codes to characters that occur more frequently and longer codes to characters that occur less frequently, which can greatly reduce the storage space the data requires.
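The core idea can be made concrete with the length rule from Shannon coding, which assigns each symbol a codeword of length ceil(-log2(p)). This is only a sketch of the length assignment; constructing the actual codewords (e.g. from cumulative probabilities) is omitted:

```python
import math

def shannon_code_lengths(probs):
    """Codeword length for each symbol under Shannon coding:
    l_i = ceil(-log2(p_i)). Frequent symbols get short codes,
    rare symbols get long ones."""
    return [math.ceil(-math.log2(p)) for p in probs]

# Symbols with probabilities 1/2, 1/4, 1/8, 1/8
# get codewords of 1, 2, 3, and 3 bits respectively.
print(shannon_code_lengths([0.5, 0.25, 0.125, 0.125]))  # [1, 2, 3, 3]
```

The resulting lengths always satisfy the Kraft inequality, so a valid prefix-free code with those lengths exists.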

Shannon's theorem

Shannon's theorem is an important result of information theory that describes the upper limit of information transmission over a noisy communication channel. Shannon put it forward in 1948: in a noisy channel, the achievable data rate is limited by the channel bandwidth and the signal-to-noise ratio. The formula is:

C = B * log2(1 + SNR)

where C is the channel capacity (the maximum data rate, in bits per second), B is the channel bandwidth in hertz, and SNR is the signal-to-noise ratio expressed as a linear power ratio, not in decibels.