Definition of information
1. Information refers to the objects transmitted and processed by audio, message, and communication systems, and in a broad sense it denotes everything that is disseminated in human society. By obtaining and identifying different information from nature and society, people can distinguish different things and thereby understand and transform the world. In all communication and control systems, information is the universal form of connection.

2. The word "information" is information in English, French, German and Spanish, intelligence in Japanese, information in Taiwan Province province of China and news in ancient China. As a scientific term, it first appeared in the article "Information Transmission" written by Hatle in 1928. The representative expression is as follows:

Claude Shannon, the founder of information theory, held that "information is that which is used to eliminate random uncertainty"; this is widely regarded as a classic definition and is frequently quoted.

Norbert Wiener, the founder of cybernetics, held that "information is a name for the content of what is exchanged with the outer world as we adjust to it, and make our adjustment felt upon it"; this is also cited as a classic definition.

Experts in economics and management hold that "information is effective data for decision-making".

Additional information:

Information has the following characteristics:

1. The greater the probability P(x) that a message x occurs, the less information it carries; conversely, the smaller the probability of occurrence, the greater the amount of information. In other words, the amount of information (denoted I) is a decreasing function of the probability of the message.

2. When the probability is 1, the outcome is already certain, so the amount of information is 0.

3. When a message consists of several mutually independent smaller messages, the information it contains should equal the sum of the information contained in each of the smaller messages.

These characteristics are captured exactly by a logarithmic function: the relationship between the amount of information and the probability of the message is I = -log_a P(x). Information can therefore be quantified, and a quantity needs a unit: weight is measured in kilograms and height in meters, so what unit should measure information? The amount of information is usually measured in bits, which is convenient because a single binary symbol whose two values are equally likely carries exactly 1 bit of information.
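To make the relationship concrete, here is a minimal Python sketch (the function name self_information is illustrative, not from the source) that computes I = -log_a P(x) with a = 2, so the result is in bits, and checks the characteristics listed above:

```python
import math

def self_information(p, base=2):
    """Self-information I = -log_base(p) of a message with probability p.
    With base 2 the result is measured in bits."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log(p, base)

# A certain event (p = 1) carries no information.
print(self_information(1.0))             # 0.0

# Rarer messages carry more information; one fair binary symbol carries 1 bit.
print(self_information(0.5))             # ~1.0 bit
print(self_information(0.125))           # ~3.0 bits

# Additivity: for independent messages P(x, y) = P(x) * P(y),
# and the logarithm turns that product into a sum of information amounts.
p_x, p_y = 0.5, 0.25
print(self_information(p_x * p_y))                     # ~3.0 bits
print(self_information(p_x) + self_information(p_y))   # ~3.0 bits
```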
