Scheme for transmitting information over a communication channel


Transfer of information via technical communication channels

Shannon scheme

The American scientist Claude Shannon, one of the founders of information theory, proposed a scheme of the process of transmitting information through technical communication channels (Fig. 1.3).

Fig. 1.3. Diagram of a technical information transmission system

The operation of such a scheme can be explained using the familiar process of talking on the phone. The source of information is the person speaking. The encoder is the microphone of the telephone handset, with the help of which sound waves (speech) are converted into electrical signals. The communication channel is the telephone network (the wires and the switches of the telephone exchanges through which the signal passes). The decoder is the earpiece of the handset of the listening person, the receiver of information. Here the incoming electrical signal is converted back into sound.

Here, information is transmitted in the form of a continuous electrical signal. This is analog communication.

Encoding and decoding of information

Coding is any transformation of the information coming from a source into a form suitable for its transmission over a communication channel.

At the dawn of the radio era, Morse code was used. The text was converted into a sequence of dots and dashes (short and long signals) and broadcast. A person receiving such a transmission by ear had to be able to decode it back into text. Even earlier, Morse code was used in telegraph communication. Transmitting information using Morse code is an example of discrete communication.
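As an illustration of such discrete coding, here is a minimal Python sketch that encodes and decodes a message using a small, hypothetical subset of the Morse alphabet (only a few letters are included; the full international code is larger):

```python
# A minimal sketch of discrete (Morse-style) coding and decoding.
# Only a small subset of the alphabet is included for illustration.
MORSE = {
    "A": ".-", "B": "-...", "E": ".", "S": "...", "O": "---", "T": "-",
}
REVERSE = {code: letter for letter, code in MORSE.items()}

def encode(text: str) -> str:
    """Convert text into a space-separated sequence of dots and dashes."""
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

def decode(signals: str) -> str:
    """Convert a space-separated dot/dash sequence back into text."""
    return "".join(REVERSE[code] for code in signals.split())

message = "SOS"
transmitted = encode(message)      # "... --- ..."
received = decode(transmitted)     # "SOS"
print(transmitted, received)
```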

Nowadays digital communication is widely used, in which the transmitted information is encoded in binary form (0 and 1 are the binary digits) and then decoded back into text, images or sound. Digital communication is obviously also discrete.
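A similarly minimal sketch of digital coding, assuming plain 8-bit character codes purely for illustration (real systems use richer encodings), converts text to a bit string and back:

```python
# A sketch of digital (binary) coding: text -> bit string -> text,
# using 8-bit ASCII codes purely for illustration.
def to_bits(text: str) -> str:
    return "".join(format(ord(ch), "08b") for ch in text)

def from_bits(bits: str) -> str:
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

encoded = to_bits("Hi")             # "0100100001101001"
print(encoded, from_bits(encoded))  # restores "Hi"
```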

Noise and noise protection. Shannon's coding theory

Information is transmitted over communication channels using signals of various physical natures: electrical, electromagnetic, light, acoustic. The information content of a signal lies in the value, or in the change of the value, of some physical quantity (current strength, brightness of light, etc.). The term "noise" refers to various kinds of interference that distort the transmitted signal and lead to loss of information. Such interference arises primarily for technical reasons: poor quality of communication lines, insufficient isolation of different streams of information transmitted over the same channels from one another. Often, when talking on the phone, we hear noise and crackling that make it hard to understand the other person, or our conversation is overlaid with the conversation of other people. In such cases noise protection is necessary.

First of all, technical methods of protecting communication channels from the effects of noise are applied. Such methods can be very different, sometimes simple, sometimes very complex: for example, using shielded cable instead of bare wire, or using various kinds of filters that separate the useful signal from the noise, etc.

Shannon developed a special coding theory that provides methods of combating noise. One of the important ideas of this theory is that the code transmitted over the communication line must be redundant. Thanks to this, the loss of some part of the information during transmission can be compensated. For example, if the line is poor when you are talking on the phone, then by repeating each word twice you have a better chance of being understood correctly.

However, the redundancy must not be made too large. That would lead to delays and a higher cost of communication. Shannon's coding theory allows one to obtain a code that is optimal: the redundancy of the transmitted information is the minimum possible, and the reliability of the received information is the maximum possible.
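As a toy illustration of redundant coding, and not of Shannon's optimal constructions, the sketch below repeats every bit three times and decodes by majority vote, so a single distorted bit in each triple is corrected at the cost of tripling the message length:

```python
# A toy redundancy scheme: triple repetition with majority-vote decoding.
# This only illustrates the idea of redundancy; it is far from optimal.
def encode(bits: str) -> str:
    return "".join(b * 3 for b in bits)

def decode(received: str) -> str:
    out = []
    for i in range(0, len(received), 3):
        triple = received[i:i + 3]
        out.append("1" if triple.count("1") >= 2 else "0")
    return "".join(out)

sent = encode("101")          # "111000111"
noisy = "110000111"           # one bit corrupted in the first triple
print(decode(noisy))          # "101" - the error is corrected
```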

In modern digital communication systems, the following technique is often used to combat the loss of information during transmission. The entire message is divided into portions, or packets. For each packet a checksum (the sum of its binary digits) is calculated and transmitted along with the packet. At the receiving end the checksum of the received packet is recalculated, and if it does not coincide with the transmitted one, the transmission of this packet is repeated. This continues until the transmitted and received checksums match.
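A rough sketch of this packet-and-checksum technique is given below; the checksum (a byte sum modulo 256) and the simulated unreliable channel are simplified stand-ins for what real protocols use:

```python
import random

# A sketch of packet transmission with checksum verification and
# retransmission over a simulated unreliable channel.
def checksum(packet: bytes) -> int:
    return sum(packet) % 256

def unreliable_channel(packet: bytes, error_rate: float = 0.3) -> bytes:
    """Occasionally corrupt one byte of the packet."""
    if random.random() < error_rate and packet:
        data = bytearray(packet)
        data[random.randrange(len(data))] ^= 0x01
        return bytes(data)
    return packet

def send_packet(packet: bytes) -> bytes:
    """Retransmit the packet until the received checksum matches."""
    expected = checksum(packet)
    while True:
        received = unreliable_channel(packet)
        if checksum(received) == expected:
            return received  # checksums match, packet accepted

message = b"HELLO, WORLD"
packets = [message[i:i + 4] for i in range(0, len(message), 4)]
restored = b"".join(send_packet(p) for p in packets)
print(restored == message)  # True
```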

Briefly about the main thing

Any technical system for transmitting information consists of a source, a receiver, encoding and decoding devices, and a communication channel.

Coding is the transformation of information coming from a source into a form suitable for its transmission over a communication channel. Decoding is the inverse transformation.

Noise is interference that leads to the loss of information.

Coding theory has developed methods of representing transmitted information that reduce its loss under the influence of noise.

Questions and tasks

1. Name the main elements of the information transfer scheme proposed by K. Shannon.

2. What is encoding and decoding when transmitting information?

3. What is noise? What are its consequences in the transmission of information?

4. What are some ways to deal with noise?

EC TsOR: Part 2, conclusion, addition to chapter 1, § 1.1. TsOR No. 1.

Previously, the source of information was defined as an object or subject that generates information and is able to present it in the form of a message, i.e. a sequence of signals in a material medium. In other words, the source connects information with its material carrier. The transmission of a message from a source to a receiver is always associated with some non-stationary process occurring in a material medium; this condition is mandatory, since information itself is neither a material object nor a form of existence of matter. There are many ways to transmit information: mail, telephone, radio, television, computer networks, etc. Yet with all the variety of specific implementations of communication methods, common elements can be identified in them, presented in Fig. 5.1.

The diagram should be understood as follows. The source that generates information must present it for transmission in the form of a message, i.e. a sequence of signals, and to do so it must use some coding system. The device that performs the encoding of information may be a subsystem of the source (for example, our brain generates information, encodes it by means of language, and then presents it in the form of a speech message through the speech organs; a computer processes and stores information in binary representation, but when displaying it on the monitor screen it recodes it into a form convenient for the user).

A situation is also possible when the encoding device is external in relation to the source of information, for example, a telegraph machine or a computer in relation to the operator working on it. Next, the codes must be translated into a sequence of material signals, i.e. placed on a physical medium; this operation is performed by the converter. The converter can be combined with the encoding device (for example, in a telegraph machine), but it can also be an independent element of the communication line (for example, a modem that converts the discrete electrical signals of the computer into analog signals at a frequency at which their attenuation in telephone lines is minimal). Converters also include devices that transfer a message from one medium to another, for example, a megaphone or telephone that converts voice signals into electrical signals; a radio transmitter that converts voice signals into radio waves; a television camera that converts images into a sequence of electrical impulses. In the general case, during conversion the output signals do not completely reproduce all the features of the input message, but only its essential aspects, i.e. during conversion some information is lost. For example, the frequency bandwidth of telephone communication is from 300 to 3400 Hz, while the frequencies perceived by the human ear lie in the range of 16 to 20,000 Hz (i.e. telephone lines "cut off" high frequencies, which leads to sound distortion); in black-and-white television, the color of the image is lost during conversion. It is in this connection that the task arises of developing a message encoding method that would provide the most complete representation of the original information during conversion and, at the same time, be consistent with the speed of information transmission over the given communication line.

After the converter, the signals enter the communication channel and propagate through it. The concept of a communication channel includes both the material medium and the physical or other process by means of which the message is transmitted, i.e. the propagation of signals in space over time. Examples of some communication channels are given below.

After the message passes through the communication channel, the signals are converted by the receiving converter into a sequence of codes, which are presented by the decoding device in the form required by the receiver of information. At the reception stage, as during transmission, the converter can be combined with the decoding device (for example, in a radio or TV set) or exist independently (for example, a modem).

The concept of a communication line combines all the elements of the scheme in Fig. 5.1, from the source to the receiver of information. The characteristics of any communication line are the speed with which a message can be transmitted over it and the degree to which the message is distorted during transmission. Of these parameters, we will single out those that relate directly to the communication channel, i.e. that characterize the medium and the transmission process.

  • 2. Addition of probabilities of independent incompatible events
  • 3. Multiplication of probabilities of independent joint events
  • 4. Finding the average for the values of independent random variables
  • 5. The concept of conditional probability
  • 6. General formula for the probability of events occurring
  • 7. General formula for the probability of the sum of events
  • Lecture 3. The concept of entropy
  • 1. Entropy as a measure of uncertainty
  • 2. Properties of entropy
  • 3. Conditional entropy
  • Lecture 4. Entropy and information
  • 1. Volumetric approach to measuring the amount of information
  • 2. Entropy approach to measuring the amount of information
  • Lecture 5. Information and alphabet
  • Lecture 6. Statement of the coding problem. Shannon's first theorem.
  • Lecture 7. Methods for constructing binary codes. Alphabetical non-uniform binary coding with signals of equal duration. Prefix codes.
  • 1. Statement of the problem of optimization of non-uniform coding
  • 2. Non-uniform code with a separator
  • 3. Codes without a separator. The Fano condition
  • 4. Shannon–Fano prefix code
  • 5. Huffman prefix code
  • Lecture 8. Methods for constructing binary codes. Other options
  • 1. Uniform alphabetic binary coding. Byte code
  • 2. International byte coding systems for text data. Universal text data encoding system
  • 3. Alphabetic coding with unequal duration of elementary signals. Morse code
  • 4. Block binary coding
  • 5. Graphic data encoding
  • 6. Encoding of audio information
  • Lecture 9. Number systems. Representation of numbers in different number systems. Part 1
  • 1. Number systems
  • 2. Decimal number system
  • 3. Binary number system
  • 4. 8- and 16-ary number systems
  • 5. Mixed number systems
  • 6. The concept of economy of a number system
  • Lecture 10. Number systems. Representation of numbers in different number systems. Part 2.
  • 1. The task of converting a number from one number system to another
  • 2. Converting integers q → p
  • 3. Converting integers p → q
  • 4. Converting fractions p → q
  • 6. Converting numbers between binary, octal and hexadecimal number systems
  • Lecture 11. Coding numbers in a computer and operations on them
  • 1. Normalized numbers
  • 2. Converting a number from its natural form to its normalized form
  • 3. Converting normalized numbers
  • 4. Encoding and processing unsigned integers
  • 5. Encoding and processing signed integers
  • 6. Coding and processing of real numbers
  • Lecture 12. Transmitting information over communication lines
  • 1. General scheme for transmitting information in a communication line
  • 2. Communication channel characteristics
  • 3. Effect of noise on channel capacity
  • Lecture 13. Ensuring the reliability of information transfer.
  • 1. Statement of the problem of ensuring transmission reliability
  • 2. Codes that detect a single error
  • 3. Codes that correct a single error
  • Lecture 14. Methods of transmitting information in computer communication lines
  • 1. Parallel data transfer
  • 2. Serial data transmission
  • 3. Communication of computers via telephone lines
  • Lecture 15. Data classification. Representation of data in computer memory
  • 1. Data classification
  • 2. Representation of elementary data in RAM
  • Lecture 16. Classification of data structures
  • 1. Classification and examples of data structures
  • 2. The concept of logical notation
  • Lecture 17. Organization of data structures in RAM and on external media
  • 1. Organization of data structures in RAM
  • 2. Hierarchy of data structures on external media
  • 3. Features of storage devices
  • Control questions
  • Bibliography
Lecture 12. Transmitting information over communication lines

      General scheme for transmitting information in a communication line

      Communication channel characteristics

      The influence of noise on channel capacity

    1. General scheme for transmitting information in a communication line

    The use of information to solve any problems is, of course, associated with the need for its dissemination, that is, with the need to carry out processes of transmitting and receiving information. In this case, it is necessary to solve the problem of matching the encoding method with the characteristics of the communication channel, as well as to ensure the protection of the transmitted information from possible distortions.

    The source of information is defined as an object or subject that generates information and has the ability to present it in the form of a message, that is, a sequence of signals in a material medium. In other words, the source of information connects information with its material carrier. The transmission of a message from source to receiver is always associated with some non-stationary process occurring in the material environment – this condition is mandatory, since the information itself is not a material object.

    There are many ways to transmit information: mail, telephone, radio, television, computer networks, etc. However, with all the variety of specific implementations of communication methods, common elements can be identified in them: the source and the receiver of information, encoding and decoding devices, a converter of codes into signals and a converter of signals into codes, a communication channel, as well as sources of noise (interference) and means of protection against noise (see the diagram in Fig. 4).

    The diagram should be understood as follows. The source that generates information must present it for transmission in the form of a message, that is, a sequence of signals, and to do so it must use some coding system. The device that performs the encoding of information may be a subsystem of the information source. For example, our brain generates information and encodes it by means of language (for example, Russian), and then presents the information in the form of a speech message through the speech organs. A computer processes and stores information in binary representation, but when displaying it on the monitor screen it is again the computer that recodes it into a form convenient for the user.

    A situation is also possible when the encoding device is external in relation to the source of information, for example, a telegraph machine or a computer in relation to the person, the operator, working on it. Next, the codes must be translated into a sequence of material signals, that is, placed on a material medium; this operation is performed by the converter. The converter may be combined with the encoding device (for example, in a telegraph machine), but it may also be an independent element of the communication line (for example, a modem that converts the discrete electrical signals of the computer into analog signals at a frequency at which their attenuation in telephone lines is minimal).

    Converters also include devices that transfer a message from one medium to another. For example:

      a telephone that converts sound signals into electrical signals;

      a radio transmitter that converts sound signals into radio waves;

      a television camera that converts images into a sequence of electrical impulses.

    Fig. 4. General scheme of information transfer

    In the general case, during conversion, output signals do not completely reproduce all the features of the input message, but only its most essential aspects, that is, during conversion, part of the information is lost. For example, the frequency bandwidth of telephone communications ranges from 300 to 3400 Hz, while the frequencies perceived by the human ear range from 16 to 20,000 Hz.

    Thus, telephone lines "cut off" high frequencies, which leads to sound distortion; in black-and-white television, when the message is converted into signals, the color of the image is lost. It is in connection with these problems that the task arises of developing a message encoding method that would provide the most complete representation of the original information during conversion and, at the same time, would be consistent with the speed of information transmission over the given communication line.

    After the converter, the signals enter the communication channel and propagate through it. The concept of a communication channel includes both the material medium and the physical or other process by means of which the message is transmitted, that is, the propagation of signals in space over time.

    Table 20 gives examples of some communication channels.

    Table 20. Examples of communication channels

    | Communication channel | Medium | Message carrier | Process used to transmit the message |
    | Mail, couriers | Human habitat | — | Mechanical movement of the carrier |
    | Telephone, computer networks | Conductor | Electric charges | Movement of charges (electric current) |
    | Radio, television | Electromagnetic field | Electromagnetic waves | Propagation of electromagnetic waves |
    | — | — | Light waves | Propagation of light waves |
    | — | — | Sound waves | Propagation of sound waves |
    | Smell, taste | Air, food | Chemical substances | Chemical reactions |
    | Touch | Skin surface | Object affecting the skin | Heat transfer, pressure |

    Any real communication channel is subject to external influences, and internal processes can also occur in it, as a result of which the transmitted signals, and consequently the messages associated with them, are distorted. Such influences are called noise (interference). Sources of interference may be external and internal. External interference includes, for example, "noise" from powerful consumers of electricity or from atmospheric phenomena, and the simultaneous action of several nearby similar sources of messages (several people talking at once). Interference can also be caused by internal features of the given communication channel, for example, physical inhomogeneities of the carrier or the attenuation of the signal in the communication line, which becomes significant when the receiver is far from the source.

    If the level of interference turns out to be commensurate with the power of the signal carrying information, then the transmission of information through this channel becomes impossible. Even relatively low levels of noise can cause significant distortion of the transmitted signal.

    Various anti-interference methods exist and are used: for example, shielding of electrical communication lines, improving the selectivity of the receiving device, and so on. Another way to protect against interference is to use special methods of encoding the information.
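    As a simple illustration of such protective coding, the sketch below appends one parity bit to each data block, which lets the receiver detect any single distorted bit (real error-detecting codes are more elaborate):

```python
# A sketch of single-error detection with one parity bit per data block.
def add_parity(bits: str) -> str:
    """Append a bit that makes the total number of ones even."""
    return bits + ("1" if bits.count("1") % 2 else "0")

def check_parity(block: str) -> bool:
    """Return True if the block still has an even number of ones."""
    return block.count("1") % 2 == 0

block = add_parity("1011001")      # "10110010"
corrupted = "10100010"             # one bit flipped during transmission
print(check_parity(block))         # True  - accepted
print(check_parity(corrupted))     # False - error detected, request retransmission
```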

    After the message passes through the communication channel, the signals are translated by the receiving converter into a sequence of codes, which the decoding device presents in the form required by the receiver of information (in a form the receiver can perceive). At the reception stage, as during transmission, the converter can be combined with the decoding device (for example, in a radio or TV set) or exist separately from it (the converter of a modem can exist separately from the computer).

    The concept of a communication line combines the elements of the scheme (Fig. 4) between the source and the receiver of information. The characteristics of any communication line are the speed with which a message can be transmitted over it and the degree to which the message is distorted during transmission. Of these parameters, we will single out below those that relate directly to the communication channel, i.e. that characterize the medium and the transmission process.
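    To make the chain source – encoder – channel – decoder – receiver concrete, here is a toy Python model of a communication line; every stage is a deliberately trivial stand-in rather than a real device:

```python
import random

# A toy model of the communication line: source -> encoder -> channel (with
# noise) -> decoder -> receiver. Every stage is a deliberately trivial stand-in.
def encoder(message: str) -> list[int]:
    """Encode text as a list of 8-bit values (the 'codes')."""
    return [ord(ch) for ch in message]

def channel(signals: list[int], noise_level: float = 0.0) -> list[int]:
    """Propagate signals; with some probability a signal is distorted by noise."""
    return [s ^ 0x01 if random.random() < noise_level else s for s in signals]

def decoder(codes: list[int]) -> str:
    """Turn the received codes back into text for the receiver."""
    return "".join(chr(c) for c in codes)

received = decoder(channel(encoder("hello"), noise_level=0.0))
print(received)  # "hello" - with noise_level > 0 some characters get distorted
```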


    Communication channel characteristics

    Next, we will consider communication channels through which messages are transmitted using electrical impulses. From a practical point of view, as well as for computer communication lines, these channels are of the greatest interest.

    Bandwidth

    Any converter whose operation is based on the use of oscillations (electrical or mechanical) can generate and transmit signals only within a limited frequency range. (The example of telephone communication was given above.) The same applies to radio and television: the entire frequency spectrum is divided into ranges (LW, MW, SW, VHF, UHF), within which each station occupies its own sub-band so as not to interfere with the broadcasts of others.

    The frequency range used by a given communication channel to transmit signals is called bandwidth.

    For building the theory it is not the bandwidth itself that is important, but the maximum frequency in this band, νm, since it is this frequency that determines the possible speed of information transmission over the channel.

    The duration of an elementary pulse can be determined from the following considerations. If the signal parameter changes sinusoidally, then, as can be seen from the figure, in one oscillation period T the signal will have one maximum value and one minimum value.

    Fig. 10.

    If we approximate the sinusoid by rectangular pulses and shift the reference point to the level of the minimum value, the signal takes only two values: the maximum (denote it "1") is a pulse, the minimum (denote it "0") is a pause. A pulse and a pause can be considered elementary signals; with the chosen approximation their durations are obviously the same and equal to τ0 = T/2.

    If the pulses are produced by a clock generator with frequency νm, then T = 1/νm and therefore τ0 = 1/(2νm).

    Thus, every τ0 seconds a pulse or a pause can be transmitted, and certain codes can be associated with their sequence. It is possible, in principle, to use signals of longer duration than τ0 (for example, 2τ0): this does not lead to loss of information, although it reduces the speed of its transmission over the channel. Using signals shorter than τ0 can lead to loss of information, since such signals would take intermediate values between the minimum and the maximum, which would complicate their interpretation.

    Thus, νm determines the duration τ0 of the elementary signal used to convey the message.

    Communication channel capacity

    If the transmission of one pulse carries an amount of information I_imp and the pulse is transmitted in time τ0, then the ratio of I_imp to τ0 obviously reflects the average amount of information transmitted over the channel per unit time. This quantity is a characteristic of the communication channel and is called the channel capacity C:

    C = I_imp / τ0

    If I_imp is expressed in bits and τ0 in seconds, then the unit of measurement of C is bit/s. Previously such a unit was called the baud, but the name did not catch on, and for this reason the throughput of a communication channel is measured in bit/s. Derived units are:

    • 1 Kbit/s = 10³ bit/s,
    • 1 Mbit/s = 10⁶ bit/s,
    • 1 Gbit/s = 10⁹ bit/s.
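    A minimal sketch of these relations, assuming each elementary signal carries exactly 1 bit, is given below; the value νm = 3400 Hz is simply the telephone band limit quoted earlier, so the resulting capacity is an idealized estimate rather than a real line speed:

```python
# A sketch of the relations above: elementary signal duration and channel
# capacity, assuming each elementary pulse carries exactly 1 bit.
def elementary_duration(nu_m: float) -> float:
    """tau_0 = 1 / (2 * nu_m): duration of an elementary pulse or pause."""
    return 1.0 / (2.0 * nu_m)

def channel_capacity(nu_m: float, bits_per_pulse: float = 1.0) -> float:
    """C = I_imp / tau_0, in bits per second."""
    return bits_per_pulse / elementary_duration(nu_m)

nu_m = 3400.0                          # upper frequency of the telephone band, Hz
print(elementary_duration(nu_m))       # ~1.47e-4 s
print(channel_capacity(nu_m))          # 6800 bit/s for this idealized model
```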

    Information transfer rate

    Suppose an amount of information I is transmitted over the communication channel in time t. We can introduce a quantity characterizing the speed of information transfer, the information transfer rate J:

    J = I / t

    The dimension of J, like that of C, is bit/s. What is the relationship between these characteristics? Since τ0 is the minimum duration of an elementary signal, C corresponds to the maximum speed of information transmission over the given communication line, i.e. J ≤ Jmax = C. Thus, the maximum speed of information transmission over a communication channel is equal to its throughput.

    Entropy and information

    Random events can be described using the concept of "probability". The relations of probability theory make it possible to find (calculate) the probabilities of both single random events and complex experiments that combine several independent or interconnected events. However, random events can be described not only in terms of probabilities.

    The fact that an event is random means that there is no complete certainty of its occurrence, which in turn creates uncertainty in the outcomes of experiments associated with this event. Of course, the degree of uncertainty differs from situation to situation. For example, if the experiment consists of determining the age of a randomly selected full-time first-year university student, then with a high degree of confidence we can say that it will be less than 30 years: although by the regulations persons up to 35 years of age may study full-time, most full-time students are recent school graduates. The same experiment has far less certainty if we check whether the age of a randomly selected student is less than 18 years. For practice it is important to be able to estimate numerically the uncertainty of different experiments. Let us try to introduce such a quantitative measure of uncertainty.

    Let us start with the simple situation where an experiment has n equally probable outcomes. Clearly, its uncertainty depends on n, i.e. the measure of uncertainty is a function of the number of outcomes, f(n).

    Some properties of this function can be stated:

    • 1. f(1) = 0, because for n = 1 the outcome of the experiment is not random and, therefore, there is no uncertainty;
    • 2. f(n) increases with n, because the greater the number of possible outcomes, the more difficult it becomes to predict the result of the experiment.

    The unit of measurement of the uncertainty of an experiment with two possible equally probable outcomes is called the bit.

    An explicit form has been established for the function that describes the measure of uncertainty of an experiment with n equally probable outcomes:

    f(n) = log2 n

    This quantity is called entropy. In what follows we will denote it H. Statement: entropy is equal to the information about the experiment that is contained in it.

    This can be stated more precisely:

    The entropy of an experiment is equal to the information that we obtain as a result of carrying it out.

    Properties of information:

    • 1. I(α, β) ≥ 0, and I(α, β) = 0 if and only if the experiments α and β are independent.
    • 2. I(α, β) = I(β, α), i.e. the information is symmetric with respect to the order of the experiments.
    • 3. The information of an experiment is equal to the average value of the amount of information contained in any one of its outcomes.

    It is easy to obtain a corollary of this formula for the case when all n outcomes are equally likely. In this case all p_i = 1/n, and therefore

    I = log2 n.

    This formula was derived in 1928 by the American engineer R. Hartley and bears his name. It relates the number of equally probable states n and the amount of information I in the message that one particular state among them has occurred. Its meaning is that if some set contains n elements and x belongs to this set, then to single x out (identify it unambiguously) among the others requires an amount of information equal to log2 n.

    A special case of the application of Hartley's formula is the situation when n = 2^k. Substituting this value into Hartley's formula, we obviously get I = log2 2^k = k bits.
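    A short sketch of Hartley's formula in Python (the chosen values of n are arbitrary examples):

```python
import math

# A sketch of Hartley's formula: the information needed to single out one
# element among n equally likely ones is log2(n) bits.
def hartley(n: int) -> float:
    return math.log2(n)

print(hartley(2))    # 1.0 bit  - one yes/no question
print(hartley(8))    # 3.0 bits - n = 2**3, the special case giving exactly k = 3
print(hartley(100))  # ~6.64 bits - non-integer values are also meaningful
```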

    Shannon's formula

    Let the probabilities p1, p2, ..., pn with which the system assumes each of its n possible states be known. Then the entropy of the system is given by Shannon's formula:

    H = − (p1·log2 p1 + p2·log2 p2 + ... + pn·log2 pn)

    The same expression serves as the formula for measuring the amount of information.

    Properties of entropy

    If all n states of the system are equally probable, then H = log2 n (the Hartley formula). This is the case of maximum entropy.
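    A short Python sketch of Shannon's formula (the probability distributions are arbitrary examples):

```python
import math

# A sketch of Shannon's formula: H = -sum(p_i * log2(p_i)) over the states.
def entropy(probabilities: list[float]) -> float:
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))            # 1.0 bit - two equally likely states
print(entropy([0.25] * 4))            # 2.0 bits - equals log2(4), the maximum
print(entropy([0.9, 0.05, 0.05]))     # ~0.57 bit - a skewed, less uncertain system
```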

    Shannon's first theorem.

    In the absence of interference, it is always possible to encode a message in such a way that the code redundancy will be arbitrarily close to zero.

    Shannon's second theorem.

    When transmitting information over a noisy channel, there is always a coding method in which the message will be transmitted with arbitrarily high reliability if the transmission speed does not exceed the channel capacity.

    The specifics of different areas of application of information transmission systems require different approaches to the implementation of such systems. A transmission system working over telephone communication channels, for example, is completely different from a space or tropospheric communication system, both in technical design and in parameters. Nevertheless, there is much in common in the principles of construction and in the purpose of the individual devices of various systems. In the general case, the diagram of an information transmission system is shown in Fig. 2.

    Messages of a wide variety of physical natures can be transmitted: digital data received from a computer, speech, telegram texts, control commands, results of measuring various physical quantities. Naturally, all these messages must first be converted into electrical oscillations that retain all the properties of the original messages, and then unified, i.e. presented in a form convenient for subsequent transmission. In what follows, the source of information in Fig. 2 is understood as a device in which all the operations mentioned above are performed.

    For more economical use of the communication line, as well as to reduce the influence of various interferences and distortions, the information transmitted from the source can be further converted using an encoding device.

    Fig. 2. Block diagram of information transfer.

    This transformation, as a rule, consists of a number of operations, including taking into account the statistics of incoming information to eliminate redundancy (statistical coding), as well as the introduction of additional elements to reduce the influence of interference and distortion (noise-resistant coding).

    As a result of a series of transformations, a sequence of elements is formed at the output of the encoding device, which, with the help of a transmitter, is converted into a form convenient for transmission over a communication line. A communication line is a medium through which signals are transmitted from a transmitter to a receiver. Taking into account the influence of the environment is necessary. In the theory of information transmission, the concept of “communication channel” is often encountered - this is a set of means that ensure the transmission of signals.

    In addition to signals that have passed through the medium, the receiver input also receives various noises. The receiver selects a sequence from the mixture of signal and noise that must correspond to the sequence at the output of the encoder. However, due to interference, environmental influences, and errors in various transformations, complete compliance cannot be obtained. Therefore, such a sequence is entered into a decoding device, which performs operations to convert it into a sequence corresponding to the transmitted one. The completeness of this correspondence depends on a number of factors: the correcting capabilities of the encoded sequence, the level of signal and interference, as well as their statistics, and the properties of the decoding device. The sequence generated as a result of decoding reaches the recipient of the information. Naturally, when designing information transmission systems, they always strive to ensure such operating conditions that the difference between the information received from the source and the information transmitted to the recipient is small and does not exceed a certain permissible value. In this case, the main indicator of transmission quality is the reliability of information transmission - the degree of correspondence of the received message to the transmitted one.
