UDC 372.8(072)

EDUCATIONAL INFORMATION FLOWS AND THEIR MEASUREMENT

Uryvskaya T.Yu., Cand. Sci. (Phys.-Math.),

Petshauer M.Yu.,

Shcheglevatykh N.N.

Military Educational and Scientific Center of the Air Force "N.E. Zhukovsky and Yu.A. Gagarin Air Force Academy", Voronezh, Russia

u_tanya2002@mail.ru

DOI: 10.12737/16007

 

Abstract: the article discusses the concept of information as part of the educational process and the procedure for calculating information. Quantitative indicators of information are calculated.

Keywords: information, quantity of information, information capacity, density of the information flow.

 

 

Every day we encounter new knowledge and new information, whether through computer technology and the Internet, or through radio, television, and other media. Information theory is widely used in technical diagnostics, especially in the construction of optimal diagnostic processes. Having emerged as a mathematical theory of communication in the works of Wiener and Shannon, information theory was later applied in other scientific fields as a general statistical theory of communication systems. In diagnostics, such systems are the system of conditions (diagnoses) and the associated system of signs.

Central to information theory is the notion of the entropy of a system. Information entropy is a measure of the uncertainty or unpredictability of information, that is, the uncertainty of the occurrence of any symbol of the source alphabet. In the absence of information loss it is numerically equal to the amount of information per symbol of the transmitted message. For example, in the sequence of letters that make up a sentence in Russian, different letters occur with different frequencies, so the uncertainty of occurrence is lower for some letters than for others. If one also takes into account that some combinations of letters occur very rarely (in this case one speaks of n-th order entropy), the uncertainty is reduced even further. Thus, entropy is the amount of information per elementary message of a source that produces statistically independent messages [1].
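As an illustrative aside (not part of the original article), the first-order entropy described above can be estimated directly from observed symbol frequencies using the Shannon formula H = -sum(p_i * log2 p_i). The following minimal Python sketch does this for an arbitrary sample string; the sample text and the function name are chosen here purely for illustration.

from collections import Counter
from math import log2

def shannon_entropy(text):
    # Empirical counts of each symbol in the message.
    counts = Counter(text)
    total = len(text)
    # H = -sum(p * log2(p)) over the observed symbol probabilities.
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Arbitrary sample message; the result is in bits per symbol.
sample = "information theory measures uncertainty"
print(round(shannon_entropy(sample), 3))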

References

1. Uryvskaya T.Yu., Strizhak E.D. Ob izmereniyakh potokov uchebnoy informatsii // Sovremennye problemy estestvoznaniya. Inzhenernyy analiz ob''ektov obespecheniya aviatsii: sbornik statey po materialam II Mezhvuzovskoy nauchno-prakticheskoy konferentsii kursantov i slushateley «Molodezhnye chteniya pamyati Yu.A. Gagarina», 20 maya 2015 g.

2. Traynev V. A., Traynev I. V. Informatsionnye kommunikatsionnye pedagogicheskie tekhnologii. M., 2008.

3. Voronin A. M. Informatsionnye i kommunikatsionnye tekhnologii v sovremennom vysshem obrazovanii. Almaty, 2008.

