WHAT IS INFORMATION THEORY?

From Wikipedia:

Information theory studies the quantification, storage, and communication of information. It was originally proposed by Claude E. Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper entitled "A Mathematical Theory of Communication". Now this theory has found applications in many other areas, including statistical inference, natural language processing, cryptography, neurobiology,[1] the evolution[2] and function[3] of molecular codes, model selection in ecology,[4] thermal physics,[5] quantum computing, linguistics, plagiarism detection,[6] pattern recognition, and anomaly detection.[7]

A key measure in information theory is "entropy". Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process.
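As a concrete illustration (not part of the Wikipedia excerpt above), the entropy of a discrete random variable X with outcome probabilities p(x) is H(X) = -sum_x p(x) log2 p(x), measured in bits. Below is a minimal Python sketch of this formula; the function name and the example probabilities are chosen purely for illustration.

    import math

    def entropy(probabilities):
        """Shannon entropy (in bits) of a discrete probability distribution."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin is maximally uncertain: 1 bit of entropy.
    print(entropy([0.5, 0.5]))   # 1.0
    # A biased coin is more predictable, so its entropy is lower.
    print(entropy([0.9, 0.1]))   # ~0.469

The fair coin reaches the maximum possible entropy for two outcomes, while the biased coin's outcome is easier to guess, which is exactly the drop in uncertainty that entropy measures.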