Entropy

<theory> A measure of the disorder of a system.

Isolated systems tend to go from a state of order (low entropy) towards a state of maximum disorder (high entropy).

The entropy of a system is related to the amount of information it contains.

A highly ordered system can be described using fewer bits of information than a disordered one.

For example, a string containing one million "0"s can be described using run-length encoding as [("0", 1000000)] whereas a string of random symbols (e.g. bits, or characters) will be much harder, if not impossible, to compress in this way.
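
As a rough illustration (a sketch only, not part of the original entry; the helper name run_length_encode is invented for this example), a few lines of Python show the difference in compressibility:

    import random
    from itertools import groupby

    def run_length_encode(s):
        # Collapse each run of identical symbols into a (symbol, count) pair.
        return [(symbol, len(list(run))) for symbol, run in groupby(s)]

    ordered = "0" * 1_000_000
    scrambled = "".join(random.choice("01") for _ in range(1_000_000))

    print(len(run_length_encode(ordered)))    # 1 pair: the whole string is a single run
    print(len(run_length_encode(scrambled)))  # roughly 500,000 pairs: no useful compression

The ordered string collapses to a single (symbol, count) pair, while the random string produces about one pair per short run, so the encoded form is no shorter than the original.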

Shannon's formula gives the entropy H(M) of a message M in bits:

H(M) = -log2 p(M)

Where p(M) is the probability of message M. (Strictly, this is the information content of the particular message M; the entropy of a message source is the average of this quantity over all the messages it can produce.)
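
For example (a sketch only; the helper name information_bits is invented for this illustration), a message drawn uniformly from 256 equally likely possibilities has p(M) = 1/256 and so carries 8 bits:

    import math

    def information_bits(p):
        # H(M) = -log2 p(M): bits needed to encode a message of probability p.
        return -math.log2(p)

    print(information_bits(1 / 2))    # 1.0 bit  (one fair coin toss)
    print(information_bits(1 / 256))  # 8.0 bits (one uniformly random byte)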


