This Wikipedia entry is what I was exposed to at university;
it was called Communication Theory at that time. The so-called
“other” entropies still baffle me. I call it the
“candy-store” entropy, and I have been utterly brainwashed and
hung out to dry on its “truth”.
Communication requires a transmitter, a channel, and a receiver.
The theory allows electrical engineers to describe the
information content of alphabetic characters and words so that
optimal (but noisy) channels can be designed. In my analogy,
e, P, p, E is the alphabet, the atoms and composite
particles are the words, the radiations are the noise,
and the channel is space(time).
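The "information content of alphabetic characters" idea can be made concrete with Shannon's entropy formula, H = -Σ p·log₂(p), which gives the average bits per symbol of a source. Here is a minimal sketch (the function name `shannon_entropy` is mine, not from any source):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information content, in bits per symbol, of a message."""
    counts = Counter(message)
    total = len(message)
    # H = -sum over symbols of p * log2(p)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A skewed alphabet carries less information per symbol than a uniform one.
print(shannon_entropy("aaab"))  # about 0.811 bits: mostly 'a'
print(shannon_entropy("abcd"))  # 2.0 bits: four equally likely symbols
```

A uniform four-symbol alphabet hits the 2-bit maximum; any bias toward some symbols lowers the per-symbol information, which is exactly what lets a channel be used optimally.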
To me, the analogy is fascinating; so I speculate
that most, if not all, models of particles and their
interactions can be built not with Newton's old-style
calculus, but with the more modern concepts of Shannon's
information (communication) theory instead.
But it is important to know that I view this as merely a
hypothesis; the theory comes to me courtesy of Boltzmann, Gibbs, et al., and finally Shannon.
Entropy is an extensive property, and its paths are real,
reversible, conserved, and symmetric (usually).
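The extensive character of entropy shows up in Shannon's version too: for independent sources, the entropy of the combined system is the sum of the parts, H(X, Y) = H(X) + H(Y). A small sketch (the helper `entropy` and the example distributions are mine, chosen only for illustration):

```python
from itertools import product
from math import log2

def entropy(dist: dict) -> float:
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Two independent sources with made-up distributions.
X = {"0": 0.5, "1": 0.5}
Y = {"a": 0.25, "b": 0.75}

# Joint distribution of the independent pair: p(x, y) = p(x) * p(y).
joint = {x + y: px * py for (x, px), (y, py) in product(X.items(), Y.items())}

print(entropy(joint))                # entropy of the combined system
print(entropy(X) + entropy(Y))       # sum of the parts: the same number
```

This additivity for independent subsystems is the information-theoretic counterpart of entropy being extensive in thermodynamics.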
Thank you, cough, cough.