## Maximum entropy probability distribution

In statistics and information theory, a maximum entropy probability distribution is a probability distribution whose entropy is at least as great as that of all other members of a specified class of distributions. If nothing is known about a distribution except that it belongs to a certain class, then the maximum entropy distribution for that class is often chosen as a default, according to the principle of maximum entropy. The motivation is twofold: first, maximizing entropy, in a sense, means minimizing the amount of prior information built into the distribution; second, many physical systems tend to move towards maximal entropy configurations over time.
## Definition of entropy

If $X$ is a discrete random variable with distribution given by

$$\Pr(X = x_k) = p_k \quad \text{for } k = 1, 2, \ldots,$$

then the entropy of $X$ is defined as

$$H(X) = -\sum_k p_k \log p_k.$$

If $X$ is a continuous random variable with probability density $p(x)$, then the entropy of $X$ is defined as

$$H(X) = -\int_{-\infty}^{\infty} p(x) \log p(x) \, dx,$$

where $p(x) \log p(x)$ is understood to be zero whenever $p(x) = 0$. The base of the logarithm is not important as long as the same one is used consistently: a change of base merely rescales the entropy. Information theoreticians may prefer to use base 2 in order to express the entropy in bits; mathematicians and physicists will often prefer the natural logarithm, resulting in a unit of nats or nepers for the entropy.

## Examples of maximum entropy distributions

## Given mean and standard deviation: the normal distribution

The normal distribution $N(\mu, \sigma^2)$ has maximum entropy among all real-valued distributions with specified mean $\mu$ and standard deviation $\sigma$. Therefore, if all that is known about a distribution is its mean and standard deviation, the principle of maximum entropy suggests the normal distribution as a default.

## Uniform and piecewise uniform distributions

The uniform distribution on the interval $[a, b]$ is the maximum entropy distribution among all continuous distributions supported on $[a, b]$.

More generally, if we're given a subdivision $a = a_0 < a_1 < \cdots < a_k = b$ of the interval $[a, b]$ together with probabilities $p_1, \ldots, p_k$ that sum to one, then we can consider the class of all continuous distributions that assign probability $p_j$ to the interval $[a_{j-1}, a_j]$. The density of the maximum entropy distribution for this class is constant on each of the intervals $[a_{j-1}, a_j)$: it equals $p_j / (a_j - a_{j-1})$ there.

The uniform distribution on the finite set $\{x_1, \ldots, x_n\}$ (which assigns probability $1/n$ to each of these values) is the maximum entropy distribution among all discrete distributions supported on this set.

## Positive and given mean: the exponential distribution

The exponential distribution with mean $1/\lambda$ is the maximum entropy distribution among all continuous distributions supported in $[0, \infty)$ that have a mean of $1/\lambda$. In physics, this occurs when gravity acts on a gas that is kept at constant pressure and temperature: if $X$ describes the height of a molecule, then $X$ is exponentially distributed, which means the density of the gas decays exponentially with height.

## Discrete distributions with given mean

Among all the discrete distributions supported on the set $\{x_1, \ldots, x_n\}$ with mean $\mu$, the maximum entropy distribution has the form

$$\Pr(X = x_k) = C r^{x_k} \quad \text{for } k = 1, \ldots, n,$$

where the positive constants $C$ and $r$ are determined by the requirements that the probabilities sum to 1 and the expected value equals $\mu$.

For example, if a large number $N$ of dice are thrown and you are told only that the sum of all the shown numbers is $S$, then a reasonable assumption about the frequencies of the faces 1 through 6 is the maximum entropy distribution supported on $\{1, \ldots, 6\}$ with mean $\mu = S/N$.

Finally, among all the discrete distributions supported on the infinite set $\{x_1, x_2, \ldots\}$ with mean $\mu$, the maximum entropy distribution has the shape

$$\Pr(X = x_k) = C r^{x_k} \quad \text{for } k = 1, 2, \ldots,$$

where again the constants $C$ and $r$ are determined by the requirements that the probabilities sum to 1 and the expected value equals $\mu$.

## A theorem by Boltzmann

All the above examples are consequences of the following theorem by Boltzmann.

## Continuous version

Suppose $S$ is a closed subset of the real numbers $\mathbb{R}$ and we're given $n$ measurable functions $f_1, \ldots, f_n$ and $n$ numbers $a_1, \ldots, a_n$. We consider the class $C$ of all continuous random variables which are supported on $S$ (i.e. whose density function is zero outside of $S$) and which satisfy the $n$ expected value conditions

$$\operatorname{E}(f_j(X)) = a_j \quad \text{for } j = 1, \ldots, n.$$

If there is a member in $C$ whose density function is positive everywhere in $S$, and if there exists a maximal entropy distribution for $C$, then its probability density $p(x)$ has the following shape:

$$p(x) = c \, \exp\!\left(\lambda_1 f_1(x) + \cdots + \lambda_n f_n(x)\right) \quad \text{for all } x \in S,$$

where the constants $c$ and $\lambda_j$ have to be determined so that the integral of $p(x)$ over $S$ is 1 and the above conditions for the expected values are satisfied.

Conversely, if constants $c$ and $\lambda_j$ like this can be found, then $p(x)$ is the density of the (unique) maximum entropy distribution for the class $C$.

This theorem is proved with the calculus of variations and Lagrange multipliers.

## Discrete version

Suppose $S = \{x_1, x_2, \ldots\}$ is a (finite or infinite) countable subset of the reals and we're given $n$ functions $f_1, \ldots, f_n$ and $n$ numbers $a_1, \ldots, a_n$. We consider the class $C$ of all discrete random variables $X$ which are supported on $S$ and which satisfy the $n$ conditions

$$\operatorname{E}(f_j(X)) = a_j \quad \text{for } j = 1, \ldots, n.$$

If there exists a member of $C$ which assigns positive probability to all members of $S$, and if there exists a maximum entropy distribution for $C$, then this distribution has the following shape:

$$\Pr(X = x_k) = c \, \exp\!\left(\lambda_1 f_1(x_k) + \cdots + \lambda_n f_n(x_k)\right) \quad \text{for } k = 1, 2, \ldots,$$

where the constants $c$ and $\lambda_j$ have to be determined so that the probabilities sum to 1 and the above conditions for the expected values are satisfied.

Conversely, if constants like this can be found, then the above distribution is the maximum entropy distribution for $C$.

This version of the theorem can be proved with the tools of ordinary calculus and Lagrange multipliers.
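The discrete entropy definition, and the claim that the uniform distribution maximizes it on a finite set, can be checked numerically. A minimal sketch in Python; the helper `entropy` is our own, not a library function:

```python
import math

def entropy(p):
    """Shannon entropy in nats; terms with p_k == 0 contribute zero."""
    return -sum(q * math.log(q) for q in p if q > 0)

# The uniform distribution on a 6-point set attains the maximum, log 6 ...
uniform = [1 / 6] * 6
print(entropy(uniform))                     # log 6, about 1.7918 nats

# ... and any other distribution on the same support has strictly less.
skewed = [0.10, 0.20, 0.15, 0.15, 0.20, 0.20]
print(entropy(skewed) < entropy(uniform))   # True
```

Dividing the result by `math.log(2)` converts it to bits, reflecting the change-of-base remark above.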
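The statement about the normal distribution can be illustrated with known closed-form differential entropies: fixing the variance to 1 across a few common families, the normal comes out largest. A sketch; the closed-form entropy expressions are standard, while the choice of comparison families (Laplace and uniform) is ours:

```python
import math

sigma = 1.0  # common standard deviation for all three candidates

# Differential entropies in nats, from the standard closed forms:
h_normal = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

b = sigma / math.sqrt(2)               # Laplace scale: variance = 2 * b**2
h_laplace = 1 + math.log(2 * b)

width = math.sqrt(12) * sigma          # uniform width: variance = width**2 / 12
h_uniform = math.log(width)

print(h_normal, h_laplace, h_uniform)  # the normal entropy is the largest
```

All three distributions here have mean 0 and standard deviation 1, so the comparison is exactly the setting of the normal-distribution claim above.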
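The dice example can be made concrete: on the support {1, …, 6} with a prescribed mean, the maximum entropy distribution has the Boltzmann form Pr(X = k) = C rᵏ, and since the mean is an increasing function of r, the constant r can be found by a one-dimensional search. A sketch, assuming this setup; the solver name and its bracketing interval are our own choices:

```python
import math

def maxent_given_mean(xs, mu, lo=1e-6, hi=1e6, iters=200):
    """Fit p_k = C * r**x_k on the finite support xs so the mean equals mu.

    The mean is monotone increasing in r, so bisection on r converges;
    C is then fixed by normalization.
    """
    def mean_for(r):
        w = [r ** x for x in xs]
        return sum(x * wk for x, wk in zip(xs, w)) / sum(w)

    for _ in range(iters):
        mid = math.sqrt(lo * hi)   # bisect in log space, since r > 0
        if mean_for(mid) < mu:
            lo = mid
        else:
            hi = mid
    w = [lo ** x for x in xs]
    z = sum(w)
    return [wk / z for wk in w]

# Dice whose observed average face value S/N is 4.5:
p = maxent_given_mean([1, 2, 3, 4, 5, 6], 4.5)
print(p)  # probabilities increase geometrically towards the face 6
```

Any other distribution on {1, …, 6} with mean 4.5, for instance the uniform distribution on {3, 4, 5, 6}, has strictly smaller entropy than the distribution this returns.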
## Caveats

Note that not all classes of distributions contain a maximum entropy distribution. It is possible that a class contains distributions of arbitrarily large entropy (e.g. the class of all continuous distributions on $\mathbb{R}$ with mean 0 but arbitrary standard deviation), or that the entropies are bounded above but the bound is not attained. It is also possible that the expected value restrictions for the class $C$ force the probability distribution to be zero in certain subsets of $S$. In that case the theorem above doesn't apply, but one can work around this by shrinking the set $S$.

## References

- T. M. Cover and J. A. Thomas, *Elements of Information Theory*. Wiley, 1991. Chapter 11.
Categories: Entropy and information | Particle statistics

This article is licensed under the GNU Free Documentation License. It uses material from the Wikipedia article "Maximum_entropy_probability_distribution". A list of authors is available in Wikipedia.