Journal
COMPUTERS & MATHEMATICS WITH APPLICATIONS
Volume 65, Issue 10, Pages 1438-1456
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.camwa.2013.01.004
Keywords
Uncertainty; Markov process; Lyapunov function; Entropy; Maxent; Inference
The maximum entropy approach (Maxent) was developed as a minimization of subjective uncertainty, measured by the Boltzmann-Gibbs-Shannon entropy. Many new entropies were invented in the second half of the 20th century, and there now exists a rich choice of entropies to fit different needs. This diversity has given rise to a kind of Maxent anarchism: the Maxent approach now means the conditional maximization of an appropriate entropy when evaluating a probability distribution from partial and incomplete information. The rich choice of non-classical entropies raises a new problem: which entropy is best for a given class of applications? We understand entropy as a measure of uncertainty that increases in Markov processes. In this work, we describe the most general ordering of the distribution space with respect to which all continuous-time Markov processes are monotone (the Markov order). For inference, this approach yields a set of conditionally most random distributions, each of which is a maximizer of its own entropy. This uncertainty of uncertainty is unavoidable in the analysis of non-equilibrium systems. Surprisingly, a constructive description of this set of maximizers is possible; two decomposition theorems for Markov processes provide the tool for this description. (C) 2013 Elsevier Ltd. All rights reserved.
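The abstract's central premise, that entropy is a Lyapunov function which increases along Markov processes, can be illustrated numerically. The sketch below (not from the paper; the rate values are invented for illustration) integrates the master equation of a hypothetical 3-state continuous-time Markov chain with symmetric rates, so that the equilibrium distribution is uniform and the Boltzmann-Gibbs-Shannon entropy grows monotonically toward its maximum, log 3:

```python
import math

# Hypothetical symmetric rate matrix q[i][j] = q[j][i] (made-up values).
# Symmetry implies the uniform distribution is the equilibrium, so the
# Boltzmann-Gibbs-Shannon entropy increases monotonically in time.
rates = [
    [0.0, 1.0, 0.5],
    [1.0, 0.0, 2.0],
    [0.5, 2.0, 0.0],
]

def shannon_entropy(p):
    """Boltzmann-Gibbs-Shannon entropy -sum p_i log p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def step(p, dt=1e-3):
    """One explicit Euler step of the master equation
    dp_i/dt = sum_j (q_ji p_j - q_ij p_i)."""
    n = len(p)
    return [
        p[i] + dt * sum(rates[j][i] * p[j] - rates[i][j] * p[i]
                        for j in range(n))
        for i in range(n)
    ]

p = [0.7, 0.2, 0.1]  # initial distribution, far from uniform
entropies = [shannon_entropy(p)]
for _ in range(5000):
    p = step(p)
    entropies.append(shannon_entropy(p))

# Entropy is non-decreasing along the trajectory...
assert all(a <= b + 1e-12 for a, b in zip(entropies, entropies[1:]))
# ...and approaches its maximum, log(3), at the uniform equilibrium.
assert abs(entropies[-1] - math.log(3)) < 1e-2
```

For a non-uniform equilibrium the same monotonicity holds for the relative entropy (Kullback-Leibler divergence to equilibrium), which decreases in time; the paper's Markov order generalizes this to the whole family of entropies at once.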