Entropy, Concentration of Probability and Conditional Limit Theorems

Lewis, J. T. and Pfister, C.-E. and Sullivan, W. G. (1994) Entropy, Concentration of Probability and Conditional Limit Theorems. (Preprint)


Full text: DIAS-STP-94-43.pdf (Download, 3MB)

Abstract

We provide a framework in which a class of conditional limit theorems can be proved in a unified way. We introduce three concepts: a concentration set for a sequence of probability measures, generalizing the Weak Law of Large Numbers; conditioning with respect to a sequence of sets satisfying a regularity condition; and the asymptotic behaviour of the information gain of one sequence of probability measures with respect to another. These concepts are required for the statement of our main abstract result, Theorem 5.1, which describes the asymptotic behaviour of the information gain of a sequence of conditioned measures with respect to a sequence of tilted measures. Provided certain natural convexity assumptions are satisfied, it follows that conditional limit theorems hold in great generality; this is the content of Theorem 6.1. We give several applications of the formalism, for both independent and weakly dependent random variables, in all cases extending previously known results. For the empirical measure, we provide a conditional limit theorem and give an alternative proof of the Large Deviation Principle. We also discuss the problem of equivalence of ensembles for lattice models in Statistical Mechanics.
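For orientation, the following is a minimal sketch of two standard notions referred to in the abstract, the information gain (relative entropy) and an exponentially tilted measure, written in conventional form; it is not reproduced from the preprint, and the precise definitions and normalizations used there may differ.

% Information gain (relative entropy) of a probability measure \mu
% with respect to a reference probability measure \nu:
\[
  H(\mu \mid \nu) \;=\;
  \begin{cases}
    \displaystyle\int \log\frac{d\mu}{d\nu}\, d\mu, & \text{if } \mu \ll \nu,\\[1ex]
    +\infty, & \text{otherwise.}
  \end{cases}
\]

% A tilted (exponentially twisted) measure built from \nu, a bounded
% measurable function f and a real parameter t:
\[
  d\nu_t(x) \;=\; \frac{e^{t f(x)}}{\displaystyle\int e^{t f}\, d\nu}\, d\nu(x).
\]

In this conventional setting, conditional limit theorems compare a sequence of measures conditioned on a regular sequence of sets with a suitably chosen sequence of tilted measures, and the comparison is made through the asymptotic behaviour of the information gain.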

Item Type: Article
Divisions: School of Theoretical Physics > Preprints
Date Deposited: 19 Jun 2018 14:11
Last Modified: 17 Dec 2022 10:08
URI: https://dair.dias.ie/id/eprint/711
