Entropy minimization, convergence, and Gibbs ensembles (local and global)

Format

Thesis

Abstract

We approach the subject of statistical mechanics from two different perspectives. In Part I we adopt the approach of Lanford and Martin-Löf. We examine the minimization of information entropy for measures on the phase space of bounded domains, subject to constraints that are averages of grand canonical distributions. We describe the set of all such constraints and show that it equals the set of averages of all probability measures absolutely continuous with respect to the standard measure on the phase space. We also investigate how the set of constraints relates to the domain of the microcanonical thermodynamic limit entropy. We then show that, for fixed constraints, the parameters of the corresponding grand canonical distribution converge, as the volume increases, to the corresponding parameters (derivatives, when they exist) of the thermodynamic limit entropy. In Part II, we use the Banach manifold structure on the space of finite positive measures to show that the critical points of the Gibbs entropy are grand canonical equilibria when the constraints are scalar, and local equilibria when the constraints are integrable functions. This provides a rigorous justification of the derivation of the Gibbs measures that often appears in the literature.
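The variational computation behind Part II can be sketched by the standard Lagrange-multiplier argument (the notation below is illustrative and is not taken from the thesis): minimize the information entropy subject to scalar constraints on the mean energy and particle number.

```latex
% Minimize the information entropy of a density \rho with respect to
% a reference measure m on phase space:
%   S[\rho] = \int \rho \log\rho \, dm
% subject to the scalar constraints
%   \int H \rho \, dm = E, \qquad \int N \rho \, dm = \nu,
%   \qquad \int \rho \, dm = 1.
% Introducing multipliers \beta and \beta\mu for the first two
% constraints, a critical point has the grand canonical form
%   \rho \;=\; \frac{1}{\Xi(\beta,\mu)} \, e^{-\beta (H - \mu N)},
%   \qquad
%   \Xi(\beta,\mu) \;=\; \int e^{-\beta (H - \mu N)} \, dm,
% where \Xi is the grand canonical partition function.
```

This is only the formal first-order condition; the thesis's contribution in Part II is to make this rigorous on the Banach manifold of finite positive measures, including the case of integrable-function constraints, which yields local rather than global equilibria.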

Degree

Ph. D.
