Behavioral Foundations for Conditional Markov Models of Aggregate Data
Abstract
Conditional Markov chain models of observed aggregate share-type data have been used by economic researchers for several years, but the classes of models commonly used in practice are often criticized as purely ad hoc because they are not derived from micro-behavioral foundations. The primary purpose of this paper is to show that the estimating equations commonly used to fit these conditional Markov chain models may be derived from the assumed statistical properties of an agent-specific discrete decision process. Thus, any conditional Markov chain model estimated from these equations may be compatible with some underlying agent-specific decision process. The secondary purpose of this paper is to use an information-theoretic approach to derive a new class of conditional Markov chain models from this set of estimating equations. The proposed modeling framework rests on these behavioral foundations but does not require specific assumptions about the utility function or other components of the agent-specific discrete decision process. The asymptotic properties of the proposed estimators are developed to facilitate model selection procedures and classical tests of behavioral hypotheses.
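The estimation problem described above — recovering a conditional Markov transition matrix from aggregate share data using an information-theoretic criterion — can be illustrated with a small sketch. This is not the paper's estimator: the transition matrix `P_true`, the trajectory setup, and the entropy-penalty weight `alpha` are all invented for illustration, and the objective (squared prediction error plus a small negative-entropy penalty, minimized subject to row-stochastic constraints) is a generic stand-in for the information-theoretic approach the abstract mentions.

```python
# Illustrative sketch only: fit a row-stochastic transition matrix P to
# aggregate share trajectories y_{t+1} = y_t P, with a small maximum-entropy
# penalty pulling P toward uniform. All data and parameters are synthetic.
import numpy as np
from scipy.optimize import minimize

K = 3  # number of discrete states (hypothetical)

# A hypothetical "true" transition matrix used to generate synthetic shares.
P_true = np.array([[0.80, 0.15, 0.05],
                   [0.10, 0.80, 0.10],
                   [0.05, 0.15, 0.80]])

# Simulate short share trajectories from each vertex of the simplex so the
# regressors are well conditioned (a single long trajectory would converge
# to the stationary distribution and carry little information).
rows_prev, rows_next = [], []
for s in np.eye(K):
    y = s.copy()
    for _ in range(10):
        y_next = y @ P_true
        rows_prev.append(y)
        rows_next.append(y_next)
        y = y_next
Yp, Yn = np.array(rows_prev), np.array(rows_next)

def objective(p, alpha=1e-3):
    """Squared one-step prediction error plus a negative-entropy penalty."""
    P = p.reshape(K, K)
    fit = np.sum((Yn - Yp @ P) ** 2)
    neg_entropy = np.sum(P * np.log(P + 1e-12))
    return fit + alpha * neg_entropy

# Each row of P must sum to one; bounds keep every entry a valid probability.
cons = [{"type": "eq", "fun": lambda p, i=i: p.reshape(K, K)[i].sum() - 1.0}
        for i in range(K)]
bounds = [(1e-9, 1.0)] * (K * K)

res = minimize(objective, np.full(K * K, 1.0 / K),
               method="SLSQP", bounds=bounds, constraints=cons)
P_hat = res.x.reshape(K, K)
```

With noiseless synthetic data and a small `alpha`, `P_hat` lands close to `P_true`; raising `alpha` shrinks the estimate toward the uniform (maximum-entropy) matrix, which is the usual bias-variance trade-off in information-theoretic estimators of this kind.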
Citation
Department of Economics, 2007
Rights
Open Access.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License.