    Behavioral Foundations for Conditional Markov Models of Aggregate Data

    Miller, Douglas J.
    View/Open
    [PDF] BehavioralFoundationsConditionalMarkovModels.pdf (151.3 KB)
    Date
    2007
    Format
    Working Paper
    Abstract
    Conditional Markov chain models of observed aggregate share-type data have been used by economic researchers for several years, but the classes of models commonly used in practice are often criticized as being purely ad hoc because they are not derived from micro-behavioral foundations. The primary purpose of this paper is to show that the estimating equations commonly used to estimate these conditional Markov chain models may be derived from the assumed statistical properties of an agent-specific discrete decision process. Thus, any conditional Markov chain model estimated from these estimating equations may be compatible with some underlying agent-specific decision process. The secondary purpose of this paper is to use an information theoretic approach to derive a new class of conditional Markov chain models from this set of estimating equations. The proposed modeling framework is based on the behavioral foundations but does not require specific assumptions about the utility function or other components of the agent-specific discrete decision process. The asymptotic properties of the proposed estimators are developed to facilitate model selection procedures and classical tests of behavioral hypotheses.
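    To fix ideas, the basic object described in the abstract is a conditional Markov chain for aggregate shares, in which the share vector is propagated forward by a row-stochastic transition matrix whose entries depend on observed covariates. The sketch below is illustrative only: the multinomial-logit parameterization, variable names, and parameter values are assumptions chosen for exposition, not the paper's estimating equations or its information-theoretic estimator.

    import numpy as np

    def transition_matrix(x_t, beta):
        """Row-stochastic K x K transition matrix whose rows are
        multinomial-logit functions of a scalar covariate x_t
        (an assumed parameterization, for illustration only)."""
        scores = beta[..., 0] + beta[..., 1] * x_t        # shape (K, K)
        exps = np.exp(scores - scores.max(axis=1, keepdims=True))
        return exps / exps.sum(axis=1, keepdims=True)

    def simulate_shares(y0, x, beta):
        """Propagate aggregate shares forward via the basic conditional
        Markov relation y_{t+1}' = y_t' P(x_t)."""
        shares = [np.asarray(y0, dtype=float)]
        for x_t in x:
            shares.append(shares[-1] @ transition_matrix(x_t, beta))
        return np.vstack(shares)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        K, T = 3, 20                                   # 3 share categories, 20 periods
        beta = rng.normal(scale=0.5, size=(K, K, 2))   # hypothetical parameters
        x = rng.normal(size=T)                         # observed covariate path
        y0 = np.array([0.5, 0.3, 0.2])                 # initial aggregate shares
        y = simulate_shares(y0, x, beta)
        print(y[-1], y[-1].sum())                      # final shares still sum to 1

    In this sketch the transition probabilities are "conditional" because each row of P depends on the covariate path; estimation in practice would fit the parameters so that the implied share paths match the observed aggregate data, which is the step the paper grounds in agent-level behavioral foundations.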
    URI
    http://hdl.handle.net/10355/2558
    Part of
    Working papers (Department of Economics); WP 07-18
    Part of
    Economics publications
    Citation
    Department of Economics, 2007
    Rights
    Open Access.
    This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License.
    Collections
    • Economics publications (MU)
