
Handbook of Markov Decision Processes

Models and Applications

edited by
Eugene A. Feinberg
SUNY at Stony Brook, USA
Adam Shwartz
Technion Israel Institute of Technology, Haifa, Israel

Contents and Contributors 

1. Introduction; E.A. Feinberg, A. Shwartz.

Part I: Finite State and Action Models.

2. Finite State and Action MDPs; L. Kallenberg.
3. Bias Optimality; M.E. Lewis, M.L. Puterman.
4. Singular Perturbations of Markov Chains and Decision Processes; K.E. Avrachenkov, J. Filar, M. Haviv.

Part II: Infinite State Models.

5. Average Reward Optimization Theory for Denumerable State Spaces; L.I. Sennott.
6. Total Reward Criteria; E.A. Feinberg.
7. Mixed Criteria; E.A. Feinberg, A. Shwartz.
8. Blackwell Optimality; A. Hordijk, A.A. Yushkevich.
9. The Poisson Equation for Countable Markov Chains: Probabilistic Methods and Interpretations; A.M. Makowski, A. Shwartz.
10. Stability, Performance Evaluation, and Optimization; S.P. Meyn.
11. Convex Analytic Methods in Markov Decision Processes; V.S. Borkar.
12. The Linear Programming Approach; O. Hernández-Lerma, J.B. Lasserre.
13. Invariant Gambling Problems and Markov Decision Processes; L.E. Dubins, A.P. Maitra, W.D. Sudderth.

Part III: Applications.

14. Neuro-Dynamic Programming: Overview and Recent Trends; B. Van Roy.
15. Markov Decision Processes in Finance and Dynamic Options; M. Schäl.
16. Applications of Markov Decision Processes in Communication Networks; E. Altman.
17. Water Reservoir Applications of Markov Decision Processes; B.F. Lamond, A. Boukhtouta.
Index.
