This book offers a systematic introduction to optimal stochastic control theory via the dynamic programming principle, a powerful tool for analyzing control problems. First we consider completely observable control problems with finite horizons. Using a time discretization, we construct a nonlinear semigroup related to the dynamic programming principle (DPP), whose generator provide…
The purpose of this book is to provide an introduction to stochastic control theory via the method of dynamic programming. The dynamic programming principle, originated by R. Bellman in the 1950s, is known as the two-stage optimization procedure. When we control the behavior of a stochastic dynamical system in order to optimize some payoff or cost function, which depends on the control inputs…
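As a toy illustration of the two-stage optimization idea behind the DPP (a minimal sketch, not taken from the book: the two-state chain, the action names, and all numerical values are illustrative assumptions), one can compute the value function of a finite-horizon discrete problem by backward induction, splitting each step into an immediate reward plus the optimally controlled remainder:

```python
def bellman_backward(T, states, actions, transition, reward):
    """Backward induction for a finite-horizon controlled Markov chain:
    V[t](x) = max_a { reward(x, a) + sum_y p(y | x, a) * V[t+1](y) }.
    Returns the value functions V[0..T] and an optimal Markov policy."""
    V = [dict() for _ in range(T + 1)]
    for x in states:
        V[T][x] = 0.0  # terminal payoff taken to be zero in this toy example
    policy = [dict() for _ in range(T)]
    for t in range(T - 1, -1, -1):  # two-stage split at each time t
        for x in states:
            best_a, best_v = None, float("-inf")
            for a in actions:
                # stage 1: immediate reward; stage 2: expected optimal remainder
                v = reward(x, a) + sum(
                    p * V[t + 1][y] for y, p in transition(x, a).items()
                )
                if v > best_v:
                    best_a, best_v = a, v
            V[t][x], policy[t][x] = best_v, best_a
    return V, policy

# Hypothetical two-state chain: "stay" is safe, "move" is risky but rewarding.
states = ["low", "high"]
actions = ["stay", "move"]

def transition(x, a):
    if a == "stay":
        return {x: 1.0}
    return {"high": 0.7, "low": 0.3}  # "move" succeeds with probability 0.7

def reward(x, a):
    base = 1.0 if x == "high" else 0.0  # being in "high" pays 1 per step
    return base - (0.2 if a == "move" else 0.0)  # moving costs 0.2

V, policy = bellman_backward(3, states, actions, transition, reward)
print(V[0]["low"], policy[0]["low"])
```

The recursion makes the two-stage structure explicit: the value at time t is obtained by optimizing the immediate reward together with the already-computed optimal value of the remaining problem, which is exactly the DPP in its simplest discrete form.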