This book deals with the optimal control of solutions of fully observable Itô-type stochastic differential equations. The validity of the Bellman differential equation for payoff functions is proved, and rules for optimal control strategies are developed.
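For orientation, the Bellman (Hamilton-Jacobi-Bellman) equation for a controlled diffusion is commonly written in the following standard form; the notation here is a generic sketch and need not match the book's exact conventions:

```latex
% Value function v(x); controls \alpha range over a set A.
% a^{ij} = diffusion coefficients, b^i = drift, c = discount rate,
% f = running payoff (all possibly depending on \alpha and x).
\sup_{\alpha \in A} \Bigl[
    a^{ij}(\alpha, x)\, v_{x^i x^j}
  + b^{i}(\alpha, x)\, v_{x^i}
  - c(\alpha, x)\, v
  + f(\alpha, x)
\Bigr] = 0 .
```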
Topics include optimal stopping; one-dimensional controlled diffusion; Lp-estimates of the distributions of stochastic integrals; the existence theorem for stochastic equations; the Itô formula for functions; and the Bellman principle, equation, and normalized equation.