Concept

Stochastic dynamic programming

Summary
Originally introduced by Richard E. Bellman, stochastic dynamic programming is a technique for modelling and solving problems of decision making under uncertainty. Closely related to stochastic programming and dynamic programming, stochastic dynamic programming represents the problem under scrutiny in the form of a Bellman equation. The aim is to compute a policy prescribing how to act optimally in the face of uncertainty.

A motivating example: Gambling game

A gambler has $2; she is allowed to play a game of chance 4 times, and her goal is to maximize her probability of ending up with at least $6. If the gambler bets $b on a play of the game, then with probability 0.4 she wins the game, recoups the initial bet, and increases her capital position by $b; with probability 0.6, she loses the bet amount $b. All plays are pairwise independent. On any play of the game, the gambler may not bet more money than she has available at the beginning of that play.
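The gambling game can be solved by backward induction on the Bellman equation: let f(t, s) be the maximum probability of reaching the target with t plays remaining and capital s; then f(t, s) = max over bets b of [0.4 f(t-1, s+b) + 0.6 f(t-1, s-b)]. The sketch below is an illustrative implementation of this recursion (the names success_prob, TARGET, PLAYS, and P_WIN are our own, not from any particular source):

```python
from functools import lru_cache

TARGET = 6   # goal: end with at least $6
PLAYS = 4    # number of plays allowed
P_WIN = 0.4  # probability of winning any single play

@lru_cache(maxsize=None)
def success_prob(t, s):
    """Max probability of reaching TARGET with t plays left and capital s."""
    if s >= TARGET:
        return 1.0   # goal already met
    if t == 0:
        return 0.0   # no plays left and goal not met
    # Bellman equation: choose the integer bet b in 0..s that maximizes
    # the expected success probability over the win/lose outcomes.
    return max(P_WIN * success_prob(t - 1, s + b)
               + (1 - P_WIN) * success_prob(t - 1, s - b)
               for b in range(s + 1))

print(success_prob(PLAYS, 2))  # optimal success probability, roughly 0.1984
```

Evaluating the recursion from t = 0 upward (memoized here via lru_cache) yields both the optimal value and, by recording the maximizing b at each state, the optimal betting policy.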