optimal control strategies

Friday, May 11, 2018 - 9:00am - 9:50am
Tyrone Duncan (University of Kansas)
Stochastic differential games have been used as models for a wide variety of physical systems. These games are a natural evolution of certain stochastic control problems. Two well-known methods for finding optimal control strategies in a stochastic differential game are solving the Hamilton-Jacobi-Isaacs equations, which are nonlinear partial differential equations, and solving backward stochastic differential equations. Both approaches are often difficult to carry out explicitly, so explicit optimal control strategies are rarely obtained.
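
For orientation, a standard form of the Hamilton-Jacobi-Isaacs equation for a two-player zero-sum stochastic differential game is sketched below; this is illustrative notation rather than the specific models of the talk, with generic dynamics f, diffusion sigma, running cost \ell, and terminal cost g.

% Sketch of a Hamilton-Jacobi-Isaacs equation (placeholder notation).
% State dynamics: dX_t = f(X_t, u_t, v_t) dt + \sigma(X_t) dW_t,
% with control u for the minimizing player and v for the maximizing player.
\begin{align*}
  &\partial_t V(t,x)
   + \inf_{u}\,\sup_{v}\Big\{ f(x,u,v)\cdot \nabla_x V(t,x)
   + \tfrac{1}{2}\operatorname{tr}\!\big(\sigma(x)\sigma(x)^{\top}\nabla_x^2 V(t,x)\big)
   + \ell(x,u,v) \Big\} = 0, \\
  &V(T,x) = g(x),
\end{align*}

Here V denotes the upper value function; when the Isaacs condition holds, the inf and sup may be exchanged and the game has a value. Nonlinearity enters through the inf-sup over the controls, which is why explicit solutions are available only in special cases such as linear-quadratic games.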