Contents:
Systems dynamics and differential equations
Transfer functions and block diagrams
State-space formulation
Transient and steady-state response analysis
Stability
Controllability and observability
Multivariable feedback and pole location
Introduction to optimal control
Variational calculus
Optimal control with unbounded continuous controls
Bang-bang control
Applications of optimal control
Dynamic programming
Appendices: Partial fractions; Notes on determinants and matrices; Solutions to problems.