Provably Correct and Optimal Control Strategies for Dynamical Systems - Bridging the Gap between Computer Science and Control Design

Xu Chu (Dennis) Ding (Boston University)

When: September 26, 2pm

Where: room G2.91b

Abstract

Recent studies have shown promising ways to analyze and synthesize controllers so that the resulting control systems satisfy specifications given in rich, human-like languages. Such methods are called formal, or provably correct. This emerging field brings together approaches and tools from different areas, such as formal verification in computer science and controller synthesis for dynamical systems.

In this talk I will present our recent work on formal synthesis of control strategies for dynamical systems that are both verifiable and optimal. The main goal of my work is to specify the desired behavior of a control system in a high-level, expressive language, and to provide the means to automatically convert such specifications into low-level control and communication primitives. We use specification languages that are natural (close to human language), expressive, and computable. I will discuss some of our recent results in this effort for deterministic systems, probabilistic systems (Markov Decision Processes), and distributed (multi-agent) systems. Moreover, I will explore the problem of introducing optimality into this framework. One of the first questions is how to characterize optimality for a system satisfying temporal logic specifications, given that its trajectory is infinite. Another is how to balance, or trade off, optimality against correctness.
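To make these two questions concrete, here is a small illustrative sketch (an assumed example, not necessarily the exact formulation used in the talk). A mission could be written as a linear temporal logic (LTL) formula asking a robot to gather data infinitely often, upload after every gathering, and always stay out of unsafe regions:

  \varphi \;=\; \mathbf{G}\,\mathbf{F}\,\pi_{\mathrm{gather}} \;\wedge\; \mathbf{G}\bigl(\pi_{\mathrm{gather}} \rightarrow \mathbf{F}\,\pi_{\mathrm{upload}}\bigr) \;\wedge\; \mathbf{G}\,\neg\pi_{\mathrm{unsafe}}

Any trajectory satisfying such a formula is infinite, so a total accumulated cost diverges; one standard way to keep optimality well defined is a long-run average criterion, e.g.

  J(x_1 x_2 \dots) \;=\; \limsup_{n\to\infty} \frac{1}{n} \sum_{k=1}^{n} c(x_k, x_{k+1}),

where c(\cdot,\cdot) is a per-step cost along the trajectory x_1 x_2 \dots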