Internet Coding and Control

Monday, August 6, 2001 - 1:30pm - 2:30pm
Keller 3-180
John Doyle (California Institute of Technology)
Two of the great abstractions of the 20th century were the separation, in both theory and applications, of 1) controls, communications, and computing from each other, and 2) the systems level from its underlying physical substrate. This horizontal and vertical isolation of systems held both in practical applications and in academic research. It facilitated massively parallel, wildly successful, explosive growth in both mathematical theory and technology, but left many fundamental problems unresolved and a poor foundation for future systems of systems in which these elements must be integrated. For example, congestion control and routing in the current TCP/IP protocol suite are based on classical control methods, extended in purely heuristic ways to the distributed internetworking setting. Improving the performance of existing networks, as well as of future networks of networks employing ubiquitous controls and computing, will stretch these abstractions past their breaking point. While a unified theory of systems has been an appealing intellectual challenge for decades, it has only recently become both an urgent technological challenge and a tangibly reachable research objective.
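To make the "classical control methods extended in heuristic ways" concrete, the core of TCP congestion control is the AIMD (additive-increase, multiplicative-decrease) feedback rule: grow the sending window linearly until loss signals congestion, then cut it multiplicatively. The sketch below is a toy model of that rule; the parameters, the fixed capacity, and the loss model are illustrative assumptions, not details of any real TCP implementation or of the talk.

```python
# Toy sketch of AIMD, the classical feedback heuristic behind TCP
# congestion control. All parameters are illustrative.

def aimd_step(cwnd, loss, alpha=1.0, beta=0.5):
    """One round-trip update of the congestion window (in packets)."""
    if loss:
        return max(1.0, cwnd * beta)  # multiplicative decrease on loss
    return cwnd + alpha               # additive increase otherwise

def simulate(capacity=20.0, rtts=50):
    """Assume loss occurs whenever the window exceeds a fixed capacity;
    the window then traces the familiar AIMD sawtooth."""
    cwnd, trace = 1.0, []
    for _ in range(rtts):
        cwnd = aimd_step(cwnd, loss=(cwnd > capacity))
        trace.append(cwnd)
    return trace

trace = simulate()
```

The sawtooth this produces (ramp to just above capacity, halve, ramp again) is exactly the kind of single-link, single-flow heuristic whose extension to a large distributed internetwork the abstract argues is poorly founded.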

The unifying theme in this talk is the new concept of Highly Optimized Tolerance (HOT), which arises when deliberate robust design aims for a specific level of tolerance to uncertainty. The resulting robust, yet fragile features of HOT systems are high performance and high throughput, but potentially high sensitivity to design flaws and unanticipated or rare events. HOT provides a framework in which the previously fragmented mathematical tools of robust control, communications, computation, dynamical systems, and statistical physics are beginning to be unified and brought to bear on a variety of applications. For example, congestion due to bursty Internet traffic can be traced to HOT design of web layouts and protocols, a generalization of source coding that suggests novel protocol designs. This treatment of Internet source coding is complemented by new robust and scalable congestion control strategies that provide high QoS (Quality of Service) for bursty traffic with minimal changes at the IP layer and below. We hope this will lead not only to better understanding and control of Internet traffic, but also facilitate distributed control of dynamical systems using networks. Similar insights have been obtained in domains as diverse as shear flow turbulence, biological signal transduction and gene regulation, forest ecology, cascading failures in power grids, financial market volatility, and the ubiquity of power laws in technological and natural systems.
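One way to see how "generalized source coding" can yield power laws: in Shannon coding, minimizing expected length Σ p_i·l_i under the Kraft constraint Σ 2^(-l_i) ≤ 1 gives l_i = -log p_i; swapping in a different resource constraint changes the optimal allocation. The sketch below uses the illustrative constraint Σ 1/l_i ≤ R (a toy stand-in for the resource models in the HOT literature, not the construction from the talk), for which the Lagrange condition p_i - λ/l_i² = 0 gives l_i ∝ p_i^(-1/2), i.e. a power-law relation between event probability and allocated size.

```python
import math

def hot_allocation(p, resource=1.0):
    """Minimize expected cost sum(p_i * l_i) subject to the toy
    resource constraint sum(1/l_i) <= resource.

    Stationarity gives p_i = lam / l_i**2, so l_i = sqrt(lam) / sqrt(p_i);
    the constraint (tight at the optimum) fixes sqrt(lam)."""
    sqrt_lam = sum(math.sqrt(pi) for pi in p) / resource
    return [sqrt_lam / math.sqrt(pi) for pi in p]
```

Note the contrast with Shannon's l_i = -log p_i: under this resource model the optimal sizes follow a power of the probabilities, which is the flavor of argument HOT uses to connect optimized designs (e.g. web file layouts) to heavy-tailed, bursty traffic.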