Kempner Colloquium Abstracts

Spring 2007





Title:  Toward an Analytic Model of Computer Performance (Industrial Strength Calculus)
Speaker: Tom Kerrigan
Affiliation:  Intel
Time:   3:00pm, Friday, February 16
Location:  ECCR 155

Abstract:  

Modern computing is a little like sausage making, a complex encounter between ingredients (software) and grinder (hardware). To handle the complexity, engineers often fall back on rules of thumb to guide their designs. This talk presents two instances in which mathematical analysis enhances engineering practice.

  • A nonlinear cache model that yields cache miss rate as a function of cache size and a simple workload characterization. The core analysis extends to multiprocessor systems in which every processor is stealing stuff out of every other processor's cache.
  • A model of the dependence of instruction execution rate on memory read requests. The key equation is of Wiener-Hopf type but lives on the space of probability distributions; the Contraction Mapping Principle delivers the solution.
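The talk does not specify the cache model's functional form, so the following is only a sketch under a common rule-of-thumb assumption: miss rate falls off as a power law in cache size, with the threshold size `c0` and the locality exponent `alpha` standing in for the "simple workload characterization" mentioned above.

```python
def miss_rate(cache_size, c0=1024.0, alpha=0.5):
    """Hypothetical power-law cache miss rate.

    `c0` (size below which the cache does not help) and `alpha` (a
    locality exponent) are assumed workload parameters, not values
    from the talk.
    """
    if cache_size <= c0:
        return 1.0
    return (cache_size / c0) ** -alpha

# With alpha = 0.5, doubling the cache cuts misses by a factor of sqrt(2):
ratio = miss_rate(64 * 1024) / miss_rate(128 * 1024)
print(round(ratio, 4))  # 1.4142
```

The power-law shape is what makes such models useful as engineering rules of thumb: diminishing returns from added cache appear directly in the exponent.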

These examples illustrate the general applicability of mathematical analysis in the computer industry. The talk is accessible to essentially everybody.


Biography:
Tom Kerrigan graduated from CU (Ph.D., Math, 1977) just past the crest of the Baby Boom Tsunami and then spent the next thirty years wandering about:
  • Battelle NW (the mathematics of really big windmills),
  • Drexel University (geophysical applications of stochastic diff. eq.),
  • AT&T Bell Labs (SDI, i.e., "Star Wars"),
  • Hewlett Packard (the PC revolution),
  • Intel (the tech-led stock market bubble and bust).
He presently lives in Oregon and spends his free time fantasizing about an empty nest.


Title:  A Brief Tour of Stein's Method
Speaker: Louis H.Y. Chen
Affiliation:  National University of Singapore
Time:   4:15pm, Monday, April 16
Location:  BESC 180

Abstract:  

One of the objectives of probability theory is calculating probabilities. But often exact calculation is either impossible or very tedious. In such situations, we resort to approximation. For sums of independent random variables (that is, in the case of convolutions of probability measures), the Fourier method has been an effective tool. However, without independence, the Fourier method is much more difficult to apply.
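The "Fourier method" here refers to characteristic functions, which turn the convolution of laws into a product of transforms; a one-line reminder of why independence matters:

```latex
% For independent X_1, ..., X_n, the characteristic function of the sum
% factors -- convolution of probability measures becomes multiplication.
\varphi_{S_n}(t) \;=\; \mathbb{E}\, e^{\mathrm{i} t S_n}
   \;=\; \prod_{j=1}^{n} \varphi_{X_j}(t),
\qquad S_n = X_1 + \cdots + X_n .
```

Without independence the product structure is lost, which is the obstacle Stein's method circumvents.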

Stein's method, which was introduced by Charles Stein in the context of normal approximation in 1972, provides an effective tool for probability approximations for dependent random variables. Since dependence is the rule rather than the exception in applications, Stein's method has become increasingly important. From the theoretical standpoint, Stein's method is elegant, direct and works in a magical fashion.
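At the heart of the normal case is Stein's identity: W has the standard normal distribution exactly when E[f'(W) - W f(W)] = 0 for all sufficiently smooth f, and the size of this quantity measures distance to normality. A minimal Monte Carlo illustration (not from the talk) with f(w) = sin(w), comparing a standard normal sample against a variance-one uniform sample:

```python
import math
import random

random.seed(0)
N = 200_000

def stein_gap(samples):
    """Monte Carlo estimate of E[f'(W) - W f(W)] with f(w) = sin(w),
    i.e. the sample mean of cos(w) - w*sin(w)."""
    return sum(math.cos(w) - w * math.sin(w) for w in samples) / len(samples)

normal_sample = [random.gauss(0.0, 1.0) for _ in range(N)]
# Uniform on [-sqrt(3), sqrt(3)] has mean 0 and variance 1, but is not normal.
uniform_sample = [random.uniform(-math.sqrt(3), math.sqrt(3)) for _ in range(N)]

print(abs(stein_gap(normal_sample)) < 0.02)   # True: gap vanishes for the normal
print(abs(stein_gap(uniform_sample)) > 0.1)   # True: gap is clearly nonzero
```

For this f the uniform gap can even be computed exactly (it equals cos(sqrt(3)), about -0.16), so the simulation has a closed-form check.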

Although Stein's method was originally invented for the normal approximation, its ideas are general enough for it to be extended and developed in many other contexts. These developments are of theoretical importance and of interest in their own right. In this lecture, I will explain the basic ideas of Stein's method, discuss a few important developments since its first appearance, and give a few examples of its application.


Biography:

Louis Chen is Tan Chin Tuan Centennial Professor of Mathematics and Statistics, and is the Director of the Institute for Mathematical Sciences at the National University of Singapore. He was President of the Bernoulli Society for Mathematical Statistics and Probability during the period 1997-1999, and was the President of the Institute of Mathematical Statistics during 2004-2005. He has held visiting appointments at Simon Fraser University, Stanford University, the Massachusetts Institute of Technology, and the Université de Provence in France.




