This talk addresses the problem of detecting and tracking large numbers of non-cooperative targets in a cluttered background. The usual approach, which is computationally intractable in general, would be to attempt to detect and track each and every target or potential target. The proposed approach uses the opposite strategy: it attempts to track only what is knowable (initially, geometric shape and target density) and only later attempts to resolve individual targets out of the "multitarget background" as (and if) more data become available. From a mathematical point of view the approach is novel because the multitarget scenario is modeled as a random measure (specifically, a multidimensional random point process), and the optimal (but intractable) recursive Bayes filter is approximated by propagating the first-moment measure (more precisely, its density function) instead of the full multitarget posterior density function.
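The first-moment idea can be made concrete with a toy sketch. The following is not the speaker's algorithm, only a minimal grid-based update of an intensity (first-moment) density for a single-sensor scene with Poisson clutter; all parameters (detection probability, clutter rate, noise level, measurement values) are invented for illustration:

```python
import numpy as np

# Grid-based sketch of updating a first-moment (intensity) density D(x):
# the expected number of targets in any region is the integral of D there.
x = np.linspace(0.0, 10.0, 501)
dx = x[1] - x[0]
D = np.full_like(x, 0.2)            # prior intensity: ~2 expected targets overall

p_d = 0.9                           # probability of detecting a target
clutter = 0.1                       # uniform clutter intensity kappa(z)
sigma = 0.3                         # measurement noise standard deviation

def gauss(z, x, s):
    """Measurement likelihood g(z | x), Gaussian about the target position."""
    return np.exp(-0.5 * ((z - x) / s) ** 2) / (s * np.sqrt(2 * np.pi))

measurements = [3.0, 7.1]
D_new = (1.0 - p_d) * D             # missed-detection term
for z in measurements:
    lik = p_d * gauss(z, x, sigma) * D
    # each measurement adds mass where it is likely to have originated,
    # normalized against clutter plus the total predicted likelihood
    D_new += lik / (clutter + lik.sum() * dx)
print(D_new.sum() * dx)             # expected number of targets after the update
```

The intensity remains a single density over the state space no matter how many targets are present, which is the source of the computational savings over propagating a full multitarget posterior.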
Statistics in New Product Development
Taking a new product from idea to reality requires many difficult steps. Among them are the identification and optimization of the concept and product, and the startup and refinement of the associated manufacturing process. "Statistics" - including statistical thinking, experimental design, statistical methods, and statistical computing - plays a vital role in new product development. Using a series of examples, this talk will highlight the way in which statistics (and statisticians) can contribute to the development of successful new consumer products.
Wave Propagation in an Optical Fiber
Recent trends in optical communications show an increase in device integration along with a decrease in device size. Photonic crystals (PCs) may be the platform for future miniaturized optical devices because they can control light on length scales of the order of the wavelength.
The theoretical tools needed to study PCs will be presented. Results for both two-dimensional slab PCs and three-dimensional PCs will be shown, and the advantages of each case will be discussed.
Ann E. DeWitt (3M Research and Development) firstname.lastname@example.org
Advances in tools to probe biological phenomena, such as combinatorial chemistry, high-throughput screening, genomics, and proteomics, have, in part, resulted in a rapid rise in the rate at which information is collected. The corresponding increase in the volume of information supplies a rich source for understanding how biological systems operate, but appropriate methods for placing each new piece of information into a larger context must be developed. Mathematics has certainly been applied to the investigation of biological systems in the past, and further opportunities arise from the need to organize and understand vast amounts of information, and to capture system behavior systematically and quantitatively for predictive engineering.
This presentation will focus on how mathematics is used as a data analysis and predictive engineering tool to understand biological processes (i.e. life!), including a general introduction to the emerging discipline of "systems biology." Doctoral research conducted at Massachusetts Institute of Technology will be used for illustration along with examples from current research conducted in 3M Pharmaceuticals.
Senior Research Engineer
Software, Electric & Mechanical Systems Technology Center/Pharmaceuticals
3M Research and Development
Chemical Engineering, 2001
Massachusetts Institute of Technology, Cambridge, MA
Thesis advisor: Douglas A. Lauffenburger
Chemical Engineering, 1996
University of Illinois, Urbana-Champaign, IL
DeWitt, Ann E., T. Iida, H. Lam, V. Hill, H.S. Wiley, D.A. Lauffenburger. Affinity Regulates Spatial Range of EGF Receptor Autocrine Ligand Binding. Developmental Biology, 2002, v250; pp. 305-316.
DeWitt, Ann E., H. S. Wiley, D. A. Lauffenburger. Quantitative Analysis of the EGF Receptor Autocrine System Reveals Cryptic Regulation of Cell Response by Ligand Capture. Journal of Cell Science, June 2001, v114; pp. 2301-13.
Stephen Mildenhall (Kemper Insurance) Stephen.Mildenhall@kemperinsurance.com
The Evolution of Property-Casualty Insurance Liabilities
Property-Casualty insurance liabilities, related to claims from automobile accidents, house fires, liability claims, etc., are characterized by reporting and settlement lags which can be several years long. As a result, the liabilities and loss payments from a given set of insurance policies evolve over time, with payments gradually increasing to their ultimate settlement values. Actuaries use aggregate loss distributions (random sums) to model ultimate settlement values, but there is no established way of decomposing ultimate losses into losses paid each year. This talk will explain how the negative multinomial distribution can be used to decompose ultimate losses into losses by year, and show that the resulting decomposition has empirically desirable properties. Next, we will discuss a Markov-chain model of claim complexity, which can be combined with the decomposition result to produce a model with increasing average claim severity over time, a phenomenon observed in most lines of insurance. The Markov-chain model is an interesting departure from traditional actuarial analyses because it uses detailed cross-sectional data rather than long-term summary data.
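To illustrate the decomposition idea (this is a hedged sketch, not the speaker's actual model), the code below samples negative-multinomial year counts via the standard gamma-Poisson mixture construction; the payment-pattern weights, the shape parameter r, and the flat average severity are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def negative_multinomial(r, thetas, size):
    """Sample negative-multinomial counts by year via a gamma-Poisson
    mixture: conditional on G ~ Gamma(r, 1), the year counts are
    independent Poisson(G * theta_i).  Illustrative parameterization."""
    g = rng.gamma(shape=r, scale=1.0, size=size)
    return rng.poisson(g[:, None] * np.asarray(thetas)[None, :])

# hypothetical payment pattern over 4 development years (sums to 1)
thetas = [0.5, 0.3, 0.15, 0.05]
counts = negative_multinomial(r=10.0, thetas=thetas, size=100_000)

# paid losses by development year: counts times a flat average severity
sev_mean = 5_000.0
paid = counts * sev_mean
ultimate = paid.sum(axis=1)          # total settlement value per simulation
print(paid.mean(axis=0), ultimate.mean())
```

One appealing property of this construction is that the shared gamma mixing variable makes year counts positively correlated, so a year that develops badly signals a worse ultimate outcome, consistent with the evolution of liabilities described above.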
Lawrence C. Cowsar (Bell Laboratories, Lucent Technologies)
Raman Amplified Optical Transport Systems
Optical transport system capacity has outpaced Moore's law over the past two decades. The pace continues unabated as a new generation of commercial products based on Raman amplification is introduced. This talk will focus on some of the simulation challenges that arise in the design, control and testing of this next generation of optical transport.
Lili Ju (IMA Industrial Postdoc) email@example.com
Cortical Surface Flattening Using Discrete Conformal Mapping with Minimal Metric Distortion
Although flattening a cortical surface necessarily introduces metric distortion due to the non-constant Gaussian curvature of the surface, the Riemann Mapping Theorem states that continuously differentiable surfaces can be mapped without angular distortion. Several techniques have been proposed for flattening polygonal representations of surfaces while substantially minimizing metric distortion, and methods for conformal flattening of polygonal surfaces have also been proposed. We describe an efficient method for generating conformal flat maps of triangulated surfaces while minimizing metric distortion within the class of conformal maps. Our method, which controls both angular and metric distortion, involves the solution of a linear system and a small scale nonlinear minimization. It can be applied to user-defined "patches" or to an entire cortical surface.
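The tension noted above between angle preservation and metric distortion is visible in a classical example: stereographic projection of the sphere is conformal but stretches lengths locally by a factor of 1/(1 - z). The sketch below (illustrative only, unrelated to the cortical-flattening algorithm itself) checks numerically that the stretch is the same in two orthogonal tangent directions, the hallmark of conformality:

```python
import numpy as np

def stereographic(p):
    """Map a point on the unit sphere (minus the north pole) to the plane."""
    return p[:2] / (1.0 - p[2])

def tangent_stretch(p, eps=1e-6):
    """Finite-difference length stretch of the flattening along two
    orthonormal tangent directions at p; equal stretches mean the map
    preserves angles at p."""
    a = np.array([1.0, 0.0, 0.0])
    u = a - np.dot(a, p) * p          # project a into the tangent plane
    u /= np.linalg.norm(u)
    v = np.cross(p, u)                # second orthonormal tangent direction
    s_u = np.linalg.norm(stereographic(p + eps * u) - stereographic(p)) / eps
    s_v = np.linalg.norm(stereographic(p + eps * v) - stereographic(p)) / eps
    return s_u, s_v

p = np.array([0.6, 0.0, 0.8])         # point on the unit sphere
su, sv = tangent_stretch(p)
print(su, sv, 1.0 / (1.0 - p[2]))     # both stretches equal the conformal factor
```

Here both stretches come out equal to 1/(1 - z) = 5: angles are preserved exactly, yet lengths near this point are magnified fivefold, which is exactly the kind of metric distortion the method described above seeks to minimize within the conformal class.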
Wade S. Martinson (Process Solutions Technology Development Center, Cargill Inc.) firstname.lastname@example.org
The Differentiation Index and Industrial Dynamic Simulation
The differentiation index has become an important tool for understanding systems of coupled differential and algebraic equations, referred to as DAEs in the process simulation community. More recently, this concept has been extended to coupled partial differential and algebraic equations. In this talk, a simulation problem from the chemical processing industry will be used to illustrate how high index model formulations lead to practical problems with dynamic simulation, and how index analysis can lead to model reformulations that permit successful simulation.
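A standard textbook example of a high-index formulation (not taken from the talk) is the planar pendulum written in Cartesian coordinates, an index-3 DAE. The sketch below uses symbolic differentiation to show the mechanism behind index analysis: differentiating the position constraint and substituting the dynamics exposes a hidden equation that determines the algebraic variable.

```python
import sympy as sp

t, g, L = sp.symbols("t g L", positive=True)
x, y, lam = (sp.Function(n)(t) for n in ("x", "y", "lam"))

# Planar pendulum in Cartesian coordinates (a classic index-3 DAE):
#   x'' = -lam*x,   y'' = -lam*y - g,   0 = x^2 + y^2 - L^2
constraint = x**2 + y**2 - L**2

# Differentiate the algebraic constraint twice and substitute the
# dynamic equations; the result is an equation that can be solved
# algebraically for lam, revealing the hidden constraint structure.
d2 = sp.diff(constraint, t, 2)
d2 = d2.subs({sp.Derivative(x, (t, 2)): -lam * x,
              sp.Derivative(y, (t, 2)): -lam * y - g})
lam_sol = sp.solve(sp.Eq(d2, 0), lam)[0]
print(sp.simplify(lam_sol))   # (x'^2 + y'^2 - g*y) / (x^2 + y^2)
```

One further differentiation would yield an ODE for lam, which is why this formulation has differentiation index 3; reformulating with the differentiated constraints is the kind of index reduction that permits successful simulation.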
Diagnostic Ultrasound: Technology and Applications
Ultrasound has developed over the past 50 years into a major diagnostic imaging modality, complementing CT, MRI, and nuclear imaging. Major applications of ultrasound today include cardiovascular, abdominal, musculoskeletal, small-parts, and OB/Gyn imaging. Increased clinical usage of ultrasound has been driven by technological advances that exploit the following advantages over other modalities: real-time operation (especially important for the heart and blood flow), safety due to non-ionizing radiation, portability, and low cost. Basic ultrasound modes include B-mode, which images the acoustic reflectivity of tissue structures, and Doppler, which measures blood velocity. Recent advances include harmonic imaging, which improves image quality by exploiting the nonlinear behavior of high-amplitude ultrasound propagation in tissue or micro-bubble contrast agents, and code technology, which circumvents traditional resolution/penetration tradeoffs. Future directions for ultrasound lie at the intersection of clinical needs (image quality, new applications, and increased productivity) and major technological trends (miniaturization, software), and include miniaturized systems and probe components, improved image quality, new imaging parameters, and 4D imaging.
Nicholas Bennett (Schlumberger Doll Research) email@example.com
Posterior Uncertainty in Decimated Wavelet Model Parameterizations
Solving a geophysical inverse problem means determining the parameters of an earth model given a set of measurements. In solving many practical inverse problems, accounting for the uncertainty of the solution is very important to aid in decision-making. In this work, we address the problem of determining the posterior uncertainty of the solution for models that arise from decimated wavelet bases using a simple 1-dimensional seismic travel time inversion problem.
Our inversion methodology is to pick a model decimation, prepare a prior mean and covariance matrix of the wavelet coefficients, compute a posterior mean and covariance, and then to sample from this posterior distribution. We also sample different choices of model decimation in proportion to their posterior probability. These samples span the uncertainty of the inverse problem solution, accounting for both the uncertainty in the choice of model decimation and of wavelet coefficients. We note that a re-normalization of the decimated prior covariance matrix of the wavelet coefficients is required to properly account for the amount of variance in the prior distribution. Further, we present a fast algorithm for computing this normalized decimated prior covariance matrix.
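For a fixed model decimation in the linear-Gaussian case, the posterior mean and covariance referred to above follow from standard Gaussian conditioning. The sketch below (with an invented forward operator G, prior, and noise covariance; the decimation-selection and renormalization steps are not shown) computes them and draws posterior samples:

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear-Gaussian inversion sketch: d = G m + noise, prior m ~ N(m0, C).
# G, m0, C, and R are illustrative stand-ins, not quantities from the talk.
n, k = 8, 20
G = rng.standard_normal((k, n))      # forward operator
m0 = np.zeros(n)                     # prior mean of wavelet coefficients
C = np.eye(n)                        # prior covariance
R = 0.1 * np.eye(k)                  # data-noise covariance
d = G @ rng.standard_normal(n) + 0.3 * rng.standard_normal(k)

# Standard Gaussian conditioning formulas
K = C @ G.T @ np.linalg.inv(G @ C @ G.T + R)
m_post = m0 + K @ (d - G @ m0)       # posterior mean
C_post = C - K @ G @ C               # posterior covariance

# Draw samples spanning the solution uncertainty for this decimation
Lc = np.linalg.cholesky(C_post + 1e-12 * np.eye(n))
samples = m_post[:, None] + Lc @ rng.standard_normal((n, 1000))
print(m_post.shape, samples.shape)
```

Mixing such samples across decimations, weighted by posterior probability, is what lets the full sample set account for model-selection uncertainty as well as coefficient uncertainty.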
Daniel R. Baker (General Motors R&D Center, 30500 Mound Rd., MC 480-102-000, Warren, MI 48090-9055) firstname.lastname@example.org
Impedance as a Diagnostic Tool for Studying Fuel Cells
We will start by showing some CFD simulations of current distribution on a fuel cell. The current distribution behaves differently under different operating conditions and we will try to explain this behavior. This will lead us to consider impedance spectroscopy as a tool to investigate some of the critical effects that impact current distribution. A short explanation of impedance methods will be given along with a discussion of how to interpret impedance data in the context of current distribution. Special emphasis will be given to the high frequency resistance (HFR) as a tool for understanding membrane humidification. Other impedance applications include assessing proton resistance in the porous cathode, kinetic resistance of the cathode electrode, and the relative contribution of gas transport resistance to voltage losses.
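As a hedged aside on why HFR isolates ohmic (e.g., membrane) resistance: in a simplified Randles-type equivalent circuit (illustrative parameter values, not fuel-cell data), the charge-transfer branch is shorted out by the double-layer capacitance at high frequency, so the impedance tends to the series resistance alone:

```python
import numpy as np

# Impedance of a simplified Randles-type circuit:
#   Z(w) = R_s + R_ct / (1 + j*w*R_ct*C_dl)
# R_s  : series (ohmic) resistance, e.g. membrane
# R_ct : charge-transfer resistance
# C_dl : double-layer capacitance
R_s, R_ct, C_dl = 0.05, 0.30, 0.8     # illustrative values

w = np.logspace(-2, 5, 200)           # angular frequency sweep, rad/s
Z = R_s + R_ct / (1 + 1j * w * R_ct * C_dl)

hfr = Z[-1].real                      # high-frequency limit -> R_s
lfr = Z[0].real                       # low-frequency limit -> R_s + R_ct
print(hfr, lfr)
```

Plotting -Z.imag against Z.real over the sweep gives the familiar Nyquist semicircle, whose high-frequency intercept is the HFR used to track membrane humidification.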
2003, 2:30 pm Tuesday, Rm 409 Lind Hall (note special time and location)
Andrew Mullhaupt (S.A.C. Capital Management)