Abstracts for the 2005-2006 IMA/MCIM Industrial Problems Seminar


September 30, 2005, 1:25pm, Room 20 Vincent Hall
Assaf Naor (Microsoft Research)

Graph Partitioning, Clustering with Qualitative Information, and Grothendieck-type Inequalities
Talks(A/V)

In this talk we will describe applications of Grothendieck's inequality to combinatorial optimization. We will show how Grothendieck's inequality can be used to provide efficient algorithms that approximate the cut-norm of a matrix and construct Szemerédi partitions. Moreover, we will show how to derive a new class of Grothendieck-type inequalities which can be used to give approximation algorithms for Correlation Clustering on a wide class of judgment graphs, and to approximate ground states of spin glasses. No prerequisites will be assumed; in particular, we will present a proof of Grothendieck's inequality.

Based on joint works with N. Alon, K. Makarychev and Y. Makarychev.
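
For reference, here are the standard statements behind the talk's first application (textbook material, not transcribed from the lecture). The cut-norm of a real matrix A = (a_ij) and Grothendieck's inequality read, in LaTeX notation:

    \|A\|_{\square} \;=\; \max_{S \subseteq [m],\, T \subseteq [n]} \Bigl|\sum_{i \in S,\, j \in T} a_{ij}\Bigr|,
    \qquad
    \max_{\substack{u_i, v_j \in \mathbb{R}^{m+n} \\ \|u_i\| = \|v_j\| = 1}} \sum_{i,j} a_{ij}\, \langle u_i, v_j \rangle
    \;\le\; K_G \max_{x_i, y_j \in \{-1,1\}} \sum_{i,j} a_{ij}\, x_i y_j,

where K_G is Grothendieck's constant (known to lie roughly between 1.67 and 1.79). The left-hand maximum is a semidefinite program that can be solved efficiently, and the right-hand bilinear maximum approximates the cut-norm to within a constant factor, so the inequality certifies a constant-factor approximation algorithm.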

October 7, 2005, 1:25pm, Room 20 Vincent Hall
Tin Kam Ho (Bell Labs, Lucent Technologies)

Geometrical Complexity of Classification Problems
Talks(A/V)
Slides:  html   pdf   ppt

Pattern recognition seeks to identify and model regularities in empirical data by algorithmic processes. Successful application of the established methods requires a good understanding of their behavior and of how well they match the application context. Difficulties can arise from either the intrinsic complexity of a problem or a mismatch of methods to problems. We describe some measures that characterize the intrinsic complexity of a classification problem and relate it to classifier performance. The measures reveal that a collection of real-world problems spans an interesting continuum, from those that are easily learnable to those for which no learning is possible. We discuss our results on identifying the domains of dominant competence of several popular classifiers in this measurement space.
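
As a concrete toy illustration of such a measure (my sketch, not code from the talk; one of the simplest descriptors in this line of work is Fisher's discriminant ratio, and the function name and example data are my own choices):

    import numpy as np

    def fisher_discriminant_ratio(X, y):
        """Maximum over features of (mu1 - mu2)^2 / (var1 + var2).

        Larger values indicate an easier, more linearly separable
        two-class problem along at least one feature axis.
        """
        X1, X2 = X[y == 0], X[y == 1]
        num = (X1.mean(axis=0) - X2.mean(axis=0)) ** 2
        den = X1.var(axis=0) + X2.var(axis=0)
        return float(np.max(num / np.maximum(den, 1e-12)))

    # Well-separated Gaussian classes yield a large ratio (easy problem).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
    y = np.array([0] * 100 + [1] * 100)
    print(fisher_discriminant_ratio(X, y))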

October 14, 2005, 1:25pm, Room 20 Vincent Hall
Jeff Remillard (Ford)

Groping in the Dark: The Past, Present, and Future of Automotive Night Vision
Talks(A/V)

Night vision was first introduced into the automotive market five years ago on the Cadillac DeVille, and two years later on the Lexus LX 470 sport utility vehicle. In spite of initial consumer interest, Cadillac no longer offers this feature, and in general night vision has failed to generate significant interest in the marketplace. However, Honda has introduced a night-vision system based on two thermal cameras that also provides a pedestrian detection function, and BMW and DCX will be launching night-vision options within the next year on the 7-Series and S-Class, respectively. Which, if any, of these products will capture the imagination of the driving public, and what night-vision technology/feature package offers the best performance and business case?

This talk will compare and contrast various approaches to automotive night vision, and specifically describe the laser-based system developed by Ford, which consists of a laser illuminator and a CCD camera located in the vehicle interior. In contrast to thermal night-vision technologies, it provides easily recognizable images of both pedestrians and inanimate objects, and in contrast to the active night-vision system offered by Lexus, it provides clear imagery even in the presence of oncoming traffic. The talk will also describe a prototype range-gated laser-based night-vision system that enables viewing through snow, fog, and other obscurants.
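
For the range-gated system mentioned at the end, the gate timing follows from the round-trip time of light (a standard calculation, not a figure from the talk). To image a slice of the scene at range R, the camera gate opens a delay

    \tau = \frac{2R}{c} \;\approx\; \frac{2 \times 100\ \mathrm{m}}{3 \times 10^{8}\ \mathrm{m/s}} \;\approx\; 0.67\ \mu\mathrm{s} \qquad (R = 100\ \mathrm{m})

after the laser pulse fires, so backscatter from snow or fog much closer than R arrives while the gate is still closed and is rejected.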

October 28, 2005, 1:25pm, Room 20 Vincent Hall
Valerio Pascucci (Center for Applied Scientific Computing, Lawrence Livermore National Laboratory)

Robust Multi-Scale Morse Analysis for Scientific Data Analysis and Exploration
Talks(A/V)

In recent years topological analysis has emerged as a fundamental tool for extracting intrinsic geometric features present in scientific data. In this talk I will discuss the challenge of developing algorithms that enable the practical use of Morse theory in the analysis of real scientific data, in other words, algorithms that are robust, comprehensive, and multi-scale. Robust algorithms avoid the numerical instabilities that are particularly problematic when dealing with noisy data. A comprehensive analysis guarantees reliable data understanding and exploration. Multi-scale representations allow isolating features of interest at different resolutions.

I will illustrate in detail how these concepts can be translated systematically into efficient algorithms and data structures. The resulting approach is used for constructing Morse-Smale complexes, contour trees, and Jacobi sets, which in turn are used in user interface components for general-purpose data exploration. The same techniques are also used in specialized data analysis tools for tracking particles over time in combustion simulations, for characterizing bubbles and spikes in Rayleigh-Taylor instabilities, and for reconstructing complex channel structures in porous media.
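
As a toy illustration of the robust, multi-scale theme (my sketch, not the speaker's code): zero-dimensional persistence of a scalar field on a graph can be computed with a single union-find sweep, after which features whose persistence falls below a noise threshold are discarded.

    import numpy as np

    def merge_tree_persistence(values, edges):
        """Persistence of local minima of a scalar field on a graph.

        values: array of function values per vertex.
        edges:  list of (u, v) vertex pairs.
        Returns (birth, death) pairs; small death - birth = noise.
        """
        adj = {}
        for u, v in edges:
            adj.setdefault(u, []).append(v)
            adj.setdefault(v, []).append(u)
        parent = {}
        def find(u):
            while parent[u] != u:
                parent[u] = parent[parent[u]]
                u = parent[u]
            return u
        pairs = []
        for u in np.argsort(values):        # sweep vertices by value
            parent[u] = u                   # u is born as its own minimum
            for v in adj.get(u, []):
                if v in parent:             # neighbor already swept
                    ru, rv = find(u), find(v)
                    if ru != rv:
                        # merge: the higher (younger) minimum dies here
                        young, old = (ru, rv) if values[ru] > values[rv] else (rv, ru)
                        pairs.append((values[young], values[u]))
                        parent[young] = old
        return pairs

    # Noisy double-well signal as a path graph: only one merge survives
    # the persistence threshold; the wiggles are filtered as noise.
    x = np.linspace(0, 4 * np.pi, 200)
    f = np.cos(x) + 0.05 * np.sin(40 * x)
    pairs = merge_tree_persistence(f, [(i, i + 1) for i in range(199)])
    print([(round(b, 2), round(d, 2)) for b, d in pairs if d - b > 0.5])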

I will provide a live demonstration of the approach with a simple visualization tool with linked views coordinating the presentation of topology with traditional visualization techniques. More details regarding this work can be found at http://www.pascucci.org.

November 18, 2005, 1:25pm, 113 Vincent Hall (Note room change)
David Arathorn (General Intelligence Corporation)

The Map-Seeking Method: From Cortical Theory to Inverse-Problem Method
Talks(A/V)
Slides:  pdf

The Map-Seeking Method provides a tractable solution for the inverse problem of discovering a composition of transformations that map one pattern to another. It was invented (or uncovered) as part of an attempt to hypothesize how the visual cortices solve the problem of transformation discovery, with the intent of applying the same principle to machine vision. To establish a base level of biological plausibility for this mathematical approach, it was shown that there are reasonably realistic neuronal circuits that implement a functional equivalent of the algorithmic form of the method. As a result, the general implementation of the method has come to be known as the Map-Seeking Circuit (MSC).

Most of the research into practical application of the MSC has been directed at machine vision and image processing. However, during the last few years it has become apparent that the MSC is applicable to various classes of inverse problems partly or entirely outside of vision. Most recent focus has been on classes of problems which require concurrent solution in multiple domains or problem spaces, for which a minor variant of the original MSC provides an elegant and practical solution. These include both "brain-instinctual" tasks and "brain-taxing" problems, and a preliminary insight into the difference will be discussed.
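
To make the flavor of the method concrete, here is a drastically simplified one-layer caricature in Python (my sketch under my own simplifications, not GIC's implementation): candidate transformations, here cyclic shifts, are kept in weighted superposition, and a multiplicative competition driven by match scores collapses the superposition onto the best transformation.

    import numpy as np

    def map_seeking_shift(r, m, iters=30, kappa=2.0):
        """Toy one-layer map-seeking search over cyclic shifts.

        Maintains a weight g[k] per candidate shift k; weights are
        re-scaled each round by how well each shifted input matches
        the memory pattern m, so the superposition collapses onto
        the best transformation instead of testing shifts one by one.
        """
        n = len(r)
        shifts = np.stack([np.roll(r, k) for k in range(n)])  # T_k(r)
        g = np.ones(n)
        for _ in range(iters):
            match = np.clip(shifts @ m, 0, None)   # per-transform match
            g *= (match / (match.max() + 1e-12)) ** kappa  # competition
            g /= g.max() + 1e-12
        return int(np.argmax(g))

    rng = np.random.default_rng(1)
    m = rng.normal(size=64)
    r = np.roll(m, -17) + 0.1 * rng.normal(size=64)  # shifted, noisy input
    print(map_seeking_shift(r, m))                   # recovers shift 17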

February 3, 2006, 1:25pm, Room 20 Vincent Hall
Ruchira Datta (Google)

The Mathematics of Web Information Retrieval
Talks(A/V)

Classical information retrieval was built on the assumption that patient users are trying to extract information from a clean, trustworthy, manageable corpus and no adversaries are trying to get in the way. We can't rely on any of these assumptions in the era of the web. In this talk we'll see how leveraging mathematics and large-scale systems leads us to good web information retrieval. Finally we'll describe an interesting open problem with applications to computing PageRank.
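
The PageRank computation mentioned at the end is the standard power iteration on the random-surfer Markov chain. A minimal sketch (textbook formulation with damping factor d = 0.85, not code from the talk):

    import numpy as np

    def pagerank(adj, d=0.85, tol=1e-10):
        """Power iteration for PageRank on a link matrix.

        adj[i][j] = 1 if page i links to page j; dangling pages
        are treated as linking to every page.
        """
        A = np.asarray(adj, dtype=float)
        n = A.shape[0]
        A[A.sum(axis=1) == 0] = 1.0           # dangling nodes -> uniform
        P = A / A.sum(axis=1, keepdims=True)  # row-stochastic transitions
        r = np.full(n, 1.0 / n)
        while True:
            r_new = (1 - d) / n + d * (P.T @ r)
            if np.abs(r_new - r).sum() < tol:
                return r_new
            r = r_new

    # Tiny 4-page web: 0->1, 0->2, 1->2, 2->0, 3->2.
    print(pagerank([[0, 1, 1, 0], [0, 0, 1, 0], [1, 0, 0, 0], [0, 0, 1, 0]]))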

February 24, 2006, 1:25pm, Room 20 Vincent Hall
Leo Grady (Siemens Corporate Research)

A general-purpose segmentation algorithm using analytically evaluated random walks
Talks(A/V)

An ideal segmentation algorithm could be applied equally to the problem of isolating organs in a medical volume or to editing a digital photograph, without modifying the algorithm, changing parameters, or sacrificing segmentation quality. However, general-purpose, multilabel segmentation of objects in an image/volume remains a challenging problem. In this talk, I will describe a recently developed approach to this problem that takes a few training points from a user as input (e.g., from mouse clicks) and produces a segmentation by computing the probabilities that a random walker leaving the unlabeled pixels/voxels first strikes the training set. By exact mathematical equivalence with a problem from potential theory, these probabilities may be computed analytically and deterministically. The algorithm is developed on an arbitrary weighted graph in order to maximize the breadth of application. I will illustrate the use of this approach with examples from several segmentation problems (without modifying the algorithm or the single free parameter), compare the behavior of the algorithm with other approaches, and discuss the theoretical properties that describe its behavior. Time permitting, new results on a solution to the combinatorial Plateau problem on a weighted complex will also be given, with application to 3D image segmentation. More information on this research may be found online at: http://www.cns.bu.edu/~lgrady/
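
The potential-theory equivalence means the probabilities solve a combinatorial Dirichlet problem, i.e. a sparse linear system in the graph Laplacian. A minimal two-label sketch (my code, using dense linear algebra for clarity; a practical implementation would use sparse solvers):

    import numpy as np

    def random_walker(weights, seeds, labels, n):
        """Random-walker probabilities on a weighted graph.

        weights: dict {(i, j): w} of undirected edge weights.
        seeds:   seeded vertex indices; labels: their labels (0 or 1).
        Returns, for every vertex, the probability that a random
        walker first reaches a label-1 seed, via the combinatorial
        Dirichlet problem L_ff x = -L_fs b.
        """
        L = np.zeros((n, n))
        for (i, j), w in weights.items():
            L[i, j] -= w; L[j, i] -= w
            L[i, i] += w; L[j, j] += w
        seeds = np.asarray(seeds)
        free = np.setdiff1d(np.arange(n), seeds)
        b = np.asarray(labels, dtype=float)     # boundary values at seeds
        x = np.linalg.solve(L[np.ix_(free, free)],
                            -L[np.ix_(free, seeds)] @ b)
        p = np.empty(n); p[seeds] = b; p[free] = x
        return p

    # 4-vertex path with a weak middle edge; seeds at the two ends.
    # The weak edge acts as the segmentation boundary.
    w = {(0, 1): 1.0, (1, 2): 0.1, (2, 3): 1.0}
    print(random_walker(w, seeds=[0, 3], labels=[0, 1], n=4))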

March 24, 2006, 1:25pm, Room 20 Vincent Hall
Peter R. Massopust (Independent Consultant, Houston, Texas)

Mathematical Problems Associated with a Class of Nondestructive Evaluations
Talks(A/V)
Slides:  pdf

Defects in magnetizable materials can be detected and evaluated by measuring electromagnetic data. The evaluation of this data relies on effective denoising techniques. In this talk, two types of denoising techniques for two different types of noise are presented. The first is based on wavelets and uses the stochastic denoising procedure of Donoho and Johnstone; the second is based on curvelets and nonlinear approximation. Examples will be provided to demonstrate the issues involved and to validate the algorithms.
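
As a sketch of the first ingredient only (Donoho-Johnstone style soft thresholding at the universal threshold; the talk's full procedure, and the curvelet variant, are more involved), assuming the PyWavelets package is available:

    import numpy as np
    import pywt  # PyWavelets

    def wavelet_denoise(signal, wavelet="db4", level=4):
        """Soft-threshold the detail coefficients at the universal
        threshold sigma * sqrt(2 log n), with the noise level sigma
        estimated robustly from the finest scale via the MAD."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thresh = sigma * np.sqrt(2 * np.log(len(signal)))
        coeffs[1:] = [pywt.threshold(c, thresh, mode="soft")
                      for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)

    # Noisy piecewise-smooth test signal; print mean absolute error.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 1024)
    clean = np.sin(8 * np.pi * t) * (t > 0.3)
    noisy = clean + 0.2 * rng.normal(size=t.size)
    print(np.abs(wavelet_denoise(noisy)[:1024] - clean).mean())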

March 31, 2006, 1:25pm, Room 20 Vincent Hall
John F. Hamilton, Jr. (Eastman Kodak Company)

Algorithms for Digital Color Cameras
Talks(A/V)
Slides:  pdf

While digital color imaging has many problems in common with conventional silver halide imaging, it also has its own particular problems not faced in the analog world. Two of these problems, and two corresponding algorithmic solutions, are illustrated by example and discussed in detail. In addition, a mathematical perspective is presented to explain how these algorithms work.

The first problem is that of color interpolation (also called demosaicking). The pixels of most silicon sensors capture a single color measurement, usually a red, green, or blue color value. Because a fully processed color image requires all three color values at each pixel, two additional color values must be provided at each pixel. An algorithm is presented that addresses this problem for sensors using the Bayer color filter pattern.
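
For orientation, the textbook baseline for this problem is bilinear interpolation of each masked color channel; the sketch below is that baseline (my code, assuming an RGGB layout), not the adaptive algorithm presented in the talk:

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_bilinear(raw):
        """Bilinear demosaicking of a Bayer-pattern image (RGGB:
        R at even/even, B at odd/odd, G elsewhere).

        Each channel is recovered by zeroing the other channels'
        samples and filling the holes with a small averaging kernel.
        """
        h, w = raw.shape
        y, x = np.mgrid[:h, :w]
        masks = {
            "R": (y % 2 == 0) & (x % 2 == 0),
            "G": (y + x) % 2 == 1,
            "B": (y % 2 == 1) & (x % 2 == 1),
        }
        k_rb = np.array([[.25, .5, .25], [.5, 1, .5], [.25, .5, .25]])
        k_g = np.array([[0, .25, 0], [.25, 1, .25], [0, .25, 0]])
        out = np.zeros((h, w, 3))
        for c, (name, kern) in enumerate([("R", k_rb), ("G", k_g), ("B", k_rb)]):
            out[..., c] = convolve(raw * masks[name], kern, mode="mirror")
        return out

    # Toy 4x4 RGGB raw frame -> full-color (4, 4, 3) output.
    raw = np.arange(16, dtype=float).reshape(4, 4)
    print(demosaic_bilinear(raw).shape)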

The second problem is that of color aliasing. While the problem of aliasing is always present in a discrete imaging system, it is compounded for color sensors because the different color channels can alias in different ways. The resulting interference patterns have distinctive color components which constitute an obvious and annoying imaging artifact. An algorithm is presented that addresses this problem for sensors using the Bayer color filter pattern.

April 21, 2006, 1:25pm, Room 20 Vincent Hall
Aria Abubakar (Schlumberger-Doll Research)

Imaging and inversion algorithms based on the source-type integral equation formulations
Talks(A/V)

Joint work with Tarek M. Habashy (Schlumberger-Doll Research, USA) and Peter van den Berg (Delft University of Technology, The Netherlands).

In this presentation we describe a class of inversion algorithms for solving acoustic, electromagnetic, and elastic inverse scattering problems for the constitutive material properties of bounded objects embedded in a known background medium. The inversion utilizes measurements of the scattered field due to the illumination of the objects by a set of known wave-fields. By using the source-type integral equation formulations we arrive at two sets of equations in terms of the contrast and the contrast sources (the product of the contrast and the fields). The first equation is the integral representation of the measured data (the data equation), and the second is the integral equation over the scatterers (the object equation). These two integral equations are solved by recasting the problem as an optimization problem. The main differences between the presented algorithms and other inversion algorithms in the literature are: (1) we use the object equation itself to constrain the optimization process, and (2) we do not solve a full forward problem in each iterative step.

We present three inversion algorithms of increasing complexity, namely the regularized Born inversion (linear), the diagonalized contrast source inversion (semi-linear), and the contrast source inversion (fully non-linear). The difference between these three algorithms lies in the way the object equation enters the cost functional to be optimized. Although the inclusion of the object equation in the cost functional serves as a physical regularization of the ill-conditioned data equation, the inversion results can be enhanced by introducing an additional regularizer that accounts for a priori information about the contrast profile, or imposes constraints such as limiting the spatial variation of the contrast. We propose to use the multiplicative regularized inversion technique, so that there is no need to determine the regularization parameter before the optimization starts; this parameter is determined automatically during the optimization process.

As numerical examples we present synthetic and real data inversions from oilfield, biomedical, and microwave applications, in both two- and three-dimensional configurations. We will show that with this approach it is possible to solve the full non-linear inverse problem with a large number of unknowns on a single-processor personal computer.
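
In the standard contrast-source notation of the published literature (my summary, not transcribed from the talk), with contrast \chi, incident fields u_j^{inc}, contrast sources w_j = \chi u_j, data operator G_S and object-domain operator G_D, the two equations are

    f_j = G_S w_j \qquad \text{(data equation)},
    \qquad
    w_j = \chi u_j^{\mathrm{inc}} + \chi G_D w_j \qquad \text{(object equation)},

and contrast source inversion minimizes the combined cost functional

    F(\chi, \{w_j\}) \;=\; \frac{\sum_j \|f_j - G_S w_j\|_S^2}{\sum_j \|f_j\|_S^2}
    \;+\; \frac{\sum_j \|w_j - \chi u_j^{\mathrm{inc}} - \chi G_D w_j\|_D^2}{\sum_j \|\chi u_j^{\mathrm{inc}}\|_D^2},

alternately updating the contrast sources and the contrast. The multiplicative regularization multiplies F by a total-variation-type factor whose weight adapts automatically during the optimization.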

April 28, 2006, 1:25pm, Room 20 Vincent Hall
Giovanni Di Crescenzo (Telcordia Technologies)

An Introductory Survey of Zero-Knowledge Proofs and Their Applications

The seemingly paradoxical notion of Zero-Knowledge Proof Systems, introduced by Goldwasser, Micali and Rackoff in the 1980s, has received a great amount of attention in both the Cryptography and Computational Complexity literature. Very informally, a zero-knowledge proof is a method allowing a prover to convince a verifier of a statement without revealing any information beyond the fact that the statement is true. While the two requirements of 'convincing a verifier' and 'not revealing anything else' may seem irreconcilable, zero-knowledge proofs have found rigorous formulations and efficient instantiations in various settings. Furthermore, the general zero-knowledge methodology of revealing only the minimal necessary information when communicating in the presence of adversaries has become a fundamental tool with wide applicability throughout Cryptography. In this talk we give an introductory survey of zero-knowledge proofs, their several variants, and their cryptographic applications.
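
As a concrete classical example (the Schnorr identification protocol, an honest-verifier zero-knowledge proof of knowledge of a discrete logarithm; textbook material, not a protocol from the talk), simulated with toy parameters far too small for real use:

    import secrets

    # Toy group: the order-509 subgroup of Z_1019^* generated by 4.
    # (Real deployments use large, standardized groups.)
    p, q, g = 1019, 509, 4
    x = 123                      # prover's secret exponent
    y = pow(g, x, p)             # public key y = g^x mod p

    def schnorr_round():
        """One round: commit, challenge, respond, verify."""
        r = secrets.randbelow(q)     # prover's ephemeral secret
        t = pow(g, r, p)             # commitment t = g^r
        c = secrets.randbelow(q)     # verifier's random challenge
        s = (r + c * x) % q          # response (uses the secret x)
        # Verifier checks g^s == t * y^c (mod p). The transcript
        # (t, c, s) can be simulated without knowing x, which is
        # the source of the (honest-verifier) zero-knowledge property.
        return pow(g, s, p) == (t * pow(y, c, p)) % p

    print(all(schnorr_round() for _ in range(10)))  # True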

May 5, 2006, 1:25pm, Room 20 Vincent Hall
Matthew Brand (Mitsubishi Electric Research Lab)

Shape and manifold modeling with nullspaces
Talks(A/V)
Slides:  pdf

The surface of one's face is a 2d manifold embedded in 3d space; the space of all faces and facial expressions is a manifold with perhaps a few hundred statistically significant dimensions. These are very useful manifolds. Inconveniently, we have access to these manifolds only through their unknown immersions in the much higher-dimensional space of all possible (retinal) images. This motivates the problem of characterizing a manifold from data — usually a sparse sample of discrete points, tangents, or local geodesic distances. I'll outline a remarkably straightforward method that uses nullspace computations to separate variation on the manifold from variation due to the immersion, and connect this solution to practical problems in data analysis, projective geometry, and graph embeddings. The talk will be peppered with examples from facial coding and synthesis, shape-from-silhouettes problems, acoustic modeling of vowels, and computer graphics.
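
As a loosely related illustration of the nullspace idea (this sketch is classical locally linear embedding, plainly not the speaker's method): local reconstruction weights define a matrix M = (I - W)^T (I - W) whose near-nullspace eigenvectors give low-dimensional coordinates, separating on-manifold variation from the immersion.

    import numpy as np

    def lle_embedding(X, k=8, dim=2, reg=1e-3):
        """Nullspace-flavored nonlinear embedding (classical LLE).

        Each point is reconstructed from its k nearest neighbors;
        the embedding lives in the near-nullspace of (I-W)^T (I-W).
        """
        n = X.shape[0]
        D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
        W = np.zeros((n, n))
        for i in range(n):
            nbrs = np.argsort(D[i])[1:k + 1]   # k nearest, excluding self
            Z = X[nbrs] - X[i]                 # centered neighborhood
            G = Z @ Z.T
            G += reg * np.trace(G) * np.eye(k) # regularize Gram matrix
            w = np.linalg.solve(G, np.ones(k))
            W[i, nbrs] = w / w.sum()
        M = (np.eye(n) - W).T @ (np.eye(n) - W)
        vals, vecs = np.linalg.eigh(M)
        return vecs[:, 1:dim + 1]              # skip the constant nullvector

    # Spiral curve in 3D: the 1D embedding unrolls the curve parameter.
    t = np.linspace(0, 3 * np.pi, 200)
    X = np.column_stack([t * np.cos(t), t * np.sin(t), 0.1 * np.sin(5 * t)])
    print(lle_embedding(X, dim=1)[:3].ravel())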
