Abstracts and Talk Materials
Mathematical Modeling in Industry XVI — A Workshop for Graduate Students
June 18 - 27, 2012

Izhak (Zachi) Baharav (Corning Incorporated)

Team 1: Touch sensing, Silhouettes, and “Polygons-of-Uncertainty”
June 18, 2012

Touch interfaces for small consumer devices are becoming ubiquitous and are now penetrating new areas such as large display walls, collaborative surfaces, and more. However, different sensing methods are called for to deal with the economics of a touch interface for a very large surface. One such method uses a small number of cameras and multiple light sources, as depicted in the example in Fig. 1 below. When an object is placed on the surface, it blocks the line of sight between some of the light sources and various cameras, and thus creates silhouette images. Using the input from all cameras, one can try to reconstruct the shape and location of the object touching the screen.

Of course, one can use more cameras, and various light/camera arrangements, to obtain different performance. Things get a little more complicated when we consider non-Euclidean surfaces (tracking on a ball?).

There is plenty of current research into Shape From Silhouettes (SFS), especially for reconstructing 3D shapes. Our case might seem simpler, as it is 2D only, but it has many practical requirements to address: a limited number of camera views, minimization of light sources, and so on. Moreover, the specifications we have for the system cover (for example) resolution, the minimum detectable object, and how close two objects can be and still be detected separately.

In this work we will build tools to analyze, using analytic geometry, various combinations of surface shapes, cameras, light sources, and objects touching the surface. We will then venture into related aspects, depending on the inclination and composition of the team:
  1. Optimization: Given a performance specification, what is the minimal number of cameras/light sources (with an associated cost function) needed to achieve it, and where should they be located?
  2. Analytic aspects: Can we quantify average performance figures, or the information content of the silhouettes?
  3. Non-Euclidean surfaces: What about flexible surfaces, or balls?
  4. Robustness: How robust is our solution to a malfunctioning light source or camera?
The results have immediate implications to design, performance, and cost of such systems.
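As a small illustration of the analytic-geometry tooling, the sketch below (our own toy construction, not the actual system in Fig. 1; camera positions and the object are made-up test values) intersects the shadow wedges seen by each camera to obtain a "polygon of uncertainty" bounding a single circular object:

```python
import math

def clip(poly, a, b):
    """Sutherland-Hodgman step: keep the part of a convex CCW polygon
    lying on the left of the directed line a -> b."""
    def left(p):
        return (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0]) >= 0
    def cross_pt(p, q):
        # intersection of the line through a, b with segment p-q
        den = (a[0]-b[0])*(p[1]-q[1]) - (a[1]-b[1])*(p[0]-q[0])
        t = ((a[0]-p[0])*(p[1]-q[1]) - (a[1]-p[1])*(p[0]-q[0])) / den
        return (a[0] + t*(b[0]-a[0]), a[1] + t*(b[1]-a[1]))
    out = []
    for i in range(len(poly)):
        p, q = poly[i], poly[(i+1) % len(poly)]
        if left(q):
            if not left(p):
                out.append(cross_pt(p, q))
            out.append(q)
        elif left(p):
            out.append(cross_pt(p, q))
    return out

def shadow_wedge(cam, obj, r_obj, reach=10.0):
    """Triangle covering the angular interval a disk of radius r_obj at
    obj blocks, as seen from a camera at cam."""
    dx, dy = obj[0] - cam[0], obj[1] - cam[1]
    theta = math.atan2(dy, dx)
    alpha = math.asin(r_obj / math.hypot(dx, dy))
    return [cam,
            (cam[0] + reach*math.cos(theta-alpha), cam[1] + reach*math.sin(theta-alpha)),
            (cam[0] + reach*math.cos(theta+alpha), cam[1] + reach*math.sin(theta+alpha))]

def polygon_of_uncertainty(wedges):
    """Intersect all shadow wedges; the object must lie inside the result."""
    poly = wedges[0]
    for w in wedges[1:]:
        for i in range(len(w)):
            poly = clip(poly, w[i], w[(i+1) % len(w)])
            if not poly:
                return []
    return poly

# Hypothetical setup: three cameras on the corners of a unit screen
cams = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
obj, r_obj = (0.5, 0.5), 0.05
poly = polygon_of_uncertainty([shadow_wedge(c, obj, r_obj) for c in cams])
```

Adding a camera adds more clipping half-planes, so the polygon of uncertainty can only shrink; this is the quantity the optimization questions above would trade off against camera count.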

Figure 1: Sample system for touch sensing (see details in the abstract), and a finger (object) on the surface.


From all members: analytic geometry. Some familiarity with computer graphics will help as well; many similarities exist.

At least from some of the group members: the ability to simulate using Matlab, Mathematica, or a similar tool.

Bibliography: A basic reference:
  • A. Laurentini, “The Visual Hull Concept for Silhouette-Based Image Understanding,” IEEE Transactions on Pattern Analysis and Machine Intelligence 16(2) (1994): 150–162.

Some more recent works:
  • B. Michoud, E. Guillou, S. Bouakaz, M. Barnachon, and Meyer, “Towards Removing Ghost-Components from Visual-Hull Estimations,” Fifth International Conference on Image and Graphics (ICIG '09), 20-23 Sept. 2009, 428–434.
  • K. Kolev, T. Brox, and D. Cremers, “Fast Joint Estimation of Silhouettes and Dense 3D Geometry from Multiple Images,” IEEE Transactions on Pattern Analysis and Machine Intelligence 34(3) (2012): 493–505.

Jon Downton (CGGVeritas)

Team 3: Azimuthal elastic inversion for fracture characterization
June 18, 2012

Keywords of the presentation: Fracture, AVAz, AVO, inversion, Optimization, Uncertainty analysis

An increasing amount of the oil and gas produced in North America and the world comes from unconventional reservoirs. Understanding the fractures within the reservoir plays an important role in developing this resource. The objective of this project is to infer the elastic parameters of the earth from P-wave seismic amplitude variations with offset and azimuth (AVAz), and then to characterize the fractures from these elastic parameters (Figure 1). The forward model is primarily described by two key models: the first describes the functional relationship between the anisotropic elastic parameters and the fractures, and the second describes the AVAz. One possible approach is to use linear slip theory [7,8] to characterize the fractures and a linearized approximation of the Zoeppritz equations [5] to describe the AVAz. It is then possible to estimate the fractures by inverting the nonlinear forward problem using simulated annealing [2].

Figure 1: Fracture direction and magnitude displayed for a carbonate reservoir

The inversion problem is nonlinear, under-resolved and ill-conditioned. In order to make the problem better posed and resolved, assumptions are typically made about the type and complexity of the fractures [4,6]. One of the goals of this project is to understand the resolvability of these models and their parameterizations under different noise conditions. Another goal is to explore different methods to solve this nonlinear problem. Under certain data and parameter transformations [3] it is possible to linearize certain aspects of this problem. For this portion of the problem it is possible to perform a traditional parameter and data resolution analysis [1,4]. In this workshop we would like to explore techniques to understand the resolvability of the parameters for the full nonlinear problem.
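As a minimal sketch of the simulated-annealing approach, the code below recovers the parameters of a made-up two-term azimuthal amplitude model (an assumption for illustration, not the linearized Zoeppritz forward model) from noise-free synthetic data by stochastic search over the misfit surface:

```python
import math, random

def simulated_annealing(misfit, m0, step, n_iter=20000, t0=1.0, seed=0):
    """Minimize `misfit` by simulated annealing: perturb the model, accept
    uphill moves with probability exp(-delta/T), and cool T geometrically."""
    rng = random.Random(seed)
    m, e = list(m0), misfit(m0)
    best_m, best_e = list(m), e
    for k in range(n_iter):
        t = t0 * 0.995 ** k                         # cooling schedule
        # proposal size shrinks with temperature for final refinement
        cand = [x + rng.gauss(0.0, step * (0.1 + t)) for x in m]
        ec = misfit(cand)
        if ec < e or rng.random() < math.exp(-(ec - e) / max(t, 1e-12)):
            m, e = cand, ec
            if e < best_e:
                best_m, best_e = list(m), e
    return best_m, best_e

# Hypothetical azimuthal reflectivity: R(phi) = A + B*cos(2*(phi - phi0)),
# with intercept A, anisotropic gradient B, and fracture azimuth phi0.
def forward(m, phis):
    A, B, phi0 = m
    return [A + B * math.cos(2.0 * (p - phi0)) for p in phis]

phis = [math.radians(a) for a in range(0, 180, 10)]
true_m = [0.10, 0.05, math.radians(30)]
data = forward(true_m, phis)                        # noise-free synthetic data

def misfit(m):
    return sum((d - r) ** 2 for d, r in zip(data, forward(m, phis)))

m_est, e_est = simulated_annealing(misfit, [0.0, 0.01, 0.0], step=0.05)
```

Repeating the run with noise added to `data`, or with fewer azimuths, is one cheap way to probe the resolvability questions raised above.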


We expect students to have a strong background in optimization and numerical analysis, and good computing skills (Matlab or C/C++). Knowledge of statistical methods and stochastic analysis would be an asset.

  1. G. Backus, and F. Gilbert, “Numerical applications of a formalism for geophysical inverse problems,” Geophysical Journal of the Royal Astronomical Society 13 (1967): 247-276
  2. J. Downton, and B. Roure, “Azimuthal simultaneous elastic inversion for fracture detection,” SEG, Expanded Abstracts 30 (2010), 269-273
  3. J. Downton, B. Roure, and L. Hunt, “Azimuthal Fourier Coefficients,” CSEG Recorder 36, no. 10, (2011): 22-36.
  4. M. Eftekharifar and C. M. Sayers, “Seismic characterization of fractured reservoirs: A resolution matrix approach,” SEG, Expanded Abstracts 30, (2011):1953
  5. I. Pšenčík, and J. L. Martins, “Properties of weak contrast PP reflection/transmission coefficients for weakly anisotropic elastic media,” Studia Geophysica et Geodaetica 45 (2001): 176-197.
  6. C.M. Sayers, and S. Dean, “Azimuth-dependent AVO in reservoirs containing non-orthogonal fracture sets,” Geophysical Prospecting 49 (2001): 100-106.
  7. M. Schoenberg, “Elastic behaviour across linear slip interfaces,” Journal of the Acoustical Society of America 68, no. 5, (1980):1516–1521.
  8. M. Schoenberg and C. M. Sayers “Seismic anisotropy of fractured rock,” Geophysics 60 (1995): 204–211

Ritchie (Yeqi) He (Royal Bank of Canada)

Team 7: Valuation of Over-the-Counter Derivatives with Collateralization
June 18, 2012

Keywords of the presentation: Quantitative modeling, derivative pricing, collateralization, funding cost

More and more over-the-counter (OTC) derivatives traded post-crisis are collateralized, either to reduce counterparty credit risk or because collateral is required by clearing houses under new regulation. In this case, financial institutions need to incorporate the cost of raising capital to fund the collateral into the traditional (uncollateralized) valuation models in order to find the fair value of the derivatives; this has attracted a great deal of attention and interest in the funding value adjustment (FVA). If the OTC derivatives are perfectly collateralized, OIS discounting is the correct valuation methodology, as pointed out in [5]. However, in practice, collateralization under the credit support annex (CSA) may be imperfect, so that OIS discounting is not necessarily a suitable valuation method any more.

We are planning to perform a quantitative impact study on the valuation of OTC derivatives with imperfect collateralization. In particular, an interesting topic is cross-currency collateralization, which is widely applied in practice. In the simplest case, the collateral may be posted in a pre-determined currency different from that of the derivative itself; this may lead to a quanto effect in valuation [3]. A further study covers the case where the collateral can be chosen among a set of currencies, so that the party posting collateral will choose the one with the minimum funding cost; this leads to the complexity of a cheapest-to-deliver option similar to that in bond futures [2]. It is worth noting that this is an American- or Bermudan-style option, so one may need to apply the valuation approaches in [1,4]. We will attempt to model these problems, develop valuation methodologies, and obtain some numerical results during this workshop.
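To illustrate the least-squares Monte Carlo machinery of [1,4] that valuing such early-exercise features requires, here is a small sketch pricing a plain American put under Black-Scholes dynamics (the parameters are illustrative; the actual collateral problem would replace the payoff and dynamics):

```python
import math, random

def polyfit2(xs, ys):
    """Least-squares fit y ~ b0 + b1*x + b2*x^2 via the 3x3 normal equations."""
    s = [sum(x ** p for x in xs) for p in range(5)]
    a = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    b = [sum(y * x ** p for x, y in zip(xs, ys)) for p in range(3)]
    for col in range(3):                    # Gaussian elimination, partial pivoting
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for row in range(col + 1, 3):
            f = a[row][col] / a[col][col]
            for c in range(col, 3):
                a[row][c] -= f * a[col][c]
            b[row] -= f * b[col]
    beta = [0.0, 0.0, 0.0]
    for row in (2, 1, 0):
        beta[row] = (b[row] - sum(a[row][c] * beta[c]
                                  for c in range(row + 1, 3))) / a[row][row]
    return beta

def lsm_american_put(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0,
                     n_steps=50, n_paths=5000, seed=7):
    """Longstaff-Schwartz least-squares Monte Carlo: working backwards,
    regress the continuation value on {1, s, s^2} over in-the-money paths
    and exercise when the intrinsic value exceeds it."""
    rng = random.Random(seed)
    dt, disc = t / n_steps, math.exp(-r * t / n_steps)
    drift, vol = (r - 0.5 * sigma ** 2) * dt, sigma * math.sqrt(dt)
    paths = []
    for _ in range(n_paths):                # geometric Brownian motion paths
        p = [s0]
        for _ in range(n_steps):
            p.append(p[-1] * math.exp(drift + vol * rng.gauss(0.0, 1.0)))
        paths.append(p)
    payoff = lambda s: max(k - s, 0.0)
    cash = [payoff(p[-1]) for p in paths]   # cashflows at maturity
    for step in range(n_steps - 1, 0, -1):
        cash = [c * disc for c in cash]     # discount back one step
        itm = [i for i in range(n_paths) if payoff(paths[i][step]) > 0.0]
        if len(itm) < 3:
            continue
        xs = [paths[i][step] / k for i in itm]        # normalized regressor
        beta = polyfit2(xs, [cash[i] for i in itm])
        for i, x in zip(itm, xs):
            cont = beta[0] + beta[1] * x + beta[2] * x * x
            ex = payoff(paths[i][step])
            if ex > cont:                   # early exercise beats continuation
                cash[i] = ex
    return sum(cash) / n_paths * disc       # discount final step to time 0

price = lsm_american_put()
```

For the cheapest-to-deliver collateral option, the regression state would include the funding spreads of the eligible currencies, but the backward-induction structure is the same.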


Required: stochastic analysis, financial modelling, derivative pricing, computer coding (C++ or Matlab). Desired: numerical optimization.

  1. L. B. G. Andersen , “A Simple Approach to the Pricing of Bermudan Swaptions in the Multi-Factor Libor Market Model”, March 5, 1999, http://ssrn.com/abstract=155208
  2. P. Carr, and R. R. Chen, “Valuing bond futures and the quality option”, Technical report, 1997.
  3. M. Fujii, Y. Shimada, and A. Takahashi, “Collateral Posting and Choice of Collateral Currency – Implications for Derivative Pricing and Risk Management”, May 8, 2010, http://ssrn.com/abstract=1601866
  4. F. A. Longstaff and E. S. Schwartz, “Valuing American options by simulation: a simple least-squares approach”, Rev. Financ. Stud. 14, no. 1 (2001): 113-147.
  5. V. Piterbarg, “Funding Beyond Discounting: Collateral Agreements and Derivatives Pricing”, Risk Magazine, February 2011, 97–102.

Anthony José Kearsley (National Institute of Standards and Technology)

Team 4: Identifying sugars
June 18, 2012

Sugars, the collection of all naturally occurring monosaccharides and disaccharides, are small molecules belonging to the class of carbohydrates. Examples of sugars are glucose, fructose, and sucrose. Essentially all sugars have the same chemical formula but different molecular structures. Virtually all metabolites found in bodily fluids (primarily blood, urine, and saliva) can be identified and quantified using gas chromatography–mass spectrometry (GC-MS; see Figure 1). However, sugars cannot; they must be identified using techniques other than mass spectrometry. This is because, even under ideal conditions, the mass spectra of sugars are very similar, though not identical (see Figure 2). In a real-world laboratory setting, lack of sufficient sample and interference from other molecules in the bodily fluid create noise and uncertainty in the mass spectra. This makes identifying sugars in real samples difficult. A significant advance in the field of clinical analysis would be the ability to identify sugars in bodily fluids by GC-MS, which would avoid the need to perform completely separate experiments to determine sugar content.

GC-MS identifies components of complex mixtures such as bodily fluids by vaporizing the sample and forcing the vapor through a capillary column with an absorptive inner lining. As the substance passes through the column, different molecules elute at different times due to differences in each molecule's thermodynamic gas/liquid partition coefficient. The molecules are then sent to the mass spectrometer, where they are ionized by electron impact. The high-energy electron beam breaks the molecules apart, and their characteristic mass spectrum is measured. Thus, a mass spectrum is a collection of mass/intensity pairs which can be plotted. This plot (or spectrum, as in Figure 1) is then compared against a database of existing spectra and the compound is identified. Identifying sugars, however, is a notoriously difficult problem.

Given the spectrum of an unknown sugar, can a searching function be constructed that successfully identifies the sugar from a collection of known sugar spectra? Given a known sugar with a known chemical structure, and given a collection of mass spectra collected on different instruments with varying degrees of accuracy, can one model the noise or error associated with an instrument? Can one derive necessary or sufficient conditions for an unknown sugar to be identifiable (or not)? Finally, can one select instrument settings that produce spectra that, when compared to library spectra, minimize the maximum likelihood of incorrectly identifying an unknown?

This project will involve investigating the mathematical and statistical techniques for estimating the distance between chemical spectra of varying degrees of accuracy and searching functions designed to identify sugars. Other classes of substances may also be considered (pesticides, pollutants, methamphetamines).
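A natural baseline for such a distance and searching function is the dot-product (cosine) match factor widely used in spectral library search. The sketch below uses made-up toy spectra, not real sugar spectra, and square-root intensity weighting is just one common choice:

```python
import math

def match_factor(spec_a, spec_b):
    """Cosine similarity between two mass spectra given as {m/z: intensity}
    dicts; square-root weighting damps dominant peaks."""
    keys = sorted(set(spec_a) | set(spec_b))
    a = [math.sqrt(spec_a.get(m, 0.0)) for m in keys]
    b = [math.sqrt(spec_b.get(m, 0.0)) for m in keys]
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na > 0 and nb > 0 else 0.0

def identify(unknown, library):
    """Return the library entry whose spectrum best matches the unknown."""
    best = max(library, key=lambda name: match_factor(unknown, library[name]))
    return best, match_factor(unknown, library[best])

# Toy library of hypothetical spectra, and an unknown = "sugar A" plus noise
library = {
    "sugar A": {60: 100, 73: 80, 103: 35, 145: 10},
    "sugar B": {60: 100, 73: 60, 117: 40, 145: 25},
}
unknown = {60: 95, 73: 85, 103: 30, 145: 12, 91: 5}
name, score = identify(unknown, library)
```

The interesting questions above then become: how does the ranking degrade as the noise model sharpens or the library spectra come from different instruments?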


Interest in computing (Matlab or C/C++). Background in optimization or statistics; some linear algebra. Special interest in topics like regression, machine learning, and filtering methods would be beneficial but not necessary.


W. Demuth, M. Karlovits, and K. Varmuza, “Spectral similarity versus structural similarity: mass spectrometry,” Analytica Chimica Acta (2004): 75–85.

NIST, Mass Spectral Database 2011, National Institute of Standards and Technology, http://www.nist.gov/srd/nist1a.htm, Gaithersburg, MD, 1998

S. Stein, “Chemical substructure identification by mass spectral library searching,” Journal of the American Society for Mass Spectrometry (1995): 644–655.

Laura Lurati (The Boeing Company)

Team 5: Multi-objective design of a fuel tank
June 18, 2012

Keywords of the presentation: optimization, visualization

Algorithms for design optimization are increasingly able to handle complex problem formulations. We will consider the design of a fuel tank consisting of four disciplinary sub-system components: structures, aerodynamics, cost, and systems. This is a multi-disciplinary design problem with multiple competing objectives. We will examine several formulations and how best to match the problem formulation with the choice of optimizer. As the complexity of the problem grows, so does the amount of data generated during an optimization, which adds the challenge of interpreting that data. We will investigate different methods for representing the data. Visualization methods will be considered in two groups: those suited to developing a greater understanding of the problem (for team members) and those suited to presenting results (for customers).

This project is intended to give a flavor of what an industrial mathematician does throughout the lifecycle of a given project. This project has 3 steps:
  1. Model the various components representing each discipline (some components will be provided) and link them together into a system.
  2. Implement a solution to the multi-objective optimization problem, including choosing which problem formulation to use and which optimization libraries to use.
  3. Explore various methods of visualizing data to best communicate differences between designs.
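With competing objectives there is no single best design, only a Pareto front of non-dominated trade-offs, which is exactly what steps 2 and 3 must compute and communicate. A toy sketch (a made-up cylindrical-tank model, not the actual disciplinary components that will be provided):

```python
import math

def pareto_front(points):
    """Non-dominated subset of (f1, f2) pairs, both to be minimized: p is
    dominated if some q is no worse in both objectives and strictly better
    in at least one."""
    def dominated(p):
        return any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
    return [p for p in points if not dominated(p)]

def tank_objectives(r, volume=10.0):
    """Hypothetical cylinder of fixed volume: structural weight scales with
    surface area, drag with frontal area; the two conflict for small radii."""
    length = volume / (math.pi * r * r)
    weight = 2 * math.pi * r * r + 2 * math.pi * r * length
    drag = math.pi * r * r
    return (weight, drag)

# Sweep the radius and keep only the non-dominated designs
designs = [tank_objectives(0.2 + 0.05 * i) for i in range(40)]
front = pareto_front(designs)
```

Plotting `front` against the full design sweep is the simplest of the visualization methods the project will compare; higher-dimensional objective spaces are where the interesting visualization questions begin.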
Prerequisites: All team members must have some computing skills (Matlab, Java, Python, or C++). Some familiarity with (or interest in) design optimization is desired.

References: Schuman, T., De Weck, O., Sobieski, J. (2005), Integrated System-Level Optimization for Concurrent Engineering with Parametric Subsystem Modeling, AIAA 2005-2199.

De Weck, O. (2004), Multidisciplinary System Design Optimization (MSDO): Decomposition and Coupling, Module 6 Notes, MIT OpenCourseWare 16.888 / ESD.77, Massachusetts Institute of Technology.

Cramer, E. J., J. E. Dennis, Jr., P. D. Frank, R. M. Lewis, G. R. Shubin (1994), Problem formulation for multidisciplinary optimization, SIAM Journal of Optimization 4 (4): 754-776.

Efrossini Tsouchnika (Siemens AG)

Team 2: Validation of Service Concepts for Oil Drilling by Simulation
June 18, 2012

Keywords of the presentation: simulation, optimization, operation and maintenance, availability and risks

Significant uncertainty and risk are associated with operation and maintenance costs for oil drilling. The requirement for service concepts is to guarantee high availability and productivity at low service cost during the service period. Failures of the oil rigs, the number of service technicians, the position of the home base, and the availability of spare parts are the relevant parameters influencing the service costs. In addition, various strategies combining scheduled and unscheduled maintenance can be considered. To support this process, simulation methods can be used to establish and optimize operation and maintenance strategies. The outcome of the operational simulation should be the optimum transport and logistics set-up and the optimum number of technicians to cover the service and warranty work, so as to obtain the highest production output at the lowest cost.

The goal of the project is to develop a simulation software that determines service concepts for oil drilling that are optimal with respect to high availability and low costs. The following aspects will be modeled for this project:
  1. Failure prediction
  2. Logistic strategy
  3. Manpower management
  4. Spare parts management
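A discrete-event sketch of aspects 1 and 3 (with made-up failure and repair rates, not field data) shows how fleet availability can be simulated as a function of the technician head-count:

```python
import heapq, random

def availability(n_rigs=10, n_techs=2, horizon=10000.0,
                 mtbf=50.0, mttr=10.0, seed=0):
    """Discrete-event sketch of unscheduled maintenance: each rig fails
    after an exponential up-time (mean mtbf) and waits for one of n_techs
    technicians; repairs take exponential time (mean mttr). Returns the
    fraction of total rig-time the fleet is up."""
    rng = random.Random(seed)
    FAIL, FIXED = 0, 1
    events = [(rng.expovariate(1.0 / mtbf), FAIL, rig) for rig in range(n_rigs)]
    heapq.heapify(events)
    free, waiting, down_since, downtime = n_techs, [], {}, 0.0
    while events:
        t, kind, rig = heapq.heappop(events)
        if t > horizon:
            break
        if kind == FAIL:
            down_since[rig] = t
            if free > 0:                      # dispatch a technician now
                free -= 1
                heapq.heappush(events, (t + rng.expovariate(1.0 / mttr), FIXED, rig))
            else:                             # all technicians busy: queue
                waiting.append(rig)
        else:                                 # repair done; rig back up
            downtime += t - down_since.pop(rig)
            heapq.heappush(events, (t + rng.expovariate(1.0 / mtbf), FAIL, rig))
            if waiting:                       # start the next queued repair
                nxt = waiting.pop(0)
                heapq.heappush(events, (t + rng.expovariate(1.0 / mttr), FIXED, nxt))
            else:
                free += 1
    downtime += sum(horizon - s for s in down_since.values())  # still down at end
    return 1.0 - downtime / (n_rigs * horizon)

# More technicians raise availability, at a higher labor cost:
few, many = availability(n_techs=1), availability(n_techs=5)
```

Wrapping this simulation in an optimizer over technician count, home-base travel times, and spare-part stock levels is the core of the project; the discrete-optimization prerequisites below come in at that stage.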
Prerequisites: Required: statistics, linear optimization, computing skills. Desired: discrete optimization, graph theory.

References: G.L. Nemhauser, L.A. Wolsey, Integer and Combinatorial Optimization, John Wiley & Sons, 1988

Masoud Zaripisheh (University of California)

Team 6: Interactive Treatment Planning in Cancer Radiotherapy
June 18, 2012

Keywords of the presentation: Radiotherapy, Treatment Planning, Multi-objective Optimization

Cancer is the second leading cause of death in the USA, with an estimated 570,000 deaths in 2010. About two thirds of cancer patients in the USA are treated with radiotherapy, since it has proven a particularly effective treatment for many cancer types. Radiation is generated by a medical linear accelerator mounted on a gantry that can deliver the radiation to the patient's body from various orientations, with optimized intensity profiles of the x-ray beams (Fig-1). The main objective of radiotherapy is to deliver a lethal dose of radiation to the tumor, killing cancerous cells while sparing surrounding healthy organs and normal tissues. The treatment is complex and very patient-specific. The radiation beam parameters have to be tailored to each patient's case through a process called treatment planning, in which an optimal treatment plan is designed for a particular patient based on the patient's CT image data and the physician's prescription.

Fig-1: A Medical Linear Accelerator

Fig-2: DVH Curves

The (cumulative) Dose Volume Histogram (DVH) is the most common tool employed by physicians to evaluate the quality of a plan (Fig-2). A point (D, V) on an organ's DVH curve means that V% of that organ's volume receives a radiation dose of more than D (Gy). Treatment planning is a multi-objective optimization problem: delivering the desired radiation dose to the target (PTV) while minimizing the dose to each healthy organ. We propose to use an interactive planning method to solve this problem. Starting from an initial treatment plan, the physician will adjust the tradeoff among the target and the organs. When the physician is satisfied with the DVH curves for some organs, s/he will "lock" these curves and keep working to improve or modify the others. In this project we aim to develop a mathematically innovative and computationally efficient method to solve this important clinical problem.
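Computing a cumulative DVH from an organ's voxel doses is straightforward; a minimal sketch with made-up voxel doses (using "at least D" as the threshold convention):

```python
def dvh(voxel_doses, thresholds):
    """Cumulative DVH: for each dose threshold D (Gy), the percentage V of
    the organ's voxels receiving at least D."""
    n = len(voxel_doses)
    return [(d, 100.0 * sum(1 for x in voxel_doses if x >= d) / n)
            for d in thresholds]

# Toy organ: 8 voxel doses in Gy (hypothetical values)
doses = [10, 20, 20, 30, 40, 50, 60, 70]
curve = dvh(doses, thresholds=[0, 20, 45, 70, 80])
```

In the interactive scheme, "locking" an organ's curve amounts to adding constraints that keep each (D, V) point of its DVH from degrading while the optimizer reshapes the others.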


Strong background in optimization; good computing skills (Matlab or C/C++).

  1. H. Edwin Romeijn and James F. Dempsey, “Intensity modulated radiation therapy treatment plan optimization,” TOP 16 (November 4, 2008): 215-243.
  2. Yong Yang and Lei Xing, “Inverse treatment planning with adaptively evolving voxel-dependent penalty scheme,” Medical Physics 31 (2004): 2839.
  3. Chuan Wu et al., “Treatment plan modification using voxel-based weighting factors/dose prescription,” Physics in Medicine and Biology 48 (August 7, 2003): 2479-2491.
