**Robust portfolio optimization using a cross sectional factor model**
Christopher Bemis (Whitebox Advisors)

September 11, 2009, 1:25pm, 570 Vincent Hall

Video (flv), Slides
Active portfolio management has developed substantially since the formulation of the Capital Asset Pricing Model (CAPM). While the original methodology of portfolio optimization has been lauded, it remains essentially an academic exercise, with practitioners eschewing the suggested weightings. There are myriad reasons for this: nonstationarity of data, insufficiency of modeling parameters, sensitivity of the optimization to small perturbations, and the assumption of uniform investor utility all indicate potential failures in the model.

We present historical market data to exhibit the pitfalls outlined above. From this analysis, we proceed to examine robust portfolio construction methods. One in particular is provided by Goldfarb and Iyengar. We adapt their methodology to a cross sectional model for returns and examine the performance we achieve.
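
The sensitivity pitfall is easy to reproduce: with highly correlated assets, a tiny perturbation of the expected returns swings the unconstrained mean-variance weights dramatically. The following is a minimal sketch with invented numbers, not the speaker's model or data:

```python
# Toy demonstration of mean-variance sensitivity: weights w ~ cov^{-1} mu,
# normalized to sum to one. All inputs are made up for illustration.

def solve2(A, b):
    """Solve a 2x2 linear system A w = b by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def mv_weights(mu, cov):
    """Unconstrained mean-variance weights, normalized to sum to 1."""
    raw = solve2(cov, mu)
    total = sum(raw)
    return [x / total for x in raw]

cov = [[0.040, 0.038],   # two highly correlated assets
       [0.038, 0.040]]
w_base = mv_weights([0.050, 0.050], cov)   # equal expected returns
w_bump = mv_weights([0.051, 0.050], cov)   # +10 bps on asset 1 only

print(w_base)   # symmetric inputs give a 50/50 split
print(w_bump)   # roughly [0.69, 0.31]: 10 bps moves the allocation ~19 points
```

A robust formulation such as Goldfarb and Iyengar's instead optimizes against the worst case over an uncertainty set around the estimated inputs, which damps exactly this kind of swing.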

**Optimization algorithms for applications in industry**
Todd Plantenga (Sandia National Laboratories)

October 23, 2009, 1:25pm, 570 Vincent Hall
The talk will describe the speaker's experience in designing and implementing optimization algorithms for applications in industry. Discussion will focus on a retail price optimization algorithm that uses stochastic methods to handle uncertainties. The project started with a textbook approach (stochastic optimization with recourse) and evolved into a special-purpose solution suited to the business case. The speaker was directly involved in gathering requirements, prototyping a mathematical model and algorithmic approach, investigating issues with real-world data, and writing production software for the final implementation. The completed project currently recommends optimal prices on over 10,000 items per day.
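
The textbook starting point mentioned, stochastic optimization with recourse, can be sketched via sample-average approximation: choose the first-stage price, then average the second-stage (recourse) profit over sampled demand scenarios. The demand curve and every number below are invented for illustration and are not the production model:

```python
# Toy two-stage pricing via sample-average approximation.
# Stage 1: set the price. Stage 2 (recourse): unsold stock is
# salvaged; demand beyond stock is lost.

import random

random.seed(0)
COST, SALVAGE, STOCK = 4.0, 1.0, 100

def demand(price, shock):
    """Hypothetical linear demand curve with a multiplicative shock."""
    return max(0.0, (200 - 15 * price) * shock)

# Sampled demand scenarios (lognormal shocks)
shocks = [random.lognormvariate(0, 0.3) for _ in range(1000)]

def expected_profit(price):
    """Average first-stage revenue plus recourse over all scenarios."""
    total = 0.0
    for s in shocks:
        sold = min(STOCK, demand(price, s))
        total += price * sold + SALVAGE * (STOCK - sold) - COST * STOCK
    return total / len(shocks)

# Coarse first-stage search over a price grid from 5.00 to 12.00
best = max((p / 10 for p in range(50, 121)), key=expected_profit)
print(best, round(expected_profit(best), 2))
```

A real system replaces the grid search with a proper stochastic programming solver and, as the talk describes, eventually with special-purpose methods shaped by the business case.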

Dr. Plantenga obtained a PhD from Northwestern University in 1994, studying large-scale nonlinear optimization methods. He has developed optimization software for PeopleSoft/Oracle, Gap Inc., Ziena Optimization, and his current employer, Sandia National Laboratories.

**Laser ribbon bond loop shape prediction and optimization**
J. Michael Gray (Medtronic)

December 4, 2009, 1:25pm, 570 Vincent Hall

pdf, doc
**Research in applied mathematics at Schlumberger**
Lalitha Venkataramanan (Schlumberger)

December 18, 2009, 1:25pm, 570 Vincent Hall

The search for oil and gas has three objectives: to identify and evaluate hydrocarbon-bearing reservoirs; to bring hydrocarbons to the surface safely and cost-effectively, without harming the environment; and to maximize the yield from each discovery. This talk will focus on some aspects of research in applied mathematics in the area of nuclear magnetic resonance and its application to the oilfield at Schlumberger.

**Mathematics at work in industrial research and development laboratories**
Cristina U. Thomas (Laboratory Manager, Safety, Security and Protection Business Laboratory, 3M)

February 19, 2010, 1:25pm, 570 Vincent Hall

This seminar will focus on the life of a mathematician in a research and development laboratory in an industrial setting. It will describe a series of applications of advanced mathematics without going into the details of the mathematics used in each application. It will give an idea of the challenges faced in problem solving and product development, and will also describe job functions and opportunities for a mathematician in a diversified manufacturing company.

**Toward optimal transport systems**
Natalia Alexandrov (NASA Langley Research Center)

March 26, 2010, 1:25pm, 570 Vincent Hall

Strictly reactive, evolutionary approaches to improving the air transport system no longer suffice in the face of the predicted growth in demand and must be supplemented with active design methods. The ability to actively design, optimize, and control a system presupposes the existence of predictive modeling and reasonably well-defined functional dependencies among the controllable variables of the system and the objective and constraint functions for optimization. We investigate functional relationships that govern the performance of transport systems with the aim of arriving at substantiated modeling, design optimization, and control methods.

**Applying numerical grid generation to the visualization of complex function data**
Bonita V. Saunders (Mathematical and Computational Sciences Division, National Institute of Standards and Technology)

April 9, 2010, 1:25pm, 570 Vincent Hall

Video (flv)
Numerical grid generation, that is, structured grid generation, is the development of a generalized curvilinear coordinate system. Originally designed for solving computational fluid dynamics problems over oddly shaped domains, structured techniques have competed with various unstructured methods such as Voronoi or Delaunay triangulations and quadtree designs. However, the effectiveness of a given grid often depends on how it is used. For complex function visualization problems, the grid generation technique may be less important than how closely the grid lines follow the contours of the function. This talk looks at the use of a tensor product B-spline mapping to generate a boundary/contour fitted mesh that captures significant attributes such as zeros, poles, branch cuts, and other singularities when the mesh is used to plot a complex function surface.
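
To illustrate the idea of a boundary-fitted curvilinear coordinate system, the sketch below uses transfinite (Coons) interpolation, a standard structured-grid construction that is simpler than the tensor product B-spline mapping described in the talk; the domain and its curved edge are invented:

```python
# Toy structured grid: map the unit (u, v) square onto a domain with one
# curved boundary using a Coons patch. Interior grid lines inherit the
# boundary's shape, which is the point of a boundary-fitted mesh.

import math

def bottom(u): return (u, 0.25 * math.sin(math.pi * u))  # curved lower edge
def top(u):    return (u, 1.0)
def left(v):   return (0.0, v)
def right(v):  return (1.0, v)

def coons(u, v):
    """Map (u, v) in the unit square to a point in the physical domain."""
    out = []
    for k in range(2):  # x and y components
        edge = ((1 - v) * bottom(u)[k] + v * top(u)[k]
                + (1 - u) * left(v)[k] + u * right(v)[k])
        corner = ((1 - u) * (1 - v) * bottom(0)[k]
                  + u * (1 - v) * bottom(1)[k]
                  + (1 - u) * v * top(0)[k]
                  + u * v * top(1)[k])
        out.append(edge - corner)  # subtract doubly counted corner terms
    return tuple(out)

# 5x5 grid of physical points; the bottom row follows the sine bump
grid = [[coons(i / 4, j / 4) for i in range(5)] for j in range(5)]
print(grid[0][2])  # midpoint of the curved bottom edge
```

A B-spline mapping plays the same role but gives smooth control over where grid lines cluster, which is what lets the DLMF meshes hug zeros, poles, and branch cuts.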

This work has been used to create over 200 interactive 3D visualizations of complex function surfaces for the NIST Digital Library of Mathematical Functions (DLMF). The NIST DLMF and its hardcopy version, the NIST Handbook of Mathematical Functions, will replace the well-known NBS Handbook of Mathematical Functions edited by Abramowitz and Stegun and first published in 1964.

**Optimal chemical spectroscopy**
Anthony José Kearsley (Mathematical and Computational Science Division, National Institute of Standards and Technology)

April 30, 2010, 1:25pm, 570 Vincent Hall

Video (flv)
Chemical spectroscopy is an invaluable tool in commerce, public safety, health care, national security, and scientific research. In most cases a measurement expert with considerable experience using an ill-defined catalog of heuristic rules is required to optimize the instrument and to interpret the data. Using an expert is expensive, not always reproducible, and introduces uncontrolled bias into the measurement process. Numerical algorithms for nonlinear optimization can supplement, and perhaps even replace, the knowledge of the expert operator. I will discuss my work with one of the most complex and fastest growing of the new chemical spectroscopies: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS). This technique finds broad application in both the biological and materials sciences, from drug discovery to the development of high-performance plastics. I will demonstrate the application of an optimization method to the analysis of a synthetic polymer, one of the most difficult species to analyze by MALDI TOF MS.
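
One family of derivative-free methods that can stand in for an expert's manual tuning loop is pattern (compass) search. The sketch below maximizes an invented smooth "signal quality" surface over two hypothetical instrument settings; it is not the speaker's algorithm or an instrument model:

```python
# Toy compass search: poll +/- step along each coordinate, move to any
# improving point, and halve the step when no poll improves.

def signal_quality(laser, delay):
    """Invented smooth response peaking at laser=2.0, delay=0.5."""
    return -((laser - 2.0) ** 2 + 4 * (delay - 0.5) ** 2)

def pattern_search(f, x, step=1.0, tol=1e-4):
    """Maximize f starting from x using compass polling."""
    while step > tol:
        best, improved = f(*x), False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                val = f(*trial)
                if val > best:
                    x, best, improved = trial, val, True
        if not improved:
            step /= 2  # shrink the poll radius
    return x

settings = pattern_search(signal_quality, [0.0, 0.0])
print([round(s, 3) for s in settings])  # prints [2.0, 0.5]
```

Such methods need only function values (here, a measured signal quality), which is what makes them natural for instrument tuning where no derivatives are available.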

**Identifying and quantifying uncertainty in computational models**
Genetha Anne Gray (Predictive Simulation Research and Development, Sandia National Laboratories)

May 7, 2010, 1:25pm, 570 Vincent Hall

Video (flv)
Despite their obvious advantages, computer simulations also introduce many challenges. Uncertainties must be identified and quantified in order to guarantee some level of predictive capability in a computational model. Calibration techniques can be applied both to improve the model and to curtail the loss of information caused by using simulations in place of the actual system.

Evaluating and calibrating a computer model depends on three main components: experimental data, model parameters, and algorithmic techniques. Data is critical as it defines reality, and inadequate data can render model evaluation procedures useless. Model parameters should be studied to determine which affect the predictive capabilities of the model and which do not; those that do are subject to calibration. Techniques for model evaluation and calibration should both sufficiently sample the design space and limit the computational burden.

In this talk, we will discuss the problems inherent in model calibration and validation processes in terms of data, parameters, and algorithms. In particular, we will focus on suitable optimization and statistical techniques for some specific numerical models from the areas of electrical engineering, medicine, and groundwater control. We will demonstrate some successful and some not-so-successful approaches.
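
In its simplest form, calibration is a least-squares fit of model parameters to experimental data. The following is a minimal sketch with an invented one-parameter decay model and made-up data; the applications in the talk involve far richer models and sampling strategies:

```python
# Toy calibration: find the decay rate k that best matches "experimental"
# data in the least-squares sense. Data and model are invented.

import math

t_data = [0.0, 1.0, 2.0, 3.0]
y_data = [1.00, 0.61, 0.37, 0.22]   # roughly exp(-0.5 * t)

def model(k, t):
    return math.exp(-k * t)

def misfit(k):
    """Sum-of-squares mismatch between model predictions and data."""
    return sum((model(k, t) - y) ** 2 for t, y in zip(t_data, y_data))

# Brute-force search over a fine grid of candidate parameter values;
# real problems use optimization methods that budget expensive simulations.
k_best = min((i / 1000 for i in range(1, 3000)), key=misfit)
print(round(k_best, 3))   # close to 0.5
```

The hard parts the abstract points to begin where this sketch ends: each misfit evaluation may cost hours of simulation, the data carry their own uncertainty, and many parameters compete for a limited evaluation budget.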

Previous Industrial Problems Seminars