Reception and Poster Session

Wednesday, September 7, 2011 - 4:40pm - 7:00pm
Lind 400
  • Locally Low-Rank Promoting Reconstruction Strategies for Accelerated Dynamic MRI Series Applications
    Joshua Trzasko (Mayo Clinic)
    Several recent works
    have suggested that dynamic MRI series reconstructions can be
    significantly improved by promoting low-rank (LR) structure in the
    estimated image series when it is reshaped into Casorati form (e.g., for
    a 2D acquisition, an NxNxT series becomes an N^2xT matrix). When T << N^2,
    the rank of the (reshaped) true underlying image may actually be not much
    less than T. In such cases, aggressive rank reduction will result in
    temporal/parametric blurring, while only modest rank reduction will fail
    to remove noise and/or undersampling artifacts. In this work, we propose
    that a restriction to spatially localized operations can potentially
    overcome some of the challenges faced by global LR promoting methods
    when the row and column dimensions of the Casorati matrix differ
    significantly. This generalization of the LR promoting image series
    reconstruction paradigm, which we call Locally Low Rank (LLR) image
    recovery, spatially decomposes an image series estimate into a
    (redundant) collection of overlapping blocks and promotes that each
    block, when put into Casorati form, be independently LR. As demonstrated for
    dynamic cardiac MRI, LLR-based image reconstruction can simultaneously
    provide improvements in noise reduction and spatiotemporal resolution
    relative to global LR-based methods, and it can be practically realized
    using efficient and highly parallelizable computational strategies.
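
    As a minimal illustration of the block-wise shrinkage described above,
    the following NumPy sketch applies one LLR pass with non-overlapping
    blocks and a fixed soft-threshold tau; the abstract's method uses
    overlapping blocks inside an iterative reconstruction, so the function
    name, block size, and threshold here are illustrative assumptions, not
    the authors' implementation.

        import numpy as np

        def llr_shrink(series, block=8, tau=0.05):
            """One locally low-rank (LLR) shrinkage pass over an N x N x T image
            series: reshape each spatial block into Casorati form (block^2 x T)
            and soft-threshold its singular values. Assumes the image dimensions
            are divisible by `block`; non-overlapping blocks for brevity."""
            N1, N2, T = series.shape
            out = np.empty_like(series)
            for i in range(0, N1, block):
                for j in range(0, N2, block):
                    C = series[i:i+block, j:j+block, :].reshape(-1, T)  # Casorati matrix
                    U, s, Vt = np.linalg.svd(C, full_matrices=False)
                    s = np.maximum(s - tau, 0.0)  # soft-threshold singular values
                    out[i:i+block, j:j+block, :] = ((U * s) @ Vt).reshape(block, block, T)
            return out

    Because every block is processed independently, the double loop is the
    natural unit for the highly parallelizable strategies the abstract
    mentions.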
  • Multivariate Empirical Mode Decomposition
    Danilo Mandic (Imperial College London)
    Recent advances in sensor and data acquisition technologies have brought to light new classes of signals containing typically several data channels. Currently, such signals are almost invariably processed channel-wise, thus not making use of their full potential. It is, therefore, imperative to design multivariate extensions of the existing nonlinear and nonstationary analysis algorithms, as they are expected to give more insight into the dynamics and the interdependence between the multiple channels of the signal at hand. To this end, multivariate extensions of the empirical mode decomposition (EMD) algorithm and their advantages with regard to multivariate nonstationary data analysis are presented. Some important properties of such extensions are also explored, including their ability to exhibit wavelet-like dyadic filter bank structures for white Gaussian noise (WGN) and their capacity to align similar oscillatory modes from multiple channels. Owing to the generality of the proposed methods, an improved multivariate EMD-based algorithm is introduced which solves some inherent problems in the original EMD algorithm. Finally, to demonstrate the potential of the proposed methods, simulations on real-world signals (wind, inertial motion data, and RGB images) are presented to support the analysis.
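
    The direction-projection idea at the heart of the multivariate
    extension can be sketched compactly. The following Python sketch
    estimates the local mean of a multivariate signal by projecting it onto
    a set of unit directions and averaging the resulting interpolated
    envelopes; real MEMD uses low-discrepancy (quasi-uniform) direction
    sets and a full sifting loop, so the random directions and function
    name here are simplifying assumptions.

        import numpy as np
        from scipy.interpolate import CubicSpline

        def memd_local_mean(x, n_dirs=16, seed=0):
            """Local mean of a multivariate signal x (shape T x C): project onto
            unit directions, spline the multivariate samples at each projection's
            extrema, and average the upper/lower envelopes over directions."""
            T, C = x.shape
            t = np.arange(T)
            rng = np.random.default_rng(seed)
            dirs = rng.normal(size=(n_dirs, C))
            dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
            env_sum, used = np.zeros((T, C)), 0
            for d in dirs:
                p = x @ d  # scalar projection onto direction d
                mx = np.flatnonzero((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])) + 1
                mn = np.flatnonzero((p[1:-1] < p[:-2]) & (p[1:-1] < p[2:])) + 1
                if len(mx) < 4 or len(mn) < 4:  # too few extrema to spline
                    continue
                upper = CubicSpline(mx, x[mx])(t)  # envelope through maxima
                lower = CubicSpline(mn, x[mn])(t)  # envelope through minima
                env_sum += (upper + lower) / 2
                used += 1
            return env_sum / max(used, 1)

    Subtracting this local mean from x and iterating is the multivariate
    analogue of the sifting step that aligns oscillatory modes across
    channels.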
  • An Optimization-Based Empirical Mode Decomposition
    Boqiang Huang (Universität Paderborn), Angela Kunoth (Universität Paderborn)
    In the data analysis community, many recent methods are
    based on the so-called Empirical Mode Decomposition (EMD), and
    different methods have been proposed to decompose non-linear and
    non-stationary signals sampled on non-uniform grids effectively. The
    traditional EMD employs a cubic spline method to interpolate the
    envelope based on the extrema of the data. This method may cause
    over- and undershoots, a fundamental drawback since these may
    destroy some physical properties of the intrinsic mode functions. In
    order to generate a strictly mathematically defined envelope, we propose
    an optimization-based empirical mode decomposition (OEMD).

    We demonstrate how to extend our OEMD method from
    one-dimensional signals to multi-dimensional data. We illustrate with
    several numerical examples that our method is superior to others
    with respect to different criteria like relative errors or
    the extraction of texture.

    Furthermore, we employ our optimization-based interpolation in
    normalization-based instantaneous frequency analyses, which shows its
    potential especially for non-uniformly sampled data.
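
    The abstract does not state the exact optimization problem, so the
    following is only a guessed, simplified sketch of an optimization-based
    envelope: stay close to the signal and be smooth (penalized second
    differences) while never dropping below it, which rules out
    undershooting by construction. The objective, the parameter lam, and
    the function name are assumptions, not the authors' OEMD formulation.

        import numpy as np
        from scipy.optimize import minimize

        def upper_envelope(x, lam=10.0):
            """Smooth upper envelope of a 1-D signal x, found as a constrained
            minimizer: fidelity to x plus a smoothness penalty, subject to the
            envelope lying on or above x at every sample."""
            n = len(x)
            D2 = np.diff(np.eye(n), n=2, axis=0)  # second-difference operator
            obj = lambda e: float(np.sum((e - x) ** 2) + lam * np.sum((D2 @ e) ** 2))
            jac = lambda e: 2.0 * (e - x) + 2.0 * lam * (D2.T @ (D2 @ e))
            cons = {"type": "ineq", "fun": lambda e: e - x}  # e(t) >= x(t) pointwise
            res = minimize(obj, x0=x + 1e-3, jac=jac, constraints=[cons], method="SLSQP")
            return res.x

    The lower envelope follows by symmetry as -upper_envelope(-x), and the
    mean envelope used in the sifting step is their average.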
  • Using 2D-EEMD to Understand African Easterly Waves and Their Role in the Initiation and Development of Tropical Storms/Hurricanes
    Norden Huang (National Central University), Man-Li Wu (National Aeronautics and Space Administration (NASA))
    We are using the 2D-EEMD to increase our understanding of the
    relationships between the African Easterly Waves and the initiation
    and development of tropical storms/hurricanes over the Northern
    Atlantic Ocean.

    We are using large-scale parameters including zonal and meridional
    wind, sea surface temperature, atmospheric stability parameters,
    ocean heat capacity, relative humidity, low-level vorticity, and
    vertical wind shear to carry out our studies. We will focus on case
    studies during July, August, and September of 2005 and 2006.

    by Man-Li C. Wu (1), Siegfried D. Schubert (2), and Norden E. Huang (3)

    (1) and (2) NASA/GSFC/GMAO
    (3) NCU, Taiwan.
  • Toward a Real-Time Implementation of Adaptive and Automated Digital Image Analysis
    Nii Attoh-Okine (University of Delaware)
    Empirical Mode Decomposition is a multi-resolution data analysis technique that can break down a signal or image into different time-frequency modes that uniquely reflect the variations in the signal or image. The algorithm has gained much attention lately due to its performance in a number of applications (especially in climate and biomedical data analysis).

    Recently, civil infrastructure managers have begun exploring the potential application of the algorithm to automate the process of detecting cracks in infrastructure images. Unfortunately, the adaptive nature of the algorithm increases its computational cost to an extent that limits wide practical application.

    The EMD algorithm involves four main steps: extrema detection, interpolation, sifting, and reconstruction. Extrema detection and interpolation consume about 70% of the computational time. Hence, we focus on ways to implement these procedures in parallel by taking advantage of the MATLAB Parallel Computing Toolbox.
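
    The abstract targets the MATLAB Parallel Computing Toolbox; as a
    language-neutral illustration of the same idea, this hedged Python
    sketch parallelizes the two dominant stages (extrema detection and
    envelope interpolation) across independent image rows. The function
    names are illustrative assumptions.

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor
        from scipy.interpolate import CubicSpline

        def row_envelopes(row):
            """Extrema detection plus cubic-spline envelope interpolation for a
            single image row -- the two stages the abstract identifies as roughly
            70% of the computational time."""
            t = np.arange(len(row))
            mx = np.flatnonzero((row[1:-1] > row[:-2]) & (row[1:-1] > row[2:])) + 1
            mn = np.flatnonzero((row[1:-1] < row[:-2]) & (row[1:-1] < row[2:])) + 1
            if len(mx) < 4 or len(mn) < 4:  # too few extrema to fit splines
                return row.copy(), row.copy()
            return CubicSpline(mx, row[mx])(t), CubicSpline(mn, row[mn])(t)

        def parallel_envelopes(image):
            """Rows are independent, so this stage is embarrassingly parallel --
            the analogue of a parfor loop over rows in MATLAB."""
            with ProcessPoolExecutor() as ex:
                pairs = list(ex.map(row_envelopes, image))
            upper = np.stack([u for u, _ in pairs])
            lower = np.stack([l for _, l in pairs])
            return upper, lower

        if __name__ == "__main__":
            img = np.random.rand(64, 256)
            upper, lower = parallel_envelopes(img)  # per-row envelopes, in parallel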