October 28 - November 1, 2013
Suppose that ball-shaped sensors wander in a bounded and contractible domain. A sensor doesn't know its location but does know when it overlaps a nearby sensor. We say that an evasion path exists in this sensor network if a moving evader can avoid detection. Vin de Silva and Robert Ghrist give a necessary condition, depending only on the time-varying connectivity graph of the sensors, for an evasion path to exist. Can we sharpen this result? We show that the existence of an evasion path depends not only on the fibrewise homotopy type of the region covered by sensors but also on its embedding in spacetime. Furthermore, we study the entire space of evasion paths using a homotopy spectral sequence for diagrams of spaces.
Keywords of the presentation: numerical analysis, finite element, Hodge theory, de Rham cohomology
This talk will discuss a substantial interplay of algebraic topology with numerical analysis which has developed over the last decade. During this period, de Rham cohomology and the Hodge theory of Riemannian manifolds have come to play a crucial role in the development and understanding of computational algorithms for the solution of problems in partial differential equations. Hodge theory is at the foundation of the well-posedness of many important problems in partial differential equations, but only recently has it been understood how the stable numerical solution of PDE problems often depends on capturing the correct topological structures at the discrete level. This may be accomplished by constructing subcomplexes of the de Rham complex which consist of finite element differential forms. These latter have become a key technology in scientific computation, but first appeared in algebraic topology in the works of Whitney and Sullivan half a century ago. Interactions of topology and numerical analysis are not just applications of the former to the latter, but also occur in the reverse direction. Recently, the theory of superconvergence developed for finite element methods was used to settle a question concerning the combinatorial codifferential posed by Dodziuk and Patodi in 1976.
Keywords of the presentation: configuration spaces, Betti numbers, generating functions
I will discuss some recent results on the (co)homology structure of the "exotic configuration spaces": the spaces of particles where (k+1) particles are not allowed to occupy the same position, but k may collide (no-k-equal spaces); or the version with particles of several kinds.
Methods for grouping data based on eigenvectors of the graph Laplacian have gained significant popularity due to their
strong empirical performance and amenability to theoretical analysis.
However, while algorithms for bi-partitioning are very simple and well-understood,
the situation with multi-way clustering is more complicated. Typical methods rely on k-means clustering or similar methods
whose performance significantly depends on initialization.
In this talk I will take a look at the problem of spectral clustering from a different angle,
by using ideas of geometric recovery related to those of Independent Component Analysis.
It turns out that due to the special nature of the clustering problem we can provide a large set of
"contrast functions" with theoretical guarantees for clustering.
I will discuss the theoretical results, practical algorithms and some applications.
Joint work with Luis Rademacher and James Voss.
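As a point of reference for the bi-partitioning case mentioned above, here is a minimal sketch (a generic textbook construction, not the speaker's algorithm) of spectral bi-partitioning by the sign of the Fiedler vector; the graph and threshold are our own toy choices:

```python
import numpy as np

def fiedler_bipartition(A):
    """Bi-partition a graph by the sign of the Fiedler vector, i.e. the
    eigenvector of the second-smallest eigenvalue of the graph Laplacian."""
    L = np.diag(A.sum(axis=1)) - A  # combinatorial Laplacian L = D - A
    vals, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    return vecs[:, 1] >= 0          # eigenvalue 0 carries the constant vector

# Two triangles joined by a single bridge edge: nodes 0-2 and nodes 3-5.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
labels = fiedler_bipartition(A)  # the two triangles get opposite labels
```

Multi-way clustering typically replaces the sign threshold with k-means on several eigenvectors, which is exactly the initialization-sensitive step the talk revisits.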
Keywords of the presentation: discrete differential geometry, vector field design, functional operators
A fundamental task in discrete differential geometry is the choice of representation of differential quantities. Different representations will lead to different optimization problems when using DDG in practice, and can considerably influence the scope of feasible applications.
We will describe a novel choice of representation, based on functional operators, which, although standard in classical differential geometry, has only recently been applied to DDG.
We will show how operators which take scalar functions to scalar functions can be used to concisely represent smooth maps between shapes and smooth vector fields. We further demonstrate that by using a common operator representation, the intimate connection between maps and vector fields can be leveraged to easily compute vector fields which fulfill intricate global constraints, such as Killing and symmetric vector fields. Finally, we show how the spectral decomposition of the Laplace-de Rham operator can be leveraged for combining local and global vector field constraints in a unified framework.
Image segmentation labels a region of an image as an object. However, thin structures are often missed due to noise. We present a method to capture such structures using a priori information about the object's topology. Based on persistent homology, our method also provides a confidence measure of whether the restored structures come from the true signal. We apply the method to vision, graphics and medical imaging.
This Fall 2013 I am teaching a free online course MATH:7450 (22M:305)
Topics in Topology: Scientific and Engineering Applications of
Algebraic Topology offered through the Mathematics Department and
Division of Continuing Education at University of Iowa.
Goal: To prepare students and other researchers for the IMA Thematic
Year on Scientific and Engineering Applications of Algebraic Topology,
but all interested participants are welcome
Target Audience: Anyone interested in topological data analysis
including graduate students, faculty, industrial researchers in
bioinformatics, biology, business, computer science, cosmology,
engineering, imaging, mathematics, neurology, physics, statistics,
etc.
If you are interested in helping to teach a similar course in the
spring, please let me know.
More information about the Fall 2013 course can be found at
www.math.uiowa.edu/~idarcy/AppliedTopology.html
Keywords of the presentation: Homology basis, Rips complex, Point cloud, Sparsification
The efficiency of extracting topological information from point data depends largely on the complex that is built on top of the data points. From a computational viewpoint, the most favored complexes for this purpose have so far been Vietoris-Rips and witness complexes. While the Vietoris-Rips complex
is simple to compute and is a good vehicle for extracting the topology of sampled spaces, its size is huge--particularly in high dimensions. The witness complex, on the other hand, enjoys a smaller size because of subsampling, but fails to capture the topology in high dimensions unless extra structures are imposed.
We investigate a complex called the "graph induced complex" that, to some extent, enjoys the advantages of both. It works on a subsample but still retains the power of capturing the topology that the Vietoris-Rips complex has. It only needs a graph connecting the original sample points, from which it builds a
complex on the subsample, thus taming the size considerably. We show that, using the graph induced complex, one can (i) infer the one-dimensional homology of a manifold from a very lean subsample, (ii) reconstruct a surface in three dimensions from a sparse subsample without computing Delaunay triangulations, and (iii) infer the persistent homology groups of compact sets from a sufficiently dense sample. We provide experimental evidence in support of our theory.
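To make the construction concrete, here is a minimal sketch of the 1-skeleton of a graph induced complex; the point set, subsample, radius, and nearest-point map below are our own illustrative choices, not the authors' implementation:

```python
import numpy as np

def gic_edges(points, subsample_idx, r):
    """Edges of the 1-skeleton of a graph induced complex (sketch):
    build the r-neighborhood graph on all points, map each point to its
    nearest subsample point, and add an edge between two subsample points
    whenever some graph edge joins their preimages."""
    P = np.asarray(points)
    Q = P[subsample_idx]
    # nearest-subsample map nu : P -> Q
    nu = np.argmin(((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1), axis=1)
    edges = set()
    for i in range(len(P)):
        for j in range(i + 1, len(P)):
            if np.linalg.norm(P[i] - P[j]) <= r and nu[i] != nu[j]:
                edges.add(tuple(sorted((int(nu[i]), int(nu[j])))))
    return sorted(edges)

# Eight points on a circle, subsampled to every other point: the induced
# 1-skeleton is a 4-cycle, retaining the loop seen by the full graph.
theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)
pts = np.c_[np.cos(theta), np.sin(theta)]
print(gic_edges(pts, [0, 2, 4, 6], r=0.9))
```

The subsample complex here has half the vertices but the same one-dimensional homology as the neighborhood graph on the full sample, which is the point of the construction.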
A special family of non-trivial loops on a surface, called handle and tunnel loops, is closely associated with the geometric features of "handles" and "tunnels" respectively in a 3D model. The identification of these handle and tunnel loops can benefit a broad range of applications. Many of the existing efficient algorithms for computing non-trivial loops cannot be used to compute these special types of loops. The two algorithms known for computing handle and tunnel loops have a serious drawback: they both require a tessellation of the interior and exterior spaces bounded by the surface, which is both hard and expensive. We present an efficient algorithm to compute a basis for handle and tunnel loops without requiring any 3D tessellation. This saves considerable time for large meshes, making the algorithm
scalable while computing the loops on the original input mesh and not on some refined version of it. We use the concept of the Reeb graph which, together with several key theoretical insights on linking numbers, provides an initial set of loops that provably constitute a handle and a tunnel basis.
This work is done by Tamal Dey, Fengtao Fan and Yusu Wang.
In the mid-1970s Tonti (following the work of Branin and Kron
from the 1950s and '60s) discovered that the discrete counterparts
of the physical laws of electromagnetism can be written using the
language of algebraic topology. They involve cochains on a pair of
cell complexes, one dual to the other, and (co)boundary operators.
As a result, a system of algebraic equations is obtained instead of
a system of standard PDEs. Therefore, for a given complex, no
discretization step is needed and physical laws are enforced exactly
on the given mesh.
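The flavor of the cochain formulation can be sketched on a single square cell (a toy illustration of our own; Tonti's framework involves a pair of dual complexes and the full set of electromagnetic variables):

```python
import numpy as np

# A single square cell: 4 nodes, 4 oriented edges, 1 face.
# d0: node-to-edge coboundary (discrete gradient); d1: edge-to-face
# coboundary (discrete curl).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
d0 = np.zeros((4, 4))
for k, (i, j) in enumerate(edges):
    d0[k, i], d0[k, j] = -1, 1
d1 = np.array([[1, 1, 1, 1]])  # the face boundary traverses all four edges

# d1 @ d0 == 0: the discrete identity "curl grad = 0" holds exactly
# on the mesh, with no discretization error.
print(d1 @ d0)
```

Because d1 @ d0 vanishes identically, a field given as a discrete gradient satisfies the discrete circulation law around every cell exactly, which is the sense in which the laws are "enforced exactly in the given mesh."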
The discrete counterparts of the laws of electromagnetism are usually
enforced locally, for each cell of the complex. One must therefore
ensure that the laws still hold for an arbitrary collection of cells.
In some of the standard formulations, this trivially holds, as
nonlocal laws are linear combinations of local laws. However, it does
not hold for the numerically efficient scalar-potential-based method,
which requires information about the topology of the insulating
region to be computed in the form of so-called cuts.
The engineering community spent more than two decades trying to
define and compute the cuts in a simple and fast way, without
success. Only recently did we show that the cuts are a first
cohomology basis of the insulating subcomplex. In this poster we will
give an idea of the considered formulation and explain why cohomology
generators are needed to make it work. We will then present the ideas
we use to efficiently compute the cohomology basis for
industrial-sized meshes. This effectively solves what has been
considered an open problem for more than twenty-five years.
Keywords of the presentation: Combinatorial Laplacian, maximum principle, Harnack inequality, Cheeger inequality
I will review several examples where techniques motivated by ideas in
Partial Differential Equations and Differential Geometry lead to interesting
results about difference equations on graphs, most notably the combinatorial
Laplacian. Examples will include the maximum principle, Harnack inequality,
Cheeger's inequality and a recent result about surjectivity of combinatorial
Laplacian for infinite graphs.
Keywords of the presentation: Debye source, Maxwell Equations, Lagrangian subspace of H^1, Fredholm equation
Using Hodge theory for a surface, we obtain a new representation for solutions to the time-harmonic Maxwell equations in both bounded and unbounded domains of R^3. This representation has no spurious resonances, does not suffer from low-frequency breakdown, and reduces standard boundary value problems to Fredholm equations of the second kind. There are many interactions between the topology of the boundary and aspects of this representation. The representation also provides a method to find force-free Beltrami fields, and the spectra for a family of self-adjoint boundary problems for the curl operator parametrized by Lagrangian subspaces of the first cohomology group of the boundary.
Keywords of the presentation: Harmonic measure, sensor networks, source location privacy
Random walk on a graph is a Markov chain and thus is 'memoryless': the next node to visit depends only on the current node and not on the sequence of events that preceded it. With these properties, random walks and their many variations have been used in network routing to 'randomize' the traffic pattern and hide the location of data sources. In this paper we examine a myth in the common understanding of the memoryless property of a random walk applied to protecting source location privacy in a wireless sensor network. In particular, if one monitors only the network boundary and records the first boundary node hit by a random walk, this
distribution can be related to the location of the source node. For the scenario of a single data source, a very simple algorithm that integrates along the network boundary would reveal the location of the source. We also develop a generic algorithm to reconstruct the source locations for various sources that have simple descriptions (e.g., k source locations, sources on a line segment, sources in a disk). This represents a new type of traffic analysis attack for invading sensor data location privacy and essentially re-opens the problem for further examination.
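The single-source case can be illustrated by the simplest network, a path graph with absorbing endpoints; this gambler's-ruin sketch (our own toy example, not the paper's algorithm) shows how the boundary-hit distribution pins down the source:

```python
import numpy as np

# Walk on the path 0..n with absorbing endpoints: by gambler's ruin the
# probability of exiting at the left end from source s is (n - s)/n, so
# an observer of the boundary-hit distribution can recover s exactly.
n = 10

def left_exit_prob(s, n=n):
    # Hitting probabilities h solve the discrete Laplace equation
    # h[k] = (h[k-1] + h[k+1])/2 on the interior, with h[0]=1, h[n]=0.
    A = np.eye(n - 1)
    b = np.zeros(n - 1)
    for k in range(1, n):
        if k > 1:
            A[k - 1, k - 2] = -0.5
        if k < n - 1:
            A[k - 1, k] = -0.5
    b[0] = 0.5                    # contribution of the boundary value h[0] = 1
    return np.linalg.solve(A, b)[s - 1]

p = left_exit_prob(3)             # equals (10 - 3)/10 = 0.7
s_recovered = round(n * (1 - p))  # inverting the formula reveals the source
```

On a 2D sensor field the same idea appears as harmonic measure on the boundary, which is what the paper's boundary-integration attack exploits.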
Correlations in neural spiking activity are an important tool for understanding neuronal populations. Pairwise correlations can be organized into graphs, with edges reflecting the levels of correlation between pairs of neurons. In brain areas with convex receptive fields or place fields, the structure of the neural code has strong implications for the structure of cliques in these graphs. For example, for hippocampal place cell activity during spatial exploration, pairwise correlation graphs are expected to have low-dimensional topology, due to the arrangement of receptive fields in a low-dimensional environment. However, during more general activities, there is no reason to expect low-dimensional topology to appear in network activity. Using topological data analysis, we found that, remarkably, under a variety of behavioral conditions (open field exploration, wheel running, and sleep) hippocampal correlation graphs carry low-dimensional topology, as revealed by comparing their clique structure to that of geometric random graphs with matching parameters. This suggests that neural activity in hippocampus intrinsically codes low-dimensional structures.
Reproducing kernel Hilbert spaces (RKHS) have recently emerged as a powerful mathematical framework for many problems in machine learning, statistics, and their applications.
This poster is based on work presented at the International Conference on Machine Learning (ICML 2013). It gives a formulation in vector-valued RKHS that provides a unified treatment of several important machine learning approaches. Two of these are: (i) vector-valued Manifold Regularization, which seeks to exploit the geometry of the input data via unlabeled examples using vector-valued graph Laplacian, and (ii) Multi-view Learning, which attempts to integrate different features and modalities in the input data. The mathematical formulation will be accompanied by numerical results on several challenging multi-class classification problems.
In this talk I will show how tools from algebraic topology, specifically homology and cohomology theory, can be used to study a range of problems in sensor networks, from distributed, location-free coverage verification to distributed estimation based on pairwise relative measurements. I will show that sparse homology generators correspond to the "locations" of coverage holes, even when coordinates are not available. Furthermore, I will show that sparse cohomology generators can be used to detect outliers when one needs to aggregate noisy relative measurements for distributed estimation. Next, I will show how Hodge theory and combinatorial Laplacians can be used to define random walks and Dirichlet problems on simplicial complexes, resulting in the introduction of a simplicial PageRank for measuring the importance of higher-order cells.
We describe methods for finding discrete harmonic forms and those forms in a specified cohomology class (cohomologous to a given element). We demonstrate the eigenvector method for finding a basis for the space of discrete harmonic forms. For cohomologous harmonic forms, we formulate a weighted least squares method, justified by proving a discrete Hodge-de Rham theorem on the isomorphism between the space of harmonic cochains and cohomology. We also outline two projection methods that, given a harmonic basis, can project it to the required cohomology class; however, they require representative elements from either all cohomology classes or all homology classes. These discrete harmonic forms have potential applications in the solution of partial differential equations in exterior calculus, computational topology and computer graphics. We show an application in finite element exterior calculus and data analysis. Joint work with Anil N. Hirani, Seth Watts and Han Wang.
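A minimal sketch of the eigenvector method on the simplest nontrivial example (a toy complex of our own, not the authors' code): harmonic 1-cochains on a combinatorial circle form the kernel of the Hodge 1-Laplacian, which here reduces to d0 @ d0.T since there are no 2-cells.

```python
import numpy as np

n = 6                                     # cycle graph: 6 vertices, 6 edges
d0 = np.zeros((n, n))                     # vertex-to-edge coboundary matrix
for k in range(n):
    d0[k, k], d0[k, (k + 1) % n] = -1, 1  # edge k runs from vertex k to k+1
L1 = d0 @ d0.T                            # Hodge 1-Laplacian (no 2-cells)
vals, vecs = np.linalg.eigh(L1)
harmonic = vecs[:, np.abs(vals) < 1e-10]  # eigenvectors with eigenvalue ~ 0
# dim ker L1 equals the first Betti number of the circle, which is 1;
# the harmonic basis vector is the cochain winding once around the loop.
```

The same eigenvector computation, applied to the k-th Hodge Laplacian of a general complex, yields a basis of harmonic k-cochains, one per cohomology class.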
Keywords of the presentation: Stochastic, average currents, Kirchhoff's network theorem, master equation, homology
Abstract: An area of interest in statistical mechanics is the study of
statistical distributions of stochastic currents generated in graphs.
It turns out that this problem amounts to the study of periodic families
of Markov processes that evolve according to the "master equation."
Physicists have observed that, for almost every generated current,
quantization occurs in the "adiabatic" and "low temperature" limits.
My main goal in this talk will be to explain how this story can be understood using
the standard tools of algebraic topology.
Let U be a compact, negatively curved surface. From the
(finite) set of all closed geodesics on U of length less than L
choose one, and let N(L) be the number of its self-intersections.
There is a positive constant K such that, with overwhelming
probability as L grows,
N(L)/L^2 approaches K.
This talk will concern itself with "fluctuations". The main theorem
states that if U has variable negative curvature then
(N(L) - K L^2)/L^{3/2}
converges in distribution to a Gaussian law,
but if U has constant negative curvature then
(N(L) - K L^2)/L
converges in distribution to a (probably) non-Gaussian law.
Keywords of the presentation: Nonlocal operators; vector calculus; volume-constrained problems; balance laws; peridynamic mechanics; nonlocal diffusion
A vector calculus for nonlocal operators is developed, including the definition of nonlocal divergence, gradient, and curl operators and the derivation of the corresponding adjoint operators. Nonlocal analogs of several theorems and identities of the vector calculus for differential operators are also presented. Relationships between the nonlocal operators and their differential counterparts are established, first in a distributional sense and then in a weak sense by considering weighted integrals of the nonlocal adjoint operators. The operators of the nonlocal calculus are used to define volume-constrained problems that are analogous to elliptic boundary-value problems for differential operators; this is demonstrated via some examples. Another application discussed is the posing of abstract nonlocal balance laws and deriving the corresponding nonlocal field equations; this is demonstrated for heat conduction and the peridynamic mechanics model.
Keywords of the presentation: Finite-Element Exterior Calculus, Discrete Variational Mechanics, Multisymplectic Field Theories
Many gauge field theories can be described using a multisymplectic Lagrangian formulation, where the Lagrangian density involves space-time differential forms. While there has been much work on finite-element exterior calculus for spatial and tensor product space-time domains, there has been less done from the perspective of space-time simplicial complexes. One critical aspect is that the Hodge star is now taken with respect to a pseudo-Riemannian metric, and this is most naturally expressed in space-time adapted coordinates, as opposed to the barycentric coordinates that Whitney forms (and their higher-degree generalizations) are typically expressed in terms of.
We introduce a novel characterization of Whitney forms and their Hodge dual with respect to a pseudo-Riemannian metric that is independent of the choice of coordinates, and then apply it to a variational discretization of the covariant formulation of Maxwell's equations. Since the Lagrangian density for this is expressed in terms of the exterior derivative of the four-potential, the use of finite-dimensional function spaces that respects the de Rham cohomology results in a discretization that inherits the gauge symmetries of the continuous problem. This then yields a variational discretization that exhibits a discrete Noether's theorem, which implies that an associated multi-momentum is automatically conserved by the discretization.
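For reference, the standard lowest-degree case being generalized: in barycentric coordinates $\lambda_i$, the Whitney 1-form attached to the edge $[i,j]$ of a simplex is

```latex
\omega_{ij} \;=\; \lambda_i \, d\lambda_j \;-\; \lambda_j \, d\lambda_i .
```

It is this barycentric expression that the coordinate-independent characterization replaces, so that the Hodge star with respect to a space-time (pseudo-Riemannian) metric can be applied directly.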
Keywords of the presentation: random walks, Cheeger inequality, simplicial complexes
I will discuss random walks on simplicial complexes, Cheeger inequalities,
and mixing times. The talk will summarize the results of two papers that explore
how random walks on graphs extend to simplicial complexes. I will also discuss two papers that study higher-order Laplacians and expander properties.
Two possible applications in machine learning will be discussed: spectral clustering and label (edge) propagation.
Joint work with John Steenbergen
Keywords of the presentation: game theory, potential games, harmonic/potential decomposition
Potential games are a special class of games that allow for tractable dynamic analysis. Intuitively, games that are "close" to a potential game should share similar properties. In this talk, we formalize and develop this idea by quantifying how the dynamical features of potential games extend to near-potential games. Towards this goal, we discuss a novel flow representation of arbitrary finite games as a canonical direct sum decomposition into three components, which we refer to as the potential, harmonic and nonstrategic components.
Our results show that the limiting behavior of better-response and best-response dynamics can be characterized in terms of the approximate equilibrium set of a close potential game. Moreover, the size of this set is proportional to a closeness measure between the original game and the potential game. Analogous results are also shown to hold for other dynamical processes, such as logit response. Our approach presents a systematic framework for studying convergence behavior of adaptive learning dynamics in finite strategic form games.
Joint work with Ozan Candogan (Duke) and Asu Ozdaglar (MIT).
We introduce a novel method to detect particular non-local structures, akin to weighted holes within the link-weight network fabric, invisible to existing methods. These divide weighted networks into two broad classes: one characterized by small hierarchically nested holes, and a second displaying larger and longer-living inhomogeneities. These classes cannot be reduced to known local network properties, due to the mesoscopic nature of homological properties. Finally, we show that the classification finds a correlate in the adjacency and Laplacian spectral gaps, linking topological and dynamical network properties. This is work with M. Scolamiero, I. Donato and F. Vaccarino (PLoS ONE 8(6): e66506).
This work deals with combinatorial structures installed on cell complexes, allowing fast computation of a great variety of algebraic topological invariants. Within a cell or digital context, and using some relevant examples up to four dimensions, we show some algorithms for computing a Homological Spanning "Forest" (or HSF, for short) for an object. Such a graph-based structure is a kind of topological "dense" skeleton, installed on the 1-skeleton of the first subdivision of the cell complex. From such a sort of "veinerization", it is possible to compute different algebraic topological invariants: not only Betti numbers and representative elements of homology generators but also advanced algebraic relations between homology generators (cup product, (co)homology operations, A(infinity)-coalgebra structure of the homology, ...). HSF-structures are well adapted to homotopical analysis and to controlling the geometric evolution of curves and surfaces within the object. Finally, a theoretical framework for digital image topological processing based on these new concepts is proposed.
Keywords of the presentation: Hodge theory, cohomology, scale, metric space
Hodge theory is a beautiful synthesis of geometry, topology, and analysis which has been developed in the setting of Riemannian manifolds. However, many spaces important in applications do not fit this framework. This motivates us to develop a version of Hodge theory on metric spaces with a probability measure.
The goal here is to obtain a theory which is sensitive to features which are present at a given scale. Our approach is based on an Alexander-Spanier replacement of differential forms, with a localization with parameter a.
We discuss that, unfortunately, for completely general metric spaces the theory does not have all the nice analytic features one would hope for.
On the other hand, as the main test case, we provide conditions on the local structure of the metric measure space which imply that the a-scale cohomology is isomorphic to ordinary
cohomology. In particular, this applies to Riemannian manifolds (if we choose the scale small enough in terms of the injectivity radius and the sectional curvature).
A planar dumbbell shaped domain splits into two components if the connecting "neck" vanishes. This is reflected in the eigenvalues of the scalar Laplacian on such domains. On the original dumbbell the scalar Laplacian has one 0 eigenvalue, however without the neck it has two. Calabi, Cheeger, and Buser studied what happens as the neck becomes thinner. Buser showed that the smallest nonzero eigenvalue is bounded above by a polynomial in the Cheeger constant which is proportional to the width of the "neck" (as the neck becomes small). Thus the smallest nonzero eigenvalue "anticipates" the arrival of a topological change (one component changing to two). We create a solid dumbbell by rotating the planar dumbbell and observe that the smallest eigenvalue of the 1-Laplacian goes to zero as the middle is made thinner. Thus that eigenvalue "anticipates" the arrival of a different type of topological change (a solid ball changing to a solid torus). Furthermore, we find an upper bound on the rate that the eigenvalue approaches zero as thickness of the middle goes to zero. This work is done as part of an investigation into generalizing the Cheeger-Buser inequality for higher Laplacians. Joint work with Douglas Arnold, Anil N. Hirani, and Sayan Mukherjee.
Multipersistent homology is a natural generalization of persistent homology introduced by G. Carlsson and A. Zomorodian. Using this method a point cloud can be analyzed through more than one filtration parameter. For example, one can investigate a point cloud through ellipsoids instead of balls in the Vietoris-Rips complex. A filtration of simplicial complexes is replaced by a multifiltration, that is, a family of simplicial complexes indexed by r-tuples of natural numbers. Because of the algebraic structure underlying multipersistence, a complete discrete invariant generalizing the barcode is not possible. Up to now the most common computable invariant is the rank invariant. In this poster we recast multidimensional persistence in a homological algebra setting. In particular, we describe multipersistence modules as compact objects in the category of functors from r-tuples of natural numbers to vector spaces. We give a constructive way of building minimal resolutions and locally calculate multigraded Betti numbers.
Geometry as the study of observable symmetries and dynamical invariants is de facto the lingua franca of the Hamiltonian theories. The prevailing paradigm in modeling of the complex large-scale physical systems is network modeling. In many problems arising from modern science and engineering, such as multi-body systems, electrical networks and molecular dynamics, the port-based network modeling is a natural strategy of decomposing the overall system into subsystems, which are interconnected to each other through pairs of variables called ports and whose product is the power exchanged between the subsystems. The formalism that unifies the geometric Hamiltonian and the port-based network modeling is the port-Hamiltonian, which associates with the interconnection structure of the network a geometric structure given by a Poisson, or more generally, a Dirac structure.
In this poster I address (i) the issue of structure-preserving discretization of distributed-parameter port-Hamiltonian systems on bounded domains, and (ii) consider the avenues that lead towards more data-driven analysis of complex systems.
This poster presents recent research on the theoretical foundation for using Hodge Laplacians on simplicial complexes for data analysis. In the past, the graph Laplacian has been successfully used for clustering and dimension
reduction, with these applications motivated by certain theoretical
advances in spectral graph theory. Our main result provides a partial
theoretical foundation for understanding the spectrum of the highest-order
Hodge Laplacian on a simplicial complex. A simple algorithm is devised
and applied to a toy example to demonstrate how this result might be used
for data analysis.
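As a toy illustration of the kind of computation involved (an example of our own, not the poster's algorithm): the Hodge 1-Laplacian of a complex made of two triangles sharing an edge, with only one triangle filled in, has a one-dimensional kernel detecting the unfilled loop.

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
d0 = np.zeros((len(edges), 4))    # vertex-to-edge coboundary
for k, (i, j) in enumerate(edges):
    d0[k, i], d0[k, j] = -1, 1
d1 = np.zeros((1, len(edges)))    # edge-to-face coboundary for face [0,1,2]
d1[0, edges.index((1, 2))] = 1    # boundary of [0,1,2] = [1,2] - [0,2] + [0,1]
d1[0, edges.index((0, 2))] = -1
d1[0, edges.index((0, 1))] = 1
L1 = d0 @ d0.T + d1.T @ d1        # Hodge 1-Laplacian
vals = np.linalg.eigvalsh(L1)
n_zero = int((np.abs(vals) < 1e-10).sum())  # = b1 = 1: the unfilled triangle
```

The spectrum of such Laplacians, beyond the kernel, is what the poster's theoretical result concerns.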
In the numerical analysis of elliptic PDEs, much attention has been given (quite rightly) to the discretization of the Laplace operator and other second-order Laplace-type operators, e.g., the Hodge--Laplace operator on differential $k$-forms. By comparison, Dirac-type operators have received little attention from the perspective of numerical PDEs---despite being, in many ways, just as fundamental. Informally, a Dirac-type operator is a square root of some Laplace-type operator, and is therefore a first-order (rather than second-order) differential operator. The study of these operators is central to the field of Clifford analysis, where there has been growing interest in the discretization of Dirac-type operators. This talk introduces the abstract Hodge--Dirac operator, which is a square root of the abstract Hodge--Laplace operator arising in finite element exterior calculus. We prove stability and convergence estimates, and show that many of the results in finite element exterior calculus can be recovered as corollaries of these new estimates.
A discrete conformality for polyhedral metrics on surfaces is
introduced in this paper which generalizes earlier work on the subject.
It is shown that each polyhedral metric on a surface is discrete
conformal to a constant curvature polyhedral metric which is
unique up to scaling. Furthermore, the constant curvature metric
can be found using a discrete Yamabe flow with surgery.
We present a method for computing a set of bottleneck loops that describe the narrowing of the volumes inside/outside of the surface and extend the notion of surface homology and homotopy loops. The intuition behind their definition is that such a loop represents the boundary of a membrane within a given distance from the surface that separates the inside volume. Our definition is based on persistent homology theory, which gives a measure to topological structures, thus providing resilience to noise and a well-defined way to determine topological feature size. More precisely, the persistence computed here is based on the lower star filtration of the interior or exterior 3D domain with the distance field to the surface being the associated 3D Morse function.
with Johannes A. Stork, Florian T. Pokorny, and Danica Kragic (all from KTH Royal Institute of Technology)
Caging grasps of unknown objects is a difficult area in robotic grasping and motion planning. We use shortest homology generators to find likely grasp sites, and explore possible grasping plans using topological invariants to verify expected performance of a grasp attempt.
with
Primoz Skraba (Jozef Stefan Institute)
Vin de Silva (Pomona College)
Florian T. Pokorny (KTH Royal Institute of Technology)
We present techniques developed to analyze periodic, quasi-periodic, and recurrent systems using circular coordinates computed via persistent cohomology. In particular, we present applications to weather data, to dynamical systems, and to human motion.
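[Editorial sketch] The final step of the circular-coordinates pipeline can be shown on a toy example (this omits the persistent-cohomology computation that finds the cocycle in the first place): an integer 1-cocycle z on a graph is smoothed to its harmonic representative by least squares, then integrated along a spanning tree to give a map to the circle R/Z.

```python
import numpy as np

n = 8                                     # cycle graph C_8: one "loop"
edges = [(i, (i + 1) % n) for i in range(n)]
z = np.zeros(len(edges))
z[-1] = 1.0                               # integer cocycle winding once around

d0 = np.zeros((len(edges), n))            # coboundary: vertices -> edges
for k, (i, j) in enumerate(edges):
    d0[k, i], d0[k, j] = -1.0, 1.0

alpha, *_ = np.linalg.lstsq(d0, -z, rcond=None)
zhat = z + d0 @ alpha                     # harmonic representative of [z]
theta = np.concatenate(([0.0], np.cumsum(zhat[:-1])))  # integrate along tree
coords = theta % 1.0                      # circular coordinate per vertex
print(coords)                             # evenly spaced points on R/Z
```

Here the smoothing spreads the unit jump evenly (1/8 per edge), so the eight vertices land at equally spaced angles; on real data the spacing instead reflects the speed at which the system traverses its recurrent loop.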
Keywords of the presentation: Reeb graphs, distance measure, Gromov-Hausdorff distance, stability
One of the prevailing ideas in geometric and topological data analysis is to provide descriptors that encode useful information about hidden objects from observed data. The Reeb graph is one such descriptor for a given scalar function. The Reeb graph provides a simple yet meaningful abstraction of the input domain, and can also be computed efficiently. Given the popularity of the Reeb graph in applications, it is important to understand its stability and robustness with respect to changes in the input function, as well as to be able to compare the Reeb graphs resulting from different functions.
In this paper, we propose a metric for Reeb graphs, called the functional distortion distance. Under this distance measure, the Reeb graph is stable against small changes in the input function. At the same time, it remains discriminative in differentiating input functions. In particular, the main result is that the functional distortion distance between two Reeb graphs is bounded from below by (and is thus more discriminative than) the bottleneck distance between both the ordinary and the extended persistence diagrams in the appropriate dimensions.
In this talk, I will describe the functional distortion distance for Reeb graphs, its properties, and an application based on these properties.
This is joint work with Ulrich Bauer and Xiaoyin Ge.
Keywords of the presentation: Persistent homology, cohomology, robustness, vector field, scientific visualization, software visualization
I will discuss several interesting applications of homology and cohomology in visualization. First, I will describe how the topological notion of robustness, a relative of persistence, could be applied in the analysis and visualization of vector fields. In particular, robustness allows us to quantify the stability of critical points, to investigate how the stability of a critical point evolves over time and to infer correspondences between features. It also leads to novel vector field simplification schemes in 2D and 3D. Second, I will discuss detecting circular and branching structures in high dimensions based on cohomology parametrization, and its application in software visualization.
Joint work with Primoz Skraba, Paul Rosen, Guoning Chen, Harsh Bhatia, Valerio Pascucci, Brian Summa, A.N.M. Imroz Choudhury and Mikael Vejdemo-Johansson.
Recently, we introduced Vector Diffusion Maps (VDM) and showed that the Connection Laplacian of the tangent bundle of the manifold can be approximated from random samples. In the first part of the talk, we will present a unified framework for approximating other Connection Laplacians over the manifold by considering its principal bundle structure. We prove that the eigenvectors and eigenvalues of these Laplacians converge in the limit of infinitely many random samples. Our results for spectral convergence also hold in the case where the data points are sampled from a non-uniform distribution, and for manifolds with and without boundary; they hence generalize the work of Belkin and Niyogi from 2007. We will show how VDM is applied to the ptychography problem.
Motivated by this technique and the inevitable noise in real data, in the second part of the talk we further study matrices that are akin to the ones appearing in the null case of VDM, i.e., the case where there is no structure in the dataset under investigation. Developing this understanding is important for making sense of the output of the VDM algorithm, namely whether there is signal or not. We develop a theory explaining the behavior of the spectral distribution of a large class of random matrices, in particular random matrices whose random block entries exhibit a dependent ``random strip structure''. Numerical work shows that the agreement between our theoretical predictions and numerical simulations is generally very good.
This is joint work with Amit Singer, Noureddine El Karoui, Stefano Marchesini, and others.
The problem of ranking based on pairwise comparisons, a form of one-dimensional scaling, is a fundamental problem that can be traced back at least to the 18th century. Recently there has been rapid growth of paired comparison data that are imbalanced, incomplete, and distributed on graphs, e.g., from crowdsourcing experiments on the Internet. It is important to monitor the quality of such data, where the existence of outliers may cause instability in the inference of a global ranking. To reach a robust ranking, in this paper a systematic study is carried out on a general linear model in the presence of sparse outliers and Gaussian-type noise. Various conditions and algorithms are presented with which outliers can be detected and the underlying global ranking function can be recovered, exactly or approximately. In particular, one can benefit from the exploitation of Erdős–Rényi random graphs in crowdsourcing experiments to reach nearly optimal rates in the exact recovery of the global ranking by LAD (L1) against sparse symmetric outliers, and to obtain consistent outlier detection by LASSO against a mixture of sparse outliers and Gaussian noise. The validity of the proposed methods is further confirmed by experimental studies on both simulated examples and real-world data.
We propose an online ranking/rating scheme based on stochastic approximation of HodgeRank on random graphs for Quality of Experience (QoE) evaluation, where assessors and rating pairs enter the system in a sequential or streaming way. Two particular types of random graphs are studied in detail, i.e., the Erdős–Rényi random graph and the preferential attachment random graph. The former is the simplest i.i.d. (independent and identically distributed) sampling scheme, while the latter may achieve more efficient performance in ranking the top-k items due to its rich-get-richer property.
We propose a novel framework, HodgeRank on Random Graphs, based on paired comparisons, for subjective video quality assessment. Two types of random graph models are studied, i.e., Erdős–Rényi random graphs and random regular graphs. Hodge decomposition of the paired comparison data yields quality scores for the videos and measures the inconsistency of participants' judgments. We demonstrate the effectiveness of the proposed framework on the LIVE video database. Both random designs are promising sampling methods that do not jeopardize the accuracy of the results. In particular, due to balanced sampling, random regular graphs may achieve better performance when sampling rates are small. However, when the number of videos is large or when sampling rates are large, Erdős–Rényi random graphs provide good approximations to random regular graphs.
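[Editorial sketch] The gradient component of the Hodge decomposition, which carries the global scores, reduces to a graph least-squares problem. The toy below (an assumed simplification of the HodgeRank framework, with unweighted comparisons) recovers scores s minimizing the sum of (s_j - s_i - y_ij)^2 over compared pairs, where y_ij is the observed margin of item j over item i:

```python
import numpy as np

def hodgerank_scores(n, comparisons):
    """comparisons: list of (i, j, y) with y = score margin of j over i."""
    d = np.zeros((len(comparisons), n))     # gradient (coboundary) operator
    y = np.zeros(len(comparisons))
    for k, (i, j, yij) in enumerate(comparisons):
        d[k, i], d[k, j] = -1.0, 1.0
        y[k] = yij
    s, *_ = np.linalg.lstsq(d, y, rcond=None)
    return s - s.mean()                     # fix the additive gauge freedom

# Consistent data from true scores [0, 1, 2] on a triangle of pairs:
data = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 2.0)]
print(hodgerank_scores(3, data))            # → approximately [-1, 0, 1]
```

The least-squares residual y - d @ s is what the full Hodge decomposition further splits into curl (local inconsistency on triangles) and harmonic (global inconsistency) parts; here the data are consistent, so the residual vanishes.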
Divergence functions, as a proximity measure on a smooth manifold and often a surrogate for the (symmetric) metric function, play an important role in machine learning, statistical inference, optimization, etc. This talk will review the various geometric structures induced by a divergence function defined on a manifold. Most importantly, a Riemannian metric with a pair of torsion-free affine connections can be induced on the manifold, the so-called “statistical structure” of Information Geometry. Additional structures may emerge depending on the functional form of the divergence. A general family of divergence functions can be constructed from a smooth and strictly convex function (Zhang, 2004). Such divergence functions result in a manifold equipped with (i) a pair of bi-orthogonal coordinates, and therefore a Hessian structure, reflecting “reference-representation biduality”; (ii) an equiaffine structure, so that parallel volume forms exist; (iii) a symplectic structure, so that Legendre maps are induced; (iv) a complex (and hence Kähler) structure, so that complex coordinates can be introduced to represent a pair of points on the manifold. Computational advantages of such divergence functions will be discussed.
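[Editorial sketch] A familiar special case of the convex-function construction mentioned above is the Bregman divergence D_Phi(x, y) = Phi(x) - Phi(y) - <grad Phi(y), x - y>, which is nonnegative for strictly convex Phi. The sketch below checks two classical instances; the helper names are illustrative, not from the talk.

```python
import numpy as np

def bregman(phi, grad_phi, x, y):
    """Bregman divergence induced by a smooth strictly convex phi."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# Phi = half squared norm recovers half the squared Euclidean distance:
phi_sq = lambda v: 0.5 * np.dot(v, v)
grad_sq = lambda v: v
x, y = np.array([1.0, 2.0]), np.array([0.0, 0.0])
print(bregman(phi_sq, grad_sq, x, y))       # → 2.5 = 0.5 * ||x - y||^2

# Phi = negative entropy recovers the Kullback-Leibler divergence
# (for p, q on the probability simplex):
neg_ent = lambda p: np.sum(p * np.log(p))
grad_ne = lambda p: np.log(p) + 1.0
p, q = np.array([0.5, 0.5]), np.array([0.9, 0.1])
print(bregman(neg_ent, grad_ne, p, q))      # equals KL(p || q) > 0
```

The choice of Phi is exactly the "functional form of the divergence" on which the additional geometric structures depend: the induced Riemannian metric is the Hessian of Phi, which is the identity in the first example and the Fisher information metric of the multinomial family in the second.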
This is a poster presented at IJCAI 2013, a major artificial intelligence conference, with the goal of introducing persistent homology to a broader audience, including computer scientists. While the first part of the poster presents a tutorial on persistent homology, the second part contains one of the first applications of persistent homology to natural language processing. Specifically, our Similarity Filtration with Time Skeleton (SIFTS) algorithm identifies holes that can be interpreted as semantic ``tie-backs'' in a text document, providing a new document structure representation. We illustrate our algorithm on documents ranging from nursery rhymes to novels, and on a corpus of child and adolescent writings.