Authors: Charles Colley, Huda Nassar, David Gleich
In Submission
Abstract: Tensor Kronecker products, the natural generalization of the matrix Kronecker product, are independently emerging in multiple research communities. Like its matrix predecessor, the tensor generalization interleaves vector spaces into multilinear operators, a structure primed for implicit multiplication and factorization theorems. We present an unexpected generalization of a matrix result which decouples the extremal eigenvectors of tensor Kronecker products. We explore the newly implied low rank structure in the network alignment algorithm TAME, a power method heuristic, to produce new algorithms which are quadratically faster (in the number of motifs), improve or maintain accuracy, and scale to larger motifs. We demonstrate how the network embeddings produced by TAME can be properly leveraged to post-process its matchings, aligning more triangles and edges than the original b-matching local search method. Our improved software shows at least a 10-fold runtime improvement on the largest alignments originally conducted, and our new method Lambda-TAME scalably aligns cliques larger than previously possible via implicit contraction theorems.
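The matrix identity that this tensor theory generalizes can be sketched in a few lines of NumPy. This is only the classical matrix case (the tensor-eigenvector version is the paper's contribution and is not reproduced here): the mixed product property implies that Kronecker products of eigenvectors are eigenvectors of the Kronecker product.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((4, 4))
x = rng.standard_normal(3)
y = rng.standard_normal(4)

# Mixed product property: (A kron B)(x kron y) = (A x) kron (B y).
lhs = np.kron(A, B) @ np.kron(x, y)
rhs = np.kron(A @ x, B @ y)
assert np.allclose(lhs, rhs)

# Consequently eigenpairs combine: if A u = lam u and B v = mu v,
# then (A kron B)(u kron v) = lam * mu * (u kron v).
lam, U = np.linalg.eig(A)
mu, V = np.linalg.eig(B)
u, v = U[:, 0], V[:, 0]
assert np.allclose(np.kron(A, B) @ np.kron(u, v), lam[0] * mu[0] * np.kron(u, v))
```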
Authors: C. Colley, J. Lin, X. Hu, and S. Aeron
Venue: Graph Algorithms Building Blocks, 31st IEEE International Parallel and Distributed Processing Symposium
Abstract: Least squares problems on graphs are a large subclass of numerical linear algebra problems that arise in many applications, e.g., ranking, distributed clock synchronization, and arbitrage detection. Solutions to these problems are very practical, and as scalability becomes a prevalent issue, researchers are working to identify the algorithms most appropriate for solving them. A notable endeavor is the work of Hirani et al. [1], which analyzes a variety of solvers on problems generated by random graphs. One of their findings was that the smoothed aggregation algebraic multigrid (SA-AMG) method was unfit for solving these problems. In this work we investigate the effectiveness of the unsmoothed aggregation AMG (UA-AMG) method and show it is more appropriate for graph-related problems due to its ability to maintain the structure of graphs in the multilevel hierarchy. We present an analysis of the two-level algorithm for solving graph Laplacian least squares problems, showing that the convergence rate is not tied to the condition number of the matrix. We then apply UA-AMG as a preconditioner for the conjugate gradient (CG) method to achieve better performance, and we provide experiments comparing against other iterative methods on a collection of large random graphs and a collection of real-world network topologies to demonstrate the effectiveness of UA-AMG methods for solving least squares problems on graphs.
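The structure-preservation claim can be illustrated in a minimal sketch, assuming a toy path graph and a hypothetical aggregation (this is not the paper's solver, only the unsmoothed-aggregation coarsening idea): the Galerkin coarse operator of a graph Laplacian under a piecewise-constant prolongation is again a graph Laplacian.

```python
import numpy as np

# Path graph on 6 vertices: Laplacian L = D - A.
n = 6
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Unsmoothed aggregation: group vertices into aggregates and use the
# piecewise-constant prolongation P, with P[i, k] = 1 iff vertex i is
# in aggregate k.  The aggregation below is hypothetical.
aggregates = [0, 0, 1, 1, 2, 2]
P = np.zeros((n, 3))
for i, k in enumerate(aggregates):
    P[i, k] = 1.0

# The Galerkin coarse operator P^T L P is again a graph Laplacian
# (zero row sums, nonpositive off-diagonals), so the graph structure
# survives coarsening in the multilevel hierarchy.
Lc = P.T @ L @ P
assert np.allclose(Lc.sum(axis=1), 0.0)
assert (Lc - np.diag(np.diag(Lc)) <= 1e-12).all()
```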
Our research group has been working to expand the collection of graphs available for experimentation. Preprocessed graphs are available along with the raw data and our drivers to generate the networks. Some networks we've built include Where's Waldo maps encoded with SIFT features, Facebook county-level friendship networks, Julia package dependency networks, and director/writer IMDb collaboration networks.
A collection of Julia graph matching algorithms which leverage the presence of motifs within networks to find global vertex matchings between networks. Building on the work of TAME, we use our tensor Kronecker theory to produce LowRankTAME and LambdaTAME, which provide high quality matchings an order of magnitude faster and quadratically faster, respectively.
An experimental package being developed as part of my PhD research, which implements a sparse symmetric tensor type focused on fast tensor-vector contraction operations.
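The core operation can be sketched as follows; this is a minimal illustration of contracting a sparse symmetric third-order tensor with a vector, not the package's actual storage format or API. Storing one representative per symmetry class (indices i <= j <= k) and expanding over distinct permutations is what makes symmetric storage pay off.

```python
from itertools import permutations

import numpy as np

def sym_tensor_vec(entries, x):
    """Contract a sparse symmetric 3rd-order tensor T with x twice:
    y_i = sum_{j,k} T[i,j,k] * x[j] * x[k].

    `entries` holds one representative ((i, j, k), value) per symmetry
    class with i <= j <= k, so each stored nonzero is expanded over its
    distinct index permutations.
    """
    y = np.zeros_like(x, dtype=float)
    for (i, j, k), val in entries:
        for a, b, c in set(permutations((i, j, k))):
            y[a] += val * x[b] * x[c]
    return y

# Tiny example: T[0,1,2] = 1 (and all permutations), T[1,1,1] = 2.
entries = [((0, 1, 2), 1.0), ((1, 1, 1), 2.0)]
x = np.array([1.0, 2.0, 3.0])
y = sym_tensor_vec(entries, x)
# y[0] = x[1]*x[2] + x[2]*x[1]              = 6 + 6     = 12
# y[1] = x[0]*x[2] + x[2]*x[0] + 2*x[1]**2  = 3 + 3 + 8 = 14
# y[2] = x[0]*x[1] + x[1]*x[0]              = 2 + 2     = 4
```

The cost is proportional to the number of stored nonzeros times the (bounded) number of permutations, which is the property that makes repeated contractions in power-method-style algorithms fast.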
A Python class for third order tensors with efficient t-product multiplications. This software is designed to be user friendly and to offer mathematicians a tool for investigating new algorithms using the t-product.
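For readers unfamiliar with the t-product, here is a minimal sketch of the standard FFT-based formulation (Kilmer–Martin); the function name and interface are illustrative, not the package's actual API.

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors, computed by taking an FFT along
    the third mode, multiplying the frontal slices facewise in Fourier
    space, and transforming back.

    A: (n1, n2, n3), B: (n2, m, n3) -> C: (n1, m, n3).
    """
    Ah = np.fft.fft(A, axis=2)
    Bh = np.fft.fft(B, axis=2)
    Ch = np.einsum('ijk,jlk->ilk', Ah, Bh)  # facewise matrix products
    return np.real(np.fft.ifft(Ch, axis=2))

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3, 4))
B = rng.standard_normal((3, 2, 4))
C = t_product(A, B)  # shape (2, 2, 4)
```

The FFT route is equivalent to multiplying by the block-circulant unfolding of A, but costs O(n3 log n3) per tube instead of forming the circulant explicitly.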
Title: Addressing Bottlenecks in Algorithms with Tensor Kronecker Product Structure
Abstract: Graph matching, also known as network alignment, is the problem of finding a correspondence between the vertices of two separate graphs. An emerging trend in the graph matching community is to consider using higher order structures, known as motifs, rather than edges to align networks. One class of techniques is based on tensor Kronecker products and tensor eigenvectors. A challenge with these techniques is the memory and computational demand, which is quadratic (or worse) in the number of motifs. In our work, we present and apply a theory of tensor Kronecker products to tensor based network alignment algorithms to reduce their runtime complexity from quadratic to linear with no appreciable loss of quality.
Title: Kronecker Products, Tensors, Notation, and Software
Abstract: In this talk we'll discuss our work on improving the graph alignment algorithm TAME using our new symmetric tensor software in Julia. By generalizing the Kronecker mixed product property to the multilinear setting, we've found a surprising proof that allows us to decouple extremal tensor eigenvectors. This reduces the computational complexity from the product of the tensors' nonzeros to their sum, which opens the door to multiple hypergraph alignment routines. We will also discuss the notation we've developed for expressing tensor contractions, which not only lets us succinctly express our results but also simplifies other common operations found in tensor algebra.
Title: AMG for Least Squares Problems on Graphs
This talk comes from a lecture I gave as part of an educational series established by members of the Amazon Alexa team. The talk covers the basics of moving beyond pairwise methods for analyzing natural language processing data to tensors. We discuss some basic decompositions and some of the bizarre behaviors of tensors one can encounter.