**Title:** On Optimal Frame Conditioners

**Speaker:** Chae Clark (UMD)

A (unit norm) frame is scalable if its vectors can be rescaled so as to yield a tight frame. Tight frames can be considered optimally conditioned because the condition number of their frame operators is unity. In this paper we reformulate the scalability problem as a convex optimization question. In particular, we present several formulations of the problem along with numerical results obtained by applying our methods to randomly generated frames.
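As a concrete illustration of the scalability problem (a minimal sketch, not the formulation from the paper): a unit-norm frame $\{f_i\}$ in $\mathbb{R}^d$ is scalable precisely when there exist nonnegative weights $w_i = s_i^2$ with $\sum_i w_i f_i f_i^T = I$, which is a linear feasibility problem in the weights. The helper name `scaling_weights` and the least-squares approach below are illustrative choices, not the paper's method.

```python
import numpy as np

def scaling_weights(F):
    """F: d x N matrix with unit-norm frame vectors as columns.
    Return nonnegative weights w with sum_i w_i f_i f_i^T ~ I,
    or None if no such weights are found."""
    d, N = F.shape
    # Column i of A is vec(f_i f_i^T); we want A @ w = vec(I).
    A = np.column_stack([np.outer(F[:, i], F[:, i]).ravel() for i in range(N)])
    b = np.eye(d).ravel()
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    if np.all(w >= -1e-9) and np.allclose(A @ w, b, atol=1e-8):
        return np.clip(w, 0.0, None)
    return None

# Example: the Mercedes-Benz frame in R^2 (three equiangular unit
# vectors) is tight with frame bound 3/2, so it is scalable with
# equal weights 2/3.
angles = np.pi / 2 + np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
F = np.vstack([np.cos(angles), np.sin(angles)])
w = scaling_weights(F)
```

After rescaling by `sqrt(w)`, the frame operator is the identity, so its condition number is unity, matching the notion of "optimally conditioned" above.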

**Title:** Graph Sparsification

**Speaker:** Matt Begue (UMD)

Many data sets can be represented in the form of graphs. Matrices such as the Laplacian or the adjacency matrix capture the structure of the data graph, and we can exploit spectral properties of these matrices to learn more about the graph and to do harmonic analysis on graphs. However, graphs with many edges have very dense Laplacians, which can become computationally expensive to work with, especially for large graphs. This led Spielman and collaborators to introduce graph sparsification, in which edges of the graph are deleted while trying to preserve the spectral structure of the Laplacian. Deleting edges makes the Laplacian sparser, which speeds up computations. We will present Spielman's most recent sparsification technique and analyze its performance, strengths, and weaknesses.
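One classical approach in this line of work (a hedged sketch in the spirit of Spielman and Srivastava's effective-resistance sampling, not necessarily the specific technique presented in the talk) samples edges with probability proportional to weight times effective resistance and reweights them so the sparsified Laplacian approximates the original in expectation. The function names and the tiny complete-graph example below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def laplacian(n, edges, weights):
    """Weighted graph Laplacian L = D - W for an undirected graph."""
    L = np.zeros((n, n))
    for (u, v), w in zip(edges, weights):
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

def sparsify(n, edges, weights, q):
    """Sample q edges with replacement, prob proportional to
    w_e * R_e (effective resistance), reweighting each sampled
    edge by w_e / (q * p_e) so the estimate is unbiased."""
    L = laplacian(n, edges, weights)
    Lp = np.linalg.pinv(L)  # pseudoinverse yields effective resistances
    R = np.array([Lp[u, u] + Lp[v, v] - 2 * Lp[u, v] for u, v in edges])
    p = weights * R
    p = p / p.sum()
    new_w = {}
    for e in rng.choice(len(edges), size=q, p=p):
        new_w[e] = new_w.get(e, 0.0) + weights[e] / (q * p[e])
    kept = sorted(new_w)
    return [edges[e] for e in kept], [new_w[e] for e in kept]

# Complete graph on 6 vertices with unit weights: 15 edges,
# sparsified down to at most 12 (repeated draws merge).
n = 6
edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
weights = np.ones(len(edges))
s_edges, s_weights = sparsify(n, edges, weights, q=12)
H = laplacian(n, s_edges, s_weights)
```

The sparsified Laplacian `H` remains symmetric with zero row sums, and the number of retained edges is bounded by the sample budget `q`.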