
The story of the little ell one norm and its friends

When

March 29, 2019 | 2:30 - 3:30 p.m.

Where

Campbell Hall 443

Speaker

Dr. Carmeliza Navasca

Abstract

The popularity of the sparse ell one norm optimization problem is due to Emmanuel Candès and Terence Tao via compressed sensing. I will start by introducing the little ell one norm and its minimization. Then, I will describe how and why these sparse optimization problems are useful in solving today's challenging problems in data science and machine learning. Numerical examples in foreground and background separation for surveillance videos, matrix and tensor completion, and deep neural networks for image classification are included. In this talk, one can observe the interplay of (multi)linear algebra, optimization, and numerical analysis with applications in computer science.
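For context (not part of the talk abstract): a minimal sketch of the textbook definition of the ell one norm and the basis-pursuit problem in which it is minimized, rather than a statement of the speaker's specific results:

\[
\|x\|_1 \;=\; \sum_{i=1}^{n} |x_i|,
\qquad
\min_{x \in \mathbb{R}^n} \|x\|_1
\quad \text{subject to} \quad Ax = b.
\]

Here A is a wide measurement matrix and b the observed data; minimizing the ell one norm serves as a convex surrogate for counting nonzero entries, which is why, under suitable conditions on A, it recovers sparse solutions exactly, the compressed-sensing result associated with Candès and Tao.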

This is joint work with Xiaofei Wang (former postdoc at UAB, now a professor at Normal University, China), Ramin Karim Goudarzi, Fatou Sanogo, Ali Fry (former Fast-Track student), and Da Yan (CS professor at UAB).