Samuel Vaiter

What happens when you use automatic differentiation and let your nonsmooth iterative algorithm go to convergence?

With J. Bolte & E. Pauwels, we show that under a contraction assumption, the derivatives of the algorithm converge linearly!
Preprint: arxiv.org/abs/2206.00457
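
For intuition, here is the classical smooth, single-valued picture behind this kind of result (the preprint's actual setting is nonsmooth and set-valued via conservative Jacobians, so treat this only as a sketch): if the iteration is $x_{k+1} = F(x_k, \theta)$ with $\|\partial_x F\| \le \rho < 1$, differentiating the update gives the "piggyback" recursion
$$J_{k+1} = \partial_x F(x_k, \theta)\, J_k + \partial_\theta F(x_k, \theta).$$
At the fixed point $x^\star$, this is an affine $\rho$-contraction in $J$, so the derivative iterates converge linearly to $J^\star = \big(I - \partial_x F(x^\star, \theta)\big)^{-1} \partial_\theta F(x^\star, \theta)$, the derivative of the fixed point itself.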

I will present this work this week at

This is not trivial because:

1. We consider *nonsmooth* differentiation, which requires concepts such as conservative Jacobians

2. Convergence of functions does not imply convergence of derivatives in general (see the classic counterexample after this list)

3. It might not be observed in practice
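
On point 2, a classic textbook counterexample (not from the preprint): $f_k(x) = \sin(kx)/k$ converges uniformly to $0$, yet $f_k'(x) = \cos(kx)$ does not converge at all.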

But it also works in practice! For instance, differentiating forward-backward for the Lasso or Douglas-Rachford for sparse inverse covariance selection leads to a linear rate. In fact, one can prove linear convergence as soon as there is enough regularity.
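
Here is a minimal numerical sketch of the Lasso case, assuming a made-up problem instance (A, b, lam, step) and plain JAX autodiff; it is not the code of the preprint, just a way to watch successive Jacobians d x_k / d lam get close to each other:

```python
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1
    return jnp.sign(x) * jnp.maximum(jnp.abs(x) - t, 0.0)

def ista_iterate(lam, A, b, n_iter, step):
    # n_iter forward-backward (ISTA) steps on 0.5*||Ax - b||^2 + lam*||x||_1
    x = jnp.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - step * (A.T @ (A @ x - b)), step * lam)
    return x

# Made-up problem instance for illustration
A = jax.random.normal(jax.random.PRNGKey(0), (20, 10)) / jnp.sqrt(20.0)
b = jax.random.normal(jax.random.PRNGKey(1), (20,))
lam, step = 0.1, 1.0 / jnp.linalg.norm(A, ord=2) ** 2  # step = 1/L

# Jacobian of the k-th iterate w.r.t. lam; successive Jacobians should
# get close at a roughly geometric rate once the iterates settle.
prev = None
for k in (50, 100, 150, 200):
    jac = jax.jacfwd(ista_iterate)(lam, A, b, k, step)
    if prev is not None:
        print(k, float(jnp.linalg.norm(jac - prev)))
    prev = jac
```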

Unfortunately, we also show that for some pathological cases, automatic differentiation of momentum methods might diverge, even in 1D.

The proofs are based on a Banach-Picard fixed-point theorem for set-valued functions, along with some previous results of Jerome and Edouard on conservative Jacobians.
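
For reference, the classical single-valued Banach-Picard estimate that this kind of argument rests on (the set-valued version used in the preprint is more delicate): if $\Phi$ is a $\rho$-contraction on a complete metric space with $\rho < 1$, it has a unique fixed point $J^\star$ and
$$d\big(\Phi^k(J_0), J^\star\big) \le \rho^k\, d(J_0, J^\star) \quad \text{for all } k.$$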

PS: This is *not* a result on implicit differentiation, which is already covered in arxiv.org/abs/2106.04350