Torrent details for "Benfenati A. Advanced Techniques in Optimization for ML and Imaging 2024 [andryold1]"    Log in to bookmark

Torrent details
Category:
Language: English
Total Size: 12.73 MB
Info Hash: 619666042e956ecb56c8e5d896f0fb817260201e
Added By:
Added: 09-10-2024 14:05
Views: 55
Seeds: 21
Leechers: 0
Completed: 245

Description
Textbook in PDF format

In recent years, non-linear optimization has played a crucial role in the development of modern techniques at the interface of Machine Learning and imaging. The present book is a collection of recent contributions in the field of optimization, either revisiting consolidated ideas to provide formal theoretical guarantees or presenting comparative numerical studies for challenging inverse problems in imaging. The covered topics include non-smooth optimization techniques for model-driven variational regularization, fixed-point continuation algorithms and their theoretical analysis for selection strategies of the regularization parameter in linear inverse problems in imaging, different perspectives on Support Vector Machines trained via Majorization-Minimization methods, generalization of Bayesian statistical frameworks to imaging problems, and the creation of benchmark datasets for testing new methods and algorithms.
Over the past years, Support Vector Machines (SVMs) have played a crucial role in Machine Learning for supervised classification and regression tasks. Even in the Deep Learning era, they can outperform other supervised methods and remain a popular approach. The paper by A. Benfenati et al. investigates a novel approach, training SVMs via a squared hinge loss coupled with sparsity-promoting regularization and adopting a Majorization-Minimization method.
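To make the recipe concrete, below is a minimal sketch (not the paper's actual algorithm) of a linear SVM trained with the squared hinge loss plus an l1 penalty via Majorization-Minimization: at each outer step the l1 term is majorized by the classical quadratic (IRLS-style) bound |w_j| <= w_j^2 / (2(|w_j^k| + eps)) + const, and the resulting smooth surrogate is decreased by a few gradient steps. All names and parameter values (sparse_svm_mm, lam, lr, eps, the synthetic data) are illustrative assumptions.

import numpy as np

def sparse_svm_mm(X, y, lam=0.1, n_mm=30, n_inner=50, lr=0.01, eps=1e-2):
    # Objective: (1/n) * sum_i max(0, 1 - y_i <x_i, w>)^2 + lam * ||w||_1.
    # Each outer (MM) step freezes the majorizer weights q at the current
    # iterate w^k and decreases the smooth surrogate with gradient steps.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_mm):
        q = 1.0 / (np.abs(w) + eps)           # majorizer weights, frozen this round
        for _ in range(n_inner):
            margin = 1.0 - y * (X @ w)
            active = margin > 0                # samples violating the margin
            grad_loss = -2.0 * X[active].T @ (y[active] * margin[active]) / n
            grad_pen = lam * q * w             # gradient of the quadratic surrogate
            w -= lr * (grad_loss + grad_pen)
    return w

# Toy usage: 20 features, only the first two informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])
w = sparse_svm_mm(X, y)
print(np.round(w, 2))   # most coordinates should end up (near) zero

Freezing q at the current iterate is what makes this Majorization-Minimization: the surrogate upper-bounds the true objective and is tight at w^k, so for a small enough step size each outer iteration cannot increase the original objective.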
We study the problem of approximate sampling from non-log-concave distributions, e.g., Gaussian mixtures, which is often challenging even in low dimensions due to their multimodality. We focus on performing this task via Markov chain Monte Carlo (MCMC) methods derived from discretizations of the overdamped Langevin diffusion, commonly known as Langevin Monte Carlo algorithms. Furthermore, we are also interested in two nonsmooth cases for which a large class of proximal MCMC methods has been developed: (i) a nonsmooth prior combined with a Gaussian mixture likelihood, and (ii) a Laplacian mixture distribution. Such nonsmooth and non-log-concave sampling tasks arise in a wide range of applications in Bayesian inference and imaging inverse problems, such as image deconvolution. We perform numerical simulations to compare the performance of the most commonly used Langevin Monte Carlo algorithms. The Langevin Monte Carlo (LMC) algorithm (possibly with a Metropolis-Hastings adjustment), derived from the overdamped Langevin diffusion, has become a popular MCMC method for high-dimensional continuously differentiable distributions, since it only requires access to a gradient oracle for the potential of the distribution, which can be computed easily using automatic differentiation software such as PyTorch, TensorFlow, and JAX.
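As a concrete illustration of the last point, here is a minimal sketch of the unadjusted Langevin algorithm (LMC without the Metropolis-Hastings correction) targeting an assumed toy one-dimensional two-mode Gaussian mixture, with the gradient oracle supplied by PyTorch's automatic differentiation; the target, step size, and chain length are illustrative choices, not the experimental setup of the chapter.

import torch

def log_density(x):
    # Unnormalized log-density of a two-component Gaussian mixture with
    # unit variances and modes at -2 and +2 (an assumed toy target).
    means = torch.tensor([-2.0, 2.0])
    return torch.logsumexp(-0.5 * (x - means) ** 2, dim=-1)

def ula(n_steps=20000, step=0.05):
    # Unadjusted Langevin Algorithm: Euler-Maruyama discretization of
    # dX_t = grad log pi(X_t) dt + sqrt(2) dB_t.  The gradient oracle is
    # obtained by autodiff instead of being hand-coded.
    x = torch.zeros(1, requires_grad=True)
    samples = []
    for _ in range(n_steps):
        (grad,) = torch.autograd.grad(log_density(x), x)
        with torch.no_grad():
            x = x + step * grad + (2.0 * step) ** 0.5 * torch.randn_like(x)
        x.requires_grad_(True)
        samples.append(x.item())
    return torch.tensor(samples)

samples = ula()
print(samples.mean().item(), samples.std().item())  # mean near 0, std near ~2.2

Adding a Metropolis-Hastings accept/reject test to each proposal yields MALA, which removes the discretization bias; for the nonsmooth targets mentioned above, proximal variants instead use the gradient of a smooth Moreau-Yosida approximation of the nonsmooth term, computed via its proximal operator.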

User comments

No comments have been posted yet.


