
Commit

replace figure
TimRoith committed Jun 17, 2024
1 parent 04d9302 commit 433c2aa
Showing 1 changed file with 1 addition and 1 deletion: paper.md
@@ -110,7 +110,7 @@ where $\Delta t > 0$ is the _step size_ and $\xi^i \sim \mathcal{N}(0,\mathrm{Id})$ …

As a particle-based family of methods, CBX is conceptually related to other optimisation approaches that take inspiration from biology, like _particle-swarm optimisation_ (PSO) [@kennedy1995particle], from physics, like _simulated annealing_ (SA) [@henderson2003theory], or from other heuristics [@mohan2012survey;@karaboga2014comprehensive;@yang2009firefly;@bayraktar2013wind]. However, unlike many such methods, CBX has been designed to be compatible with rigorous convergence analysis at the mean-field level (the infinite-particle limit, see [@huang2021MFLCBO]). Convergence results have been established in many settings: for the original formulation [@carrillo2018analytical;@fornasier2021consensus], for CBO with anisotropic noise [@carrillo2021consensus;@fornasier2021convergence], with memory effects [@riedl2022leveraging], with truncated noise [@fornasier2023consensus], for polarised CBO [@bungert2022polarized], and for PSO [@qiu2022PSOconvergence]. The relation between CBO and _stochastic gradient descent_ was recently established by @riedl2023gradient, which suggests a previously unknown yet fundamental connection between derivative-free and gradient-based approaches.

- ![Typical evolution of a CBO method minimising the Ackley function [@ackley2012connectionist].](JOSS.png){ width=100% }
+ ![Typical evolution of a CBO method minimising the Ackley function [@ackley2012connectionist].](JOSS.pdf){ width=100% }

CBX methods have been successfully applied and extended to a variety of settings, such as constrained optimisation problems [@fornasier2020consensus_sphere_convergence;@borghi2021constrained], multi-objective optimisation [@borghi2022adaptive;@klamroth2022consensus], saddle-point problems [@huang2022consensus], federated learning tasks [@carrillo2023fedcbo], uncertainty quantification [@althaus2023consensus], and sampling [@carrillo2022consensus].
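
For readers without the surrounding paper text, the hunk header above refers to the standard CBO update: each particle drifts towards a weighted consensus point and is perturbed by scaled Gaussian noise $\xi^i \sim \mathcal{N}(0,\mathrm{Id})$ with step size $\Delta t$. Below is a minimal NumPy sketch of that iteration on the Ackley function shown in the figure; it is an illustrative toy implementation with assumed parameter values (`alpha`, `lam`, `sigma`, `dt`), not the CBX package's actual API.

```python
import numpy as np

def ackley(x):
    """Ackley function evaluated row-wise on an (N, d) array of particles."""
    d = x.shape[-1]
    term1 = -20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2, axis=-1) / d))
    term2 = -np.exp(np.sum(np.cos(2.0 * np.pi * x), axis=-1) / d)
    return term1 + term2 + 20.0 + np.e

def consensus_point(x, fx, alpha=30.0):
    """Weighted average of particles with Gibbs-type weights exp(-alpha * f)."""
    w = np.exp(-alpha * (fx - fx.min()))  # shift by the minimum for numerical stability
    return (w[:, None] * x).sum(axis=0) / w.sum()

def cbo_minimise(f, N=50, d=2, dt=0.1, lam=1.0, sigma=1.0, steps=500, seed=0):
    """Plain isotropic CBO: x^i <- x^i - lam*dt*(x^i - c) + sigma*sqrt(dt)*|x^i - c|*xi^i."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-3.0, 3.0, size=(N, d))  # random initial particle positions
    for _ in range(steps):
        c = consensus_point(x, f(x))
        drift = x - c
        scale = sigma * np.sqrt(dt) * np.linalg.norm(drift, axis=-1, keepdims=True)
        x = x - lam * dt * drift + scale * rng.standard_normal((N, d))
    return consensus_point(x, f(x))

print(cbo_minimise(ackley))  # expected to land near the global minimiser at the origin
```

Since the noise is scaled by each particle's distance to the consensus point, the fluctuations die out as the ensemble collapses onto the consensus, which is the kind of evolution the figure above visualises.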


