
Commit 9f1a549

fix formatting
1 parent 97f64e1 commit 9f1a549

1 file changed: +1 -2 lines


docs/src/optimization.md (+1 -2)
```diff
@@ -26,8 +26,7 @@ PolynomialAveraging
 [^DCAMHV2020]: Dhaka, A. K., Catalina, A., Andersen, M. R., Magnusson, M., Huggins, J., & Vehtari, A. (2020). Robust, accurate stochastic optimization for variational inference. Advances in Neural Information Processing Systems, 33, 10961-10973.
 [^KMJ2024]: Khaled, A., Mishchenko, K., & Jin, C. (2023). DoWG unleashed: An efficient universal parameter-free gradient descent method. Advances in Neural Information Processing Systems, 36, 6748-6769.
 [^IHC2023]: Ivgi, M., Hinder, O., & Carmon, Y. (2023). DoG is SGD's best friend: A parameter-free dynamic step size schedule. In International Conference on Machine Learning (pp. 14465-14499). PMLR.
-
-## Operators
+## Operators
 
 Depending on the variational family, variational objective, and optimization strategy, it might be necessary to modify the variational parameters after performing a gradient-based update.
 For this, an operator acting on the parameters can be supplied via the `operator` keyword argument of `optimize`.
```
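The context lines above describe the library's `operator` keyword of `optimize`. The general pattern — apply an operator to the parameters after every gradient step, for example to project a scale parameter back into its valid range — can be sketched in plain Python. All names below (`clip_scale`, the `optimize` signature) are illustrative, not the library's actual API:

```python
def clip_scale(params, eps=1e-5):
    # Example operator: project the scale entry back into [eps, inf).
    mu, sigma = params
    return [mu, max(sigma, eps)]

def optimize(grad, params, lr=0.1, n_steps=100, operator=lambda p: p):
    # Gradient descent that applies `operator` after every update.
    for _ in range(n_steps):
        params = [p - lr * g for p, g in zip(params, grad(params))]
        params = operator(params)  # modify parameters after the gradient step
    return params

# Minimize mu^2 + (sigma - 1)^2 while keeping sigma positive throughout.
grad = lambda p: [2.0 * p[0], 2.0 * (p[1] - 1.0)]
result = optimize(grad, [3.0, -2.0], operator=clip_scale)
```

Without the operator, `sigma` would pass through negative values during optimization; the projection keeps every intermediate iterate valid while the unconstrained entry converges normally.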
