Changelog
=========

- Main branch
- -----------
+ Version 0.5.5
+ -------------

New features
~~~~~~~~~~~~

- - Added example :doc:`/notebooks/deep_learning/maml` by Fabian Pedregosa based on initial code by Paul Vicol and Eric Jiang.
+ - Added MAML example by Fabian Pedregosa based on initial code by Paul Vicol and Eric Jiang.
+ - Added the possibility to stop LBFGS after a line search failure, by Zaccharie Ramzi.
+ - Added gamma to LBFGS state, by Zaccharie Ramzi.
+ - Added :class:`jaxopt.BFGS`, by Mathieu Blondel.
+ - Added value_and_grad option to all gradient-based solvers, by Mathieu Blondel.
+ - Added Fenchel-Young loss, by Quentin Berthet.
+ - Added :func:`projection_sparse_simplex <jaxopt.projection.projection_sparse_simplex>`, by Tianlin Liu.
+
+ Bug fixes and enhancements
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+ - Fixed missing args, kwargs in resnet example, by Louis Béthune.
+ - Corrected the implicit diff examples, by Zaccharie Ramzi.
+ - Small optimization in l2-regularized semi-dual OT, by Mathieu Blondel.
+ - Numerical stability improvements in :class:`jaxopt.LevenbergMarquardt`, by Amir Saadat.
+ - Dtype consistency in LBFGS, by Alex Botev.

Deprecations
~~~~~~~~~~~~
@@ -16,6 +31,11 @@ Deprecations
  :class:`jaxopt.CvxpyQP`, :class:`jaxopt.OSQP`, :class:`jaxopt.BoxOSQP` and
  :class:`jaxopt.EqualityConstrainedQP` instead.

+ Contributors
+ ~~~~~~~~~~~~
+
+ Alex Botev, Amir Saadat, Fabian Pedregosa, Louis Béthune, Mathieu Blondel, Quentin Berthet, Tianlin Liu, Zaccharie Ramzi.
+
Version 0.5
-----------
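
A minimal sketch of the ``value_and_grad`` option listed above, assuming the :class:`jaxopt.LBFGS` API of this release (the least-squares objective and the names ``value_and_grad_fun`` and ``w_init`` are illustrative, not taken from the changelog). The objective returns a ``(value, gradient)`` pair and the solver is told so via ``value_and_grad=True``; the same keyword applies to the other gradient-based solvers, including the newly added :class:`jaxopt.BFGS`.

.. code-block:: python

   import jax.numpy as jnp
   import jaxopt

   def value_and_grad_fun(w, X, y):
       # Least-squares objective and its gradient, computed in one pass.
       residuals = X @ w - y
       value = 0.5 * jnp.mean(residuals ** 2)
       grad = X.T @ residuals / X.shape[0]
       return value, grad

   # Illustrative data and initialization.
   X = jnp.ones((4, 3))
   y = jnp.zeros(4)
   w_init = jnp.zeros(3)

   # value_and_grad=True tells the solver that `fun` already returns
   # (value, gradient), so the solver does not differentiate `fun` itself.
   solver = jaxopt.LBFGS(fun=value_and_grad_fun, value_and_grad=True)
   params, state = solver.run(w_init, X=X, y=y)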