@@ -68,10 +68,13 @@ by chaining several differentiable and invertible transformations. However,
these diffeomorphic transformations limit the flows in their complexity, as the
transformations have to be simple. Furthermore, this leads to a trade-off
between sampling speed and evaluation performance <d-cite key="papamakarios_normalizing_2019"></d-cite>.
+ Their continuous counterpart, Continuous Normalizing Flows (CNFs), has been
+ held back by the limitations of their simulation-based maximum likelihood
+ training <d-cite key="tong_improving_2023"></d-cite>.
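
To make the constraint concrete, here is a minimal sketch (an illustrative assumption, not code from the post) of how a discrete flow chains simple invertible maps and accumulates their log-determinants for the change-of-variables likelihood:

```python
import torch
import torch.nn as nn

# Hedged sketch: each layer must be invertible with a tractable log|det J|,
# which is exactly the simplicity constraint discussed above.
class AffineFlow(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.log_scale = nn.Parameter(torch.zeros(dim))
        self.shift = nn.Parameter(torch.zeros(dim))

    def forward(self, x: torch.Tensor):
        z = x * torch.exp(self.log_scale) + self.shift
        log_det = self.log_scale.sum()  # log|det J| of this affine map
        return z, log_det

def log_prob(flows, x: torch.Tensor) -> torch.Tensor:
    # Push samples through the chain while accumulating log-determinants.
    total_log_det = torch.zeros(x.shape[0])
    for flow in flows:
        x, log_det = flow(x)
        total_log_det = total_log_det + log_det
    base = torch.distributions.Normal(0.0, 1.0)
    return base.log_prob(x).sum(dim=-1) + total_log_det
```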
# Continuous Normalizing Flows
- Continuous normalizing flows (CNFs) are among the first applications of neural
+ Continuous normalizing flows are among the first applications of neural
ordinary differential equations (ODEs) <d-cite key="chen_neural_2018"></d-cite>.
Instead of the traditional layers of neural networks, the flow is defined by a
vector field that is integrated over time.
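
As an illustration (a hedged PyTorch sketch under assumed names, not the post's implementation), the network below parameterizes the time-dependent vector field v(t, x), and a fixed-step Euler loop stands in for the integrator:

```python
import torch
import torch.nn as nn

# Hedged sketch: the "layers" of a CNF are replaced by a learned vector
# field v(t, x) whose ODE is integrated from t=0 to t=1.
class VectorField(nn.Module):
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, t: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # Condition on time by concatenating it to the state.
        t = t.expand(x.shape[0], 1)
        return self.net(torch.cat([t, x], dim=-1))

def push_forward(v: VectorField, x0: torch.Tensor, steps: int = 100):
    # Fixed-step Euler integration; in practice an adaptive ODE solver
    # (e.g. torchdiffeq.odeint) is used instead.
    x, dt = x0, 1.0 / steps
    for i in range(steps):
        t = torch.tensor([[i * dt]])
        x = x + dt * v(t, x)
    return x
```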
@@ -236,7 +239,11 @@ convergence and better results.
## Generalized Flow-Based Models
- todo. ...
+ Flow matching, as described above, is limited to Gaussian source
+ distributions. To allow for arbitrary source distributions, <d-cite
+ key="tong_improving_2023"></d-cite> extended the approach to generalized
+ conditional flow matching, a family of simulation-free training
+ objectives for CNFs.
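
A hedged sketch of one such simulation-free objective, assuming the independent coupling between source and data samples (`model` and the samplers are illustrative, not the paper's code):

```python
import torch

# Hedged sketch of a conditional flow matching loss with an arbitrary
# source distribution; `model` is an assumed velocity network taking (t, x).
def cfm_loss(model, x0: torch.Tensor, x1: torch.Tensor) -> torch.Tensor:
    """x0: batch from any source distribution, x1: batch from the data."""
    t = torch.rand(x0.shape[0], 1)      # t ~ U[0, 1]
    xt = (1 - t) * x0 + t * x1          # straight-line conditional path
    ut = x1 - x0                        # target conditional velocity
    vt = model(t, xt)                   # predicted velocity at (t, xt)
    return ((vt - ut) ** 2).mean()      # simulation-free regression loss
```

Since x0 may be drawn from any base distribution, nothing in this loss requires a Gaussian source.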
# Empirical Results