The neural operator is surprisingly absent in its fully continuous form. It could be implemented using Integrals.jl, since that is differentiable. Unlike some of the other forms, which arguably aren't operators at all since they only act on discrete points, this would be a truly continuous functional form. It might be slow, but it's cool, and it's the namesake.
I have tried to introduce the continuous property by combining neural operators with a neural ODE, but it failed. I am curious how we can use Integrals.jl in neural operators.
See https://arxiv.org/abs/2108.08481, equation 6. The completely general formulation is just a convolution, defined by an integral, between the input function and some network.
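To make the idea concrete, here is a minimal sketch of that kernel integral operator, `(K v)(x) = ∫_D κ(x, y) v(y) dy`, in plain Python/NumPy. The tiny MLP kernel `kappa` and the fixed-grid trapezoidal rule are both illustrative assumptions standing in for a learned kernel network and a differentiable quadrature like Integrals.jl would provide; none of this is an existing NeuralOperators.jl API.

```python
import numpy as np

# Illustrative "kernel network": a one-hidden-layer MLP kappa(x, y) -> R.
# Weights are random here; in a real neural operator they would be trained.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 2))
b1 = rng.normal(size=8)
W2 = rng.normal(size=(1, 8))
b2 = rng.normal(size=1)

def kappa(x, y):
    """Scalar kernel kappa(x, y) given by a tiny MLP (an assumption)."""
    h = np.tanh(W1 @ np.array([x, y]) + b1)
    return float(W2 @ h + b2)

def apply_operator(v, x, a=0.0, b=1.0, n=201):
    """Evaluate (K v)(x) = integral_a^b kappa(x, y) v(y) dy.

    Uses a hand-rolled trapezoidal rule on n grid points; a continuous
    implementation would replace this with an adaptive, differentiable
    quadrature (the role Integrals.jl plays in the proposal above).
    """
    ys = np.linspace(a, b, n)
    vals = np.array([kappa(x, y) * v(y) for y in ys])
    dx = (b - a) / (n - 1)
    return dx * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

# Apply the operator to an input function v(y) = sin(2*pi*y).
out = apply_operator(lambda y: np.sin(2 * np.pi * y), x=0.5)
print(out)
```

Note that `apply_operator` takes the input `v` as a callable, not as samples on a grid, which is exactly the "truly continuous" property discussed above: the discretization lives only inside the quadrature, not in the operator's interface.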