Hey 👋
I think Wendland kernels are cool, and I would like to contribute a PR. Let me know what you think :)
What's this about?
Wendland kernels have compact support. If two points are "too far away" (quantified by the lengthscale of the kernel), their covariance under a Wendland kernel is exactly zero. This results in sparse kernel matrices, which can be leveraged to save memory and compute. In particular, Julia ships with highly optimised sparse Cholesky decompositions (CHOLMOD, via the SparseArrays/SuiteSparse standard libraries).
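To illustrate the point, here is a small sketch (not the proposed implementation) using the classical Wendland function φ_{3,1} and an assumed lengthscale of 0.5: the kernel matrix on a grid is structurally sparse, and Julia's stock sparse Cholesky can factor it directly.

```julia
using LinearAlgebra, SparseArrays

# Compactly supported Wendland function φ_{3,1}: zero for r ≥ 1.
wendland(r) = r < 1 ? (1 - r)^4 * (4r + 1) : 0.0

# Kernel with an (assumed) lengthscale ℓ: covariance vanishes beyond ℓ.
k(x, y; ℓ = 0.5) = wendland(abs(x - y) / ℓ)

# Kernel matrix on a 1-D grid: most entries are exactly zero.
x = range(0, 10; length = 200)
K = sparse([k(xi, xj) for xi in x, xj in x])

# Sparse Cholesky (CHOLMOD, via the SuiteSparse stdlib) exploits the
# sparsity pattern; a small jitter keeps the matrix positive definite.
C = cholesky(K + 1e-10 * I)

nnz(K) / length(K)  # fraction of structural non-zeros, here < 10%
```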
Within their support, the Wendland functions are defined by a rational polynomial, the coefficients of which can be computed in closed form.
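For concreteness, here are a few of the classical closed forms for input dimension d ≤ 3 (up to normalisation constants; see Table 9.1 in Wendland's book), with (x)_+ denoting max(x, 0):

```julia
# Classical Wendland functions φ_{3,k}; φ_{3,k} generates a kernel that
# is 2k-times continuously differentiable.
pos(x) = max(x, zero(x))

φ₃₀(r) = pos(1 - r)^2                         # continuous
φ₃₁(r) = pos(1 - r)^4 * (4r + 1)              # C²
φ₃₂(r) = pos(1 - r)^6 * (35r^2 + 18r + 3) / 3 # C⁴

φ₃₁(0.5)  # → 0.1875 (inside the support)
φ₃₁(1.2)  # → 0.0 (outside the support: exactly zero)
```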
At the same time, Wendland kernels have nice theoretical properties. In particular, much like the Matérn kernels, their smoothness is controllable directly through a smoothness parameter.
For more details, and in particular for the definition of Wendland functions, refer to Chapter 9 of Scattered Data Approximation by Holger Wendland.
What do I propose?
I would like to add a WendlandKernel to KernelFunctions.jl. I already have a local implementation that handles Wendland kernels for arbitrary input dimension.
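A minimal sketch of what the kernel could look like, assuming the SimpleKernel interface (kappa plus metric) and hard-coding the φ_{3,1} case; the actual PR would compute the polynomial coefficients for arbitrary dimension and smoothness:

```julia
using KernelFunctions
using Distances: Euclidean
import KernelFunctions: kappa, metric

# Hypothetical sketch, NOT the proposed implementation: only the
# φ_{3,1} case (valid for input dimension d ≤ 3) is hard-coded here.
struct WendlandKernel <: KernelFunctions.SimpleKernel end

# SimpleKernel interface: kernel value as a function of the distance.
kappa(::WendlandKernel, d::Real) = max(1 - d, 0)^4 * (4d + 1)
metric(::WendlandKernel) = Euclidean()

k = WendlandKernel()
k(0.0, 0.3)  # positive: the points are within the support
k(0.0, 1.5)  # exactly zero: the points are too far apart
```

The lengthscale would then come for free through the existing `with_lengthscale(WendlandKernel(), ℓ)` machinery.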
Difficulties
Currently, AbstractGPs.jl does not work with sparse Cholesky factorizations. I added some extensions locally, and I am not sure if these should go to KernelFunctions.jl or if I should make a separate PR for AbstractGPs.jl.