## Kolmogorov-Arnold Networks (KAN)
This is basically a suggestion to replace the multi-layer perceptron (MLP) in
neural networks. In an MLP we have weights that are learnable and non-linear
activation functions in the neurons which are fixed (they can be different
functions, but they are not changed during training; they are not learnable,
only the weights are). In KAN there are no linear weights; instead they are
replaced with univariate functions (functions of one independent variable)
that are learnable.
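
To make this concrete, here is a rough NumPy sketch of the idea, not the
paper's implementation (the paper parametrizes the edge functions as B-splines
plus a base function). Here each edge carries a learnable univariate function
built from a fixed Gaussian radial basis; the class name `KANLayer`, the basis
choice, and all parameter names are my own assumptions for illustration.

```python
import numpy as np

class KANLayer:
    """Sketch of a KAN-style layer: every edge (i, j) has a learnable
    univariate function phi_ij(x) = sum_k coef[i, j, k] * basis_k(x).
    The output neuron j just sums the edge functions applied to the
    inputs: y_j = sum_i phi_ij(x_i). There are no separate linear
    weights and no fixed activation on the neuron itself."""

    def __init__(self, in_dim, out_dim, num_basis=8, grid_range=(-2.0, 2.0)):
        # Fixed basis function centers shared by all edges.
        self.centers = np.linspace(grid_range[0], grid_range[1], num_basis)
        self.width = (grid_range[1] - grid_range[0]) / num_basis
        # Learnable coefficients: one set per edge, per basis function.
        self.coef = np.random.randn(in_dim, out_dim, num_basis) * 0.1

    def _basis(self, x):
        # x: (batch, in_dim) -> (batch, in_dim, num_basis)
        d = x[..., None] - self.centers
        return np.exp(-(d / self.width) ** 2)

    def forward(self, x):
        b = self._basis(x)                       # (batch, in, K)
        # Sum over inputs i and basis functions k -> (batch, out).
        return np.einsum('bik,ijk->bj', b, self.coef)

# Usage: stack two layers to get a small KAN-like network.
layer1 = KANLayer(in_dim=2, out_dim=5)
layer2 = KANLayer(in_dim=5, out_dim=1)
x = np.random.randn(4, 2)
y = layer2.forward(layer1.forward(x))
print(y.shape)  # (4, 1)
```

Training would then update `coef` (the shape of the edge functions) rather
than a weight matrix.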

So what is the advantage of KAN over MLP?
They can outperform MLPs in terms of accuracy and interpretability
(interpretability meaning, as I understand it, being able to see what is
happening inside the network).

* Paper: https://arxiv.org/html/2404.19756v1

### Architecture
TODO: