Article Source
Kolmogorov-Arnold Networks VS Regular Deep Learning
Abstract
MASSIVE idea proposed in this paper.
The paper proposes Kolmogorov-Arnold Networks (KANs) as promising alternatives to Multi-Layer Perceptrons (MLPs) for approximating nonlinear functions 🤯.
📌 Moving activation functions from nodes (neurons) to edges (weights)!
📌 MLPs place activation functions on neurons, but can we instead place (learnable) activation functions on weights? Yes, we KAN!
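The node-vs-edge contrast can be sketched in a few lines of NumPy. This is a toy illustration, not the paper's actual B-spline implementation: here each edge's learnable univariate function `phi_ij` is a linear combination of fixed Gaussian bumps with learnable coefficients (the bumps and all variable names below are assumptions for illustration).

```python
import numpy as np

def mlp_layer(x, W, b):
    # Standard MLP layer: linear map, then a FIXED activation on each node.
    return np.tanh(W @ x + b)

def kan_layer(x, coeffs, centers):
    # Toy KAN-style layer: each edge (i -> j) carries its own LEARNABLE
    # univariate function phi_ij, modeled as a weighted sum of Gaussian bumps.
    # coeffs: (out_dim, in_dim, n_basis), centers: (n_basis,)
    basis = np.exp(-(x[:, None] - centers[None, :]) ** 2)  # (in_dim, n_basis)
    # Output j sums the per-edge functions: y_j = sum_i phi_ij(x_i)
    return np.einsum('jik,ik->j', coeffs, basis)

rng = np.random.default_rng(0)
x = rng.normal(size=3)                       # 3 inputs
W, b = rng.normal(size=(2, 3)), rng.normal(size=2)
centers = np.linspace(-2.0, 2.0, 5)          # shared basis grid
coeffs = rng.normal(size=(2, 3, 5))          # one coefficient set PER EDGE

print(mlp_layer(x, W, b).shape)      # (2,)
print(kan_layer(x, coeffs, centers).shape)   # (2,)
```

In the MLP the nonlinearity (`tanh`) is fixed and sits after the sum; in the KAN sketch the nonlinearity is parameterized per edge (`coeffs[j, i, :]`) and the sum happens after it, which is exactly the swap the bullets above describe.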