FRACCN: Fractional spline neural networks


Amal Bouaicha
Mohamed Lamnii
Soumia Ngadi

Abstract

We propose a new neural architecture inspired by the structural principles of the Kolmogorov-Arnold representation theorem, in which fixed activation functions are replaced by learnable fractional splines. Unlike classical approaches, our model treats not only the weights and biases as trainable parameters, but also the fractional orders and shape parameters governing the activation functions themselves. The network's local regularity can thus be adapted dynamically during learning, favoring sharper responses where fine detail is needed and smoother transitions where stability or generalization matters. We derive exact analytical gradients for all parameters, including the fractional orders, which makes end-to-end optimization via gradient-based methods possible. The theoretical foundation rests on the constructive properties of fractional spline bases and their ability to provide adaptive, multi-scale functional approximation. Our approach thus bridges geometric flexibility and differentiable learning, providing a theory-based framework for designing data-driven activation functions.
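The abstract describes activations whose fractional order is itself trained via exact analytical gradients. As an illustrative sketch only (the class name, knot placement, basis form, and all parameter choices below are our own assumptions, not the authors' FRACCN implementation), a one-dimensional activation built from truncated fractional power functions max(x - t_k, 0)^alpha, with closed-form gradients in both the coefficients and the order alpha, might look like:

```python
import math

class FracSplineActivation:
    """Illustrative 1-D fractional-spline-style activation (not the paper's code).

    phi(x) = sum_k c_k * max(x - t_k, 0)**alpha

    Both the coefficients c_k and the fractional order alpha are trainable.
    Since d/dalpha of u**alpha is u**alpha * ln(u) for u > 0, the gradient
    with respect to alpha is exact, enabling plain gradient descent.
    """

    def __init__(self, knots, alpha=1.5):
        self.knots = list(knots)        # fixed knot locations t_k (assumed fixed here)
        self.coef = [0.5] * len(knots)  # trainable coefficients c_k
        self.alpha = alpha              # trainable fractional order

    def forward(self, x):
        return sum(c * max(x - t, 0.0) ** self.alpha
                   for c, t in zip(self.coef, self.knots))

    def grads(self, x):
        """Exact partial derivatives of phi(x) w.r.t. each c_k and alpha."""
        us = [max(x - t, 0.0) for t in self.knots]
        dc = [u ** self.alpha for u in us]
        dalpha = sum(c * u ** self.alpha * math.log(u)
                     for c, u in zip(self.coef, us) if u > 0.0)
        return dc, dalpha

    def sgd_step(self, xs, ys, lr=0.05):
        """One gradient-descent step on mean squared error over the batch."""
        n = len(xs)
        gc = [0.0] * len(self.coef)
        ga, loss = 0.0, 0.0
        for x, y in zip(xs, ys):
            err = self.forward(x) - y
            loss += err * err / n
            dc, dalpha = self.grads(x)
            for k in range(len(gc)):
                gc[k] += 2.0 * err * dc[k] / n
            ga += 2.0 * err * dalpha / n
        for k in range(len(self.coef)):
            self.coef[k] -= lr * gc[k]
        self.alpha = max(0.1, self.alpha - lr * ga)  # keep the order positive
        return loss

# Usage: fit a target of lower regularity (order 0.7) starting from alpha = 1.5,
# so the fractional order must adapt downward during training.
act = FracSplineActivation(knots=[0.0, 0.5, 1.0], alpha=1.5)
xs = [i / 10.0 for i in range(21)]            # grid on [0, 2]
ys = [max(x, 0.0) ** 0.7 for x in xs]         # target with fractional regularity
losses = [act.sgd_step(xs, ys) for _ in range(200)]
```

Because the order enters the loss differentiably, the same update rule adjusts smoothness and coefficients jointly, which is the mechanism the abstract attributes to the full multi-layer model.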


How to Cite

FRACCN: Fractional spline neural networks. (2026). Gulf Journal of Mathematics, 22(2). https://doi.org/10.56947/gjom.v22i2.4041