arXiv:2509.06694v3 Announce Type: replace-cross Abstract: While artificial neural networks are known to be universal approximators for continuous functions, many modern approaches rely on overparameterized architectures with high computational cost. In this work, we introduce the Barycentric Neural Network (BNN): a compact shallow architecture that encodes both structure and parameters through a fixed set of base points and their associated barycentric coordinates. We show that the BNN enables the exact representation of continuous piecewise linear functions (CPLFs), ensuring strict continuity across segments. Since any continuous function on a compact domain can be uniformly approximated by CPLFs, the BNN emerges as a flexible and interpretable tool for function approximation. To enhance geometric fidelity in low-resource scenarios, such as settings with few base points available to construct the BNN or few training epochs, we propose length-weighted persistent entropy (LWPE): a stable variant of persistent entropy. Our approach integrates the BNN with a loss function based on LWPE to optimize the base points that define the BNN, rather than its internal parameters. Experimental results show that our approach achieves superior and faster approximation performance compared to standard losses (MSE, RMSE, MAE, and LogCosh), offering a computationally sustainable alternative for function approximation.
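To make the barycentric encoding concrete, here is a minimal NumPy sketch of a CPLF represented by base points and per-segment barycentric coordinates. This is an illustration of the general idea only, not the authors' BNN architecture; the function and variable names are hypothetical.

```python
import numpy as np

def cplf_barycentric(x, base_x, base_y):
    """Evaluate a continuous piecewise linear function (CPLF) at points x.

    The CPLF is encoded by base points (base_x, base_y); on each segment
    [base_x[i], base_x[i+1]] the value is the barycentric combination
    lam * base_y[i] + (1 - lam) * base_y[i+1], with lam in [0, 1].
    Continuity across segments is automatic because adjacent segments
    share their endpoints.
    """
    base_x = np.asarray(base_x, dtype=float)
    base_y = np.asarray(base_y, dtype=float)
    x = np.atleast_1d(np.asarray(x, dtype=float))

    # Index of the segment containing each query point.
    idx = np.clip(np.searchsorted(base_x, x, side="right") - 1,
                  0, len(base_x) - 2)
    x0, x1 = base_x[idx], base_x[idx + 1]
    y0, y1 = base_y[idx], base_y[idx + 1]

    # Barycentric coordinate of x with respect to the segment endpoints.
    lam = (x1 - x) / (x1 - x0)
    return lam * y0 + (1.0 - lam) * y1

# Example: base points sampled from sin(x); the CPLF matches sin exactly
# at the base points and interpolates linearly in between.
base_x = np.linspace(0.0, np.pi, 6)
base_y = np.sin(base_x)
print(cplf_barycentric([0.5, 1.5, 2.5], base_x, base_y))
```

In the paper's setting, it is these base points that are optimized by the LWPE-based loss, rather than internal network weights.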
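For context, the following is a sketch of standard persistent entropy, of which LWPE is the paper's length-weighted, stability-oriented variant; the exact length weighting used by LWPE is defined in the paper and is not reproduced here.

```python
import numpy as np

def persistent_entropy(diagram):
    """Standard persistent entropy of a persistence diagram.

    `diagram` is an (n, 2) array of (birth, death) pairs. Bar lengths
    are normalized into a probability vector, and the Shannon entropy
    of that vector is returned.
    """
    diagram = np.asarray(diagram, dtype=float)
    lengths = diagram[:, 1] - diagram[:, 0]
    lengths = lengths[lengths > 0]          # drop zero-length bars
    p = lengths / lengths.sum()             # normalized bar lengths
    return -np.sum(p * np.log(p))           # Shannon entropy

# Example: entropy of a toy diagram with two finite bars.
print(persistent_entropy([[0.0, 1.0], [0.2, 0.7]]))
```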
