Improving KAN with CDF normalization to quantiles
arXiv.org Artificial Intelligence
Data normalization is crucial in machine learning, usually performed by subtracting the mean and dividing by the standard deviation, or by rescaling to a fixed range. In copula theory [1], popular in finance, normalization to approximate quantiles is used instead: transforming x ↦ CDF(x) with an estimated CDF/EDF (cumulative/empirical distribution function) yields a nearly uniform distribution on [0, 1], allowing simpler representations that are less likely to overfit. This approach seems nearly unknown in machine learning; therefore, as proposed in [2], we present some of its advantages on the example of the recently popular Kolmogorov-Arnold Networks (KANs), improving predictions of Legendre-KAN [3] just by switching its rescaling to CDF normalization. Additionally, in the HCR interpretation, the weights of such neurons are mixed moments providing local joint distribution models, which also allow propagating probability distributions and changing the propagation direction. Data normalization is very useful for various types of analysis, for example through batch normalization in neural networks [4].
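The quantile normalization described above can be sketched with an empirical distribution function (EDF): each sample is mapped to its rank divided by n + 1, giving values nearly uniform on (0, 1). This is a minimal illustration assuming NumPy; the function name `edf_normalize` is hypothetical, not from the paper.

```python
import numpy as np

def edf_normalize(x):
    """Map samples to approximate quantiles in (0, 1) via the
    empirical distribution function: rank / (n + 1)."""
    x = np.asarray(x, dtype=float)
    # double argsort yields the rank of each element (0-based), so add 1
    ranks = np.argsort(np.argsort(x)) + 1
    return ranks / (len(x) + 1)

rng = np.random.default_rng(0)
samples = rng.normal(loc=3.0, scale=2.0, size=1000)  # arbitrary non-uniform data
u = edf_normalize(samples)
# u lies strictly inside (0, 1) and is approximately uniform,
# regardless of the mean and scale of the input distribution
```

Unlike mean/std standardization, this transform is invariant to any monotone distortion of the input, which is what makes the resulting representation simpler and harder to overfit.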
Jul-21-2025