EFFECTIVE NEURAL NETWORKS TRAINING BASED ON INITIAL SOLUTION SELECTION

V. KRASNOPROSHIN
V. MATSKEVICH

Abstract

The paper deals with the problem of neural network training. A definition of optimization-space inhomogeneity is introduced and a criterion for its existence is proved. It is shown that the speed and quality of a training algorithm depend, in particular, on the choice of the initial solution. An initial-solution generation procedure is proposed that takes into account the inhomogeneity property of the space, the specifics of the problem being solved, and other factors, which makes it possible to improve training quality. It is shown that the procedure can be applied to any neural network architecture and any input data. Results of experiments on solving an applied problem are presented, confirming the efficiency of the proposed approach to training.
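The abstract's claim that training quality depends on the choice of the initial solution can be illustrated with a minimal sketch (this is not the authors' procedure, only a standard baseline observation): with a naive fixed-scale random initialization, activations in a deep network collapse toward zero with depth, while fan-in-scaled (Xavier/Glorot-style) initialization keeps their variance roughly stable. All names and parameters below are illustrative assumptions.

```python
import numpy as np

def last_layer_std(layer_sizes, init_scale, rng):
    """Propagate random inputs through a deep tanh network and
    return the standard deviation of the last layer's activations."""
    x = rng.standard_normal((layer_sizes[0], 100))  # 100 random inputs
    for fan_in, fan_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.standard_normal((fan_out, fan_in)) * init_scale(fan_in)
        x = np.tanh(W @ x)
    return x.std()

rng = np.random.default_rng(0)
sizes = [256] * 10  # a 10-layer network, 256 units per layer

# Naive fixed-scale init: signal shrinks layer by layer.
naive = last_layer_std(sizes, lambda fan_in: 0.01, rng)
# Xavier/Glorot-style init (scale ~ 1/sqrt(fan_in)): signal survives.
xavier = last_layer_std(sizes, lambda fan_in: (1.0 / fan_in) ** 0.5, rng)

print(f"naive init:  last-layer activation std = {naive:.6f}")
print(f"xavier init: last-layer activation std = {xavier:.6f}")
```

A gradient-based optimizer started from the "naive" point sees nearly zero signal and gradients, which is one concrete way an initial solution can land in an unfavorable region of the optimization space.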

Article Details

How to Cite
KRASNOPROSHIN, V., & MATSKEVICH, V. (2026). EFFECTIVE NEURAL NETWORKS TRAINING BASED ON INITIAL SOLUTION SELECTION. Vestnik of Polotsk State University. Part C. Fundamental Sciences, (1), 2-9. https://doi.org/10.52928/2070-1624-2026-46-1-2-9
Author Biographies

V. KRASNOPROSHIN, Belarusian State University, Minsk

Doctor of Technical Sciences, Professor

V. MATSKEVICH, Belarusian State University, Minsk

Candidate of Technical Sciences

References

Narkhede M. V., Bartakke P. P., Sutaone M. S. A review on weight initialization strategies for neural networks // Artificial Intelligence Review. Springer. – 2022. – Vol. 55. – P. 291–322. – DOI: 10.1007/s10462-021-10033-z.

Gradinit: Learning to initialize neural networks for stable and efficient training / C. Zhu, R. Ni, Z. Xu et al. // Advances in Neural Information Processing Systems. – 2021. – Vol. 34. – P. 16410–16422. – DOI: 10.48550/arXiv.2102.08098.

de Sá G. A. G., Fontes C. H., Embiruçu M. A new method for building single feedforward neural network models for multivariate static regression problems: a combined weight initialization and constructive algorithm // Evolutionary Intelligence. Springer. – 2024. – Vol. 17, iss. 2. – P. 1221–1233. – DOI: 10.1007/s12065-022-00813-z.

Ejaz S., Khurshid K. Power Spectral Density Based Weight Initialization Technique for Feed-Forward Neural Network // 2025 International Conference on Emerging Technologies in Electronics, Computing, and Communication (ICETECC). IEEE. – 2025. – P. 1–6. – DOI: 10.1109/ICETECC65365.2025.11070241.

Artificial neural networks training algorithm integrating invasive weed optimization with differential evolutionary model / A. A. Movassagh, J. A. Alzubi, M. Gheisari et al. // Journal of Ambient Intelligence and Humanized Computing. Springer. – 2023. – Vol. 14. – P. 6017–6025. – DOI: 10.1007/s12652-020-02623-6.

Kaveh M., Mesgari M. S. Application of Meta-Heuristic Algorithms for Training Neural Networks and Deep Learning Architectures: A Comprehensive Review // Neural Processing Letters. Springer. – 2023. – Vol. 55. – P. 4519–4622. – DOI: 10.1007/s11063-022-11055-6.

Abdulkadirov R., Lyakhov P., Nagornov N. Survey of optimization algorithms in modern neural networks // Mathematics. MDPI. – 2023. – Vol. 11, iss. 11. – DOI: 10.3390/math11112466.

Reyad M., Sarhan A. M., Arafa M. A modified Adam algorithm for deep neural network optimization // Neural Computing and Applications. Springer. – 2023. – Vol. 35, iss. 23. – P. 17095–17112. – DOI: 10.1007/s00521-023-08568-z.

Naidu G., Zuva T., Sibanda E. M. A Review of Evaluation Metrics in Machine Learning Algorithms // Artificial Intelligence Application in Networks and Systems. CSOC 2023. Lecture Notes in Networks and Systems. Springer, Cham. – 2023. – Vol. 724. – P. 15–25. – DOI: 10.1007/978-3-031-35314-7_2.

Heydarian M., Doyle T. E., Samavi R. MLCM: Multi-Label Confusion Matrix // IEEE Access. – 2022. – Vol. 10. – P. 19083–19095. – DOI: 10.1109/ACCESS.2022.3151048.

Matskevich V. V. Choice of the initial approximation in neural network training problems // Information Systems and Technologies: Proc. of the XI Int. Sci. Congress on Computer Science (CSIST-2025), Minsk, Oct. 29–31, 2025: in 2 parts / Belarusian State Univ.; ed. board: S. V. Ablameyko (ed.-in-chief) et al. – Minsk, 2025. – Part 2. – P. 196–203. (In Russian).

Matskevich V. V. On the efficiency of neural network training // Proc. of the XXV Int. Sci.-Tech. Conf. "Problems of Informatics in Education, Management, Economics and Engineering" (dedicated to the 100th anniversary of PPI rector Nikolai Petrovich Sergeev and the 80th anniversary of the Victory in the Great Patriotic War), Penza, Russia, Oct. 31 – Nov. 1, 2025 / Penza State Univ. – Penza, 2025. – P. 47–55. (In Russian).

Landscape’s non-natural changes detection system by satellites images based on local areas / X. Zhou, Q. Bu, V. Matskevich, A. Nedzved // Pattern Recognition and Image Analysis. Advances in Mathematical Theory and Applications. – Vol. 34, iss. 2. – Pleiades Publishing, 2024. – P. 365–378. – DOI: 10.1134/S1054661824700159.

Krasnoproshin V. V., Matskevich V. V. Random search in neural networks training // Pattern Recognition and Image Analysis. Advances in Mathematical Theory and Applications. – Vol. 34, iss. 2. – Pleiades Publishing, 2024. – P. 309–316. – DOI: 10.1134/S105466182470010X.