Abstract: This work focuses on the implementation and investigation of a novel pruning method for deep convolutional neural networks, DPIREC (Dynamic Pruning by Importance of Random Excluded Channels), which enables pruning directly during training. The core idea of the method is to apply random masks to the channels of convolutional layers at each training iteration, temporarily excluding certain channels from the training process. The importance values of the channels used in the current step are then updated. Repeating these steps over multiple training epochs identifies the channels with the least impact on the loss function, which can then be permanently removed from the pruned network. We conducted a comprehensive analysis of approaches to assessing parameter relevance and, based on it, proposed a new approach that accounts for the dynamics of changes in model accuracy. Comparative experiments demonstrated the superiority of the DPIREC method over several existing techniques: when pruning the ResNet18 convolutional neural network by 40% during training on the CIFAR-100 dataset, the accuracy loss was only 0.89%.
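The mask-and-score loop described above can be illustrated on a toy linear "model". Everything here is a hedged sketch under stated assumptions, not the authors' implementation: the function names, the unit-input loss proxy, and the scoring rule (crediting each step's loss only to the channels that stayed active) are illustrative stand-ins for the paper's actual importance estimate, which tracks the dynamics of model accuracy during real training.

```python
import random

def dpirec_importance(weights, steps=5000, keep_prob=0.7, seed=0):
    """Toy sketch of the DPIREC idea (hypothetical, not the paper's code):
    each step a random mask temporarily excludes some channels; the masked
    model's loss is then credited to the channels that stayed active, so
    channels whose presence coincides with low loss accumulate high scores."""
    rng = random.Random(seed)
    n = len(weights)
    target = sum(weights)            # toy "full model" output on a unit input
    score_sum = [0.0] * n
    active_count = [0] * n
    for _ in range(steps):
        # Keep each channel with probability keep_prob, exclude the rest.
        mask = [rng.random() < keep_prob for _ in range(n)]
        if not any(mask):
            continue
        pred = sum(w for w, m in zip(weights, mask) if m)
        loss = (pred - target) ** 2  # error caused by the excluded channels
        for i in range(n):
            if mask[i]:              # update only channels used in this step
                score_sum[i] -= loss
                active_count[i] += 1
    # Average score over the steps in which each channel was active.
    return [s / max(c, 1) for s, c in zip(score_sum, active_count)]

def prune_lowest(scores, ratio):
    """Indices of the lowest-scoring `ratio` share of channels to remove."""
    k = int(len(scores) * ratio)
    return sorted(sorted(range(len(scores)), key=lambda i: scores[i])[:k])
```

On a toy layer with two strong and two weak channels, the low-magnitude channels end up with the lowest averaged scores and are the ones selected for permanent removal:

```python
scores = dpirec_importance([5.0, 0.1, 3.0, 0.05])
print(prune_lowest(scores, 0.5))  # the two weak channels: [1, 3]
```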