Open Access

ARTICLE

Modeling Pruning as a Phase Transition: A Thermodynamic Analysis of Neural Activations

Rayeesa Mehmood*, Sergei Koltcov, Anton Surkov, Vera Ignatenko

Laboratory for Social & Cognitive Informatics, National Research University Higher School of Economics, Sedova St. 55/2, Saint Petersburg, 192148, Russia

* Corresponding Author: Rayeesa Mehmood.

(This article belongs to the Special Issue: Advances in Deep Learning and Neural Networks: Architectures, Applications, and Challenges)

Computers, Materials & Continua 2026, 86(3), 99. https://doi.org/10.32604/cmc.2025.072735

Abstract

Activation pruning reduces neural network complexity by eliminating low-importance neuron activations, yet identifying the critical pruning threshold—beyond which accuracy rapidly deteriorates—remains computationally expensive and typically requires exhaustive search. We introduce a thermodynamics-inspired framework that treats activation distributions as energy-filtered physical systems and employs the free energy of activations as a principled evaluation metric. Phase-transition–like phenomena in the free-energy profile—such as extrema, inflection points, and curvature changes—yield reliable estimates of the critical pruning threshold, providing a theoretically grounded means of predicting sharp accuracy degradation. To further enhance efficiency, we propose a renormalized free energy technique that approximates full-evaluation free energy using only the activation distribution of the unpruned network. This eliminates repeated forward passes, dramatically reducing computational overhead and achieving speedups of up to 550× for MLPs. Extensive experiments across diverse vision architectures (MLP, CNN, ResNet, MobileNet, Vision Transformer) and text models (LSTM, BERT, ELECTRA, T5, GPT-2) on multiple datasets validate the generality, robustness, and computational efficiency of our approach. Overall, this work establishes a theoretically grounded and practically effective framework for activation pruning, bridging the gap between analytical understanding and efficient deployment of sparse neural networks.
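The sketch below is a minimal, illustrative reading of the abstract's idea, not the paper's actual method: it assumes a Boltzmann-style free energy F = -(1/β) log Z computed over activation magnitudes treated as energy levels, sweeps magnitude-pruning thresholds using only the unpruned network's activation distribution (mirroring the "renormalized" single-pass idea), and flags the threshold with the strongest curvature change in the free-energy profile as a candidate critical pruning point. The energy definition, temperature β, and curvature criterion are all assumptions for illustration.

```python
import numpy as np

def free_energy(activations, beta=1.0):
    """Boltzmann-style free energy F = -(1/beta) * log Z, treating
    activation magnitudes as energy levels. Illustrative choice only;
    the paper's exact energy definition may differ."""
    energies = np.abs(activations)
    z = -beta * energies
    # log-sum-exp for numerical stability
    log_Z = np.max(z) + np.log(np.sum(np.exp(z - np.max(z))))
    return -log_Z / beta

def free_energy_profile(activations, thresholds, beta=1.0):
    """Free energy of the surviving activations at each magnitude-pruning
    threshold, computed from the unpruned network's activation
    distribution alone (no repeated forward passes)."""
    profile = []
    for t in thresholds:
        kept = activations[np.abs(activations) >= t]
        profile.append(free_energy(kept, beta) if kept.size else np.nan)
    return np.array(profile)

def estimate_critical_threshold(thresholds, profile):
    """Pick the threshold where the profile's curvature (second derivative)
    is largest in magnitude -- a phase-transition-like signature of
    rapid change in the free-energy profile."""
    curvature = np.gradient(np.gradient(profile, thresholds), thresholds)
    return thresholds[np.nanargmax(np.abs(curvature))]

# Toy usage with synthetic activations standing in for one trained layer.
rng = np.random.default_rng(0)
acts = rng.normal(0.0, 1.0, size=10_000)
ths = np.linspace(0.0, 3.0, 61)
prof = free_energy_profile(acts, ths)
print("estimated critical pruning threshold:", estimate_critical_threshold(ths, prof))
```

Because the profile is built from a single recording of unpruned activations, the sweep over thresholds costs only array filtering, which is the source of the large speedups the abstract reports relative to re-evaluating accuracy at every sparsity level.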

Keywords

Thermodynamics; activation pruning; model compression; sparsity; free energy; renormalization

Cite This Article

APA Style
Mehmood, R., Koltcov, S., Surkov, A., & Ignatenko, V. (2026). Modeling Pruning as a Phase Transition: A Thermodynamic Analysis of Neural Activations. Computers, Materials & Continua, 86(3), 99. https://doi.org/10.32604/cmc.2025.072735
Vancouver Style
Mehmood R, Koltcov S, Surkov A, Ignatenko V. Modeling Pruning as a Phase Transition: A Thermodynamic Analysis of Neural Activations. Comput Mater Contin. 2026;86(3):99. https://doi.org/10.32604/cmc.2025.072735
IEEE Style
R. Mehmood, S. Koltcov, A. Surkov, and V. Ignatenko, “Modeling Pruning as a Phase Transition: A Thermodynamic Analysis of Neural Activations,” Comput. Mater. Contin., vol. 86, no. 3, pp. 99, 2026. https://doi.org/10.32604/cmc.2025.072735



Copyright © 2026 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.