Vol.18, No.6, 2022, pp.1667-1682, doi:10.32604/fdmp.2022.019768
An Analysis of the Factors Influencing Cavitation in the Cylinder Liner of a Diesel Engine
Dehui Tong1,2, Shunshun Qin1,2,*, Quan Liu1,2, Yuhan Li3, Jiewei Lin2,3
1 State Key Laboratory of Engine Reliability, Weifang, 261061, China
2 Weichai Power Co., Ltd., Weifang, 261061, China
3 State Key Laboratory of Engines, Tianjin University, Tianjin, 300350, China
* Corresponding Author: Shunshun Qin. Email:
Received 13 October 2021; Accepted 22 February 2022; Issue published 27 June 2022
Abstract: Avoiding cavitation inside the water jacket is one of the most important issues in the proper design of a diesel engine's cylinder liner. Using CFD simulations based on a mixture multiphase approach, moving-grid technology, and a near-wall cavitation model, the present study investigates the factors and fluid-dynamic patterns that influence cavitation from both macroscopic and mesoscopic perspectives. Several factors are examined, namely: wall vibration, water jacket width, initial cavitation bubble radius, coolant temperature, and number of bubbles. The results show that reducing the cylinder liner vibration intensity significantly weakens cavitation. Similarly, increasing the water jacket width helps to avoid cavitation. Increasing the coolant temperature reduces the microjet velocity associated with bubble collapse, whereas increasing the number of bubbles produces a much larger water hammer pressure that can cause greater damage to the cylinder liner.
Keywords: Cavitation; cavitation dynamics; diesel engine; two-phase flow; water hammer
Cite This Article
Tong, D., Qin, S., Liu, Q., Li, Y., Lin, J. (2022). An Analysis of the Factors Influencing Cavitation in the Cylinder Liner of a Diesel Engine. FDMP-Fluid Dynamics & Materials Processing, 18(6), 1667–1682.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.