Heat transfer through valve guides
In most race engines, much of the heat transfer from the valve to the cooling medium – whether it be oil, water or air – is via the valve seat insert, then into the head and eventually to a fluid that will reject the heat to atmosphere.
In the case of the inlet valve, heat flux is only very modest compared to that in an exhaust valve. The inlet valve is kept cool by the flow of cool inlet charge over it, and very little heat is rejected anywhere other than through the valve seat. The exhaust valve is a different matter, with extremely high heat flux not only into the valve head but also into the region behind it. Here, there can be considerable heat transfer through the valve guide.
However, the proportion of the heat rejected through the valve guide, and so the total amount of heat transfer, is increased greatly when sodium cooling of the exhaust valve is used: the liquid sodium inside the hollow stem shakes back and forth with valve motion, shuttling heat from the hot valve head up the stem and into the guide. The result of the higher overall heat transfer is that the valve operates at lower temperatures, which generally means that the allowable stresses for the valve increase. Whether the design and development engineers take advantage of this by reducing valve cross-section, or simply accept a greater factor of safety, is a choice for them to make.
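The effect of sodium cooling on the share of heat rejected through the guide can be pictured as two parallel thermal-resistance paths from the valve head to the coolant: one through the seat, one up the stem and into the guide. The following is a minimal sketch of that idea; all resistance values are illustrative assumptions, not measurements from any real valve.

```python
# Two parallel heat-rejection paths from an exhaust valve head:
# through the seat, and up the stem into the guide.
# All thermal resistances (K/W) are assumed, illustrative values.

def guide_heat_fraction(r_seat, r_stem, r_guide):
    """Fraction of total heat rejected via the guide path for a
    simple parallel thermal-resistance network."""
    q_seat = 1.0 / r_seat             # heat flow per kelvin of delta-T
    q_guide = 1.0 / (r_stem + r_guide)
    return q_guide / (q_seat + q_guide)

# Solid stem: high conduction resistance along the stem (assumed).
solid = guide_heat_fraction(r_seat=2.0, r_stem=8.0, r_guide=2.0)

# Sodium-cooled stem: the sodium 'shaker' effect greatly lowers the
# effective stem resistance (assumed value).
sodium = guide_heat_fraction(r_seat=2.0, r_stem=1.0, r_guide=2.0)

print(f"guide share, solid stem:  {solid:.0%}")   # ~17% with these values
print(f"guide share, sodium stem: {sodium:.0%}")  # ~40% with these values
```

With these assumed numbers the guide's share of the total heat rejection more than doubles, which is consistent with the article's point that sodium cooling greatly increases heat transfer through the guide.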
There are other benefits to having this greater heat transfer through the valve guide. As the valve head runs cooler, the instantaneous contraction as it comes into contact with the relatively cold valve seat is reduced. This in turn reduces the tensile stresses in the periphery of the valve due to the seat area trying to contract around a hot ‘core’. So, in addition to the increased allowable stresses, we can lower the tensile stress in the periphery of the exhaust valve head during part of the valve’s operating cycle by rejecting more heat through the valve guide by using sodium cooling.
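The magnitude of the tensile stress set up when the cooler periphery tries to contract around a hot core can be estimated with the standard constrained thermal-stress relation, stress ≈ E·α·ΔT. The sketch below uses material properties loosely typical of an austenitic valve steel and assumed temperature differences; it is an order-of-magnitude illustration, not data for any specific valve.

```python
# Order-of-magnitude estimate of thermal stress from a periphery-to-core
# temperature difference, using sigma = E * alpha * delta_T (fully
# constrained case). All values are assumptions for illustration.

E = 200e9      # Young's modulus, Pa (assumed)
ALPHA = 18e-6  # coefficient of thermal expansion, 1/K (assumed)

def thermal_stress_mpa(delta_t_k):
    """Fully constrained thermal stress in MPa for a given
    temperature difference across the valve head."""
    return E * ALPHA * delta_t_k / 1e6

# Sodium cooling lowers head temperature, shrinking the instantaneous
# temperature difference at seat contact (both delta-T values assumed).
print(f"hot valve,    dT = 100 K: {thermal_stress_mpa(100):.0f} MPa")
print(f"cooled valve, dT =  60 K: {thermal_stress_mpa(60):.0f} MPa")
```

Even with these rough numbers, the stresses are a substantial fraction of the material's strength at temperature, which is why reducing the transient temperature difference at seat contact is worthwhile.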
The inlet valve does not, however, benefit to the same extent from increased heat transfer through the valve guide, although the effects noted above still apply. The real benefit for an inlet valve is likely to come from the reduced valve temperature. There is heat exchange between the valve and the inlet charge: the charge cools the valve, and the valve inevitably heats the charge, which has a small but deleterious effect on charge temperature. The slightly increased charge temperature gives rise to lower charge density and lower volumetric efficiency.
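The density penalty from charge heating follows directly from the ideal gas law: at constant pressure, density is proportional to 1/T. The sketch below puts an illustrative number on it; the 10 K pickup from a hot inlet valve is an assumption for the example, not a measured figure.

```python
# Hedged sketch: charge-density loss from inlet-charge heating, using
# the ideal gas law at constant pressure (density proportional to 1/T).
# The temperatures below are assumed values for illustration.

def density_ratio(t_cold_k, t_hot_k):
    """Hot-to-cold charge density ratio for an ideal gas at constant
    pressure: rho_hot / rho_cold = T_cold / T_hot."""
    return t_cold_k / t_hot_k

t_charge = 313.0            # assumed charge temperature, K (40 C)
t_heated = t_charge + 10.0  # assumed 10 K pickup from a hot inlet valve

loss = 1.0 - density_ratio(t_charge, t_heated)
print(f"charge density loss: {loss:.1%}")  # roughly a 3% density penalty
```

A roughly 3% loss in charge density translates, to first order, into a similar loss in volumetric efficiency, which is why even modest reductions in inlet-valve temperature are worth having.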
So, there is an argument for increasing heat transfer through the inlet valve guide, but probably more for reasons of slightly increasing performance than reliability. There isn’t strong evidence of this having been used widely in racing. Some of the 1950s works Norton motorcycles used sodium-cooled inlets, but whether Norton’s reasoning was to improve performance or to try to solve a reliability problem isn’t known.
In terms of reliability and performance, the use of sodium-cooled inlet valves is likely to look more attractive where the inlet valve runs hot, and the increasing use of turbocharged engines might see such valves resurrected in modern race engines.
Written by Wayne Ward