Cold start / coolant temperature control
This may mark me as some kind of ancient relic, but at one time the height of engine sophistication was the part-throttle weakening device. Designed to improve fuel economy by running the engine lean whenever the carburettor throttle plate was away from wide-open throttle, this ingenious instrument was perhaps the pinnacle of engine management technology in its day. Injection systems existed but were predominantly mechanical in nature, very temperamental, and gave very poor air/fuel ratio control. Little wonder that when emission legislation started to gather momentum, the search was soon on for an efficient and reliable method of fuelling an engine that was not only technically advanced but cheap too. Single-point injection made way for multipoint injection systems, and then eventually for systems with multiple injectors per cylinder. Although it seems these have been around a long, long time, much of what we readily accept today has come about only in the last 20 or 25 years.
But the key word in all of this is reliability, and in particular the reliability of the engine and exhaust emission sensors. Placed in some of the most inhospitable surroundings, these electronic components were not only expected to cope with all manner of under-bonnet grime, temperature and vibration, but also to perform 100%, all of the time, to levels of reliability hitherto undreamed of.
One such sensor is the engine coolant temperature sensor, or the ECT, as it is more widely known. Used to control the engine during warm-up, ECTs are essentially thermistors (semiconductor resistors) inside a hollow, sealed brass tube. Having a negative temperature coefficient (NTC) and a large resistance at cold ambient temperatures, these thermistors drop significantly in resistance as the coolant approaches the normal working temperature of the engine. They are therefore ideal for controlling the amount of fuel required during the warm-up phase.
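The steep fall-off in NTC resistance can be sketched with the common Beta-model approximation. The nominal resistance and Beta value below are illustrative only and do not correspond to any particular ECT part:

```python
import math

def ntc_resistance(temp_c, r25=2500.0, beta=3900.0):
    """Approximate NTC thermistor resistance in ohms using the Beta model:
    R(T) = R25 * exp(beta * (1/T - 1/T25)), with T in kelvin.
    r25 (resistance at 25 degC) and beta are made-up illustrative values."""
    t = temp_c + 273.15
    t25 = 25.0 + 273.15
    return r25 * math.exp(beta * (1.0 / t - 1.0 / t25))

# Resistance falls by two orders of magnitude between a cold morning
# and normal operating temperature:
for temp in (-20, 0, 25, 90):
    print(f"{temp:>4} degC -> {ntc_resistance(temp):>8.0f} ohm")
```

That large, monotonic swing is what makes the sensor easy for the ECM to read with a simple voltage divider and an ADC.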
An engine starting from cold will require an enriched fuel/air mixture. This compensates for the condensation of liquid fuel on the 'cold' surfaces of the intake tract and the piston crown/combustion chamber, and ensures that a combustible mixture is presented at the spark plug for first fire. During cranking, most ECMs will take a signal from the ECT and, running 'open loop', inject about 30 to 60% more fuel than normally required, according to some pre-determined algorithm established by either the equipment supplier or the engine manufacturer. Plenty of fuel is added on initial cranking, and this then trails away after a couple of seconds to ensure that, should the engine not fire, it isn't flooded. Once the engine does fire (often defined by the engine RPM exceeding a threshold value, generally around 400 RPM), the engine moves into its after-start enrichment strategy.
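The cranking logic described above can be sketched as follows. The initial enrichment figure, the decay time and the linear trail-off are all illustrative assumptions; a real calibration would come from the supplier's or manufacturer's own algorithm:

```python
FIRE_RPM_THRESHOLD = 400  # engine considered "fired" above roughly this speed

def cranking_fuel_pct(time_s, initial_pct=45.0, decay_s=2.0):
    """Hypothetical extra fuel (%) during cranking: starts within the
    30-60% band quoted in the text and trails linearly to zero over a
    couple of seconds, so a non-firing engine is not flooded."""
    return max(0.0, initial_pct * (1.0 - time_s / decay_s))

def engine_has_fired(rpm):
    """Simple fire-detect: RPM above the cranking-speed threshold."""
    return rpm > FIRE_RPM_THRESHOLD
```

Once `engine_has_fired()` returns true, control would hand over to the after-start enrichment strategy rather than the cranking schedule.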
Beginning as a user-defined enrichment (typically around 20%), this after-start strategy slowly ramps down to zero over typically two hundred or so trigger events, finally gliding into the coolant temperature correction map. From then on the coolant temperature sensor is in control, eventually deciding when closed-loop control of the fuel/air mixture (should you wish to use it) takes over, as well as governing many other temperature-related events.
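A minimal sketch of that hand-over might look like the following. The 20% starting figure and 200-event ramp come from the text; the coolant map breakpoints, the linear interpolation between them, and the use of `max()` to blend the two corrections are all assumptions for illustration:

```python
def after_start_pct(events, start_pct=20.0, ramp_events=200):
    """After-start enrichment (%) ramping linearly from start_pct to
    zero over roughly 200 trigger events."""
    if events >= ramp_events:
        return 0.0
    return start_pct * (1.0 - events / ramp_events)

# Hypothetical coolant temperature correction map: (degC, % extra fuel).
# Breakpoints are made up; real maps are calibrated on the engine.
COOLANT_MAP = [(-20, 40.0), (0, 25.0), (20, 12.0), (60, 4.0), (90, 0.0)]

def coolant_correction_pct(temp_c):
    """Linearly interpolate the coolant correction map."""
    if temp_c <= COOLANT_MAP[0][0]:
        return COOLANT_MAP[0][1]
    for (t0, p0), (t1, p1) in zip(COOLANT_MAP, COOLANT_MAP[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return COOLANT_MAP[-1][1]

def warmup_fuel_pct(events, temp_c):
    """Take whichever correction is larger, so the after-start ramp
    'glides down into' the coolant temperature map."""
    return max(after_start_pct(events), coolant_correction_pct(temp_c))
```

Taking the larger of the two values is one simple way to model the glide: early on the after-start ramp dominates, and once it has decayed the coolant map alone sets the enrichment until the engine is fully warm.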
In slightly more sophisticated systems, coolant temperature corrections can be made not only to the fuelling but also to the ignition map, to compensate for the slower flame speed of rich mixtures. The ECT is often called the master sensor, and it is little wonder that its robustness and reliability are of paramount importance.
From a time when the only cold start control device was the carburettor ‘choke,’ engine fuelling technology has moved a long way.
Written by John Coxon.