At 300 ohms sensor resistance the temp gauge would be on the Cold peg. My NOS D0WY-10884-A (70 351C) temp sensor matches this curve that GAH plotted. It measures 300 ohms at room temp, 39 ohms at 190 deg, and 27 ohms at 212 deg.
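For what it's worth, those three points fall close to a simple NTC thermistor beta model. A quick sketch in Python (the beta value here is derived from the room-temp and 212 deg endpoints, not from any Ford spec, and "room temp" is assumed to be about 72 deg):

```python
import math

def f_to_k(f):
    """Convert degrees Fahrenheit to kelvin."""
    return (f - 32) / 1.8 + 273.15

# Measured points from the D0WY-A sender (room temp assumed ~72 F)
r_room, t_room = 300.0, f_to_k(72)
r_hot,  t_hot  = 27.0,  f_to_k(212)

# Beta model: R(T) = R_hot * exp(B * (1/T - 1/T_hot)); solve B from the endpoints
beta = math.log(r_room / r_hot) / (1 / t_room - 1 / t_hot)

def resistance(t_f):
    """Predicted sender resistance in ohms at a Fahrenheit temperature."""
    return r_hot * math.exp(beta * (1 / f_to_k(t_f) - 1 / t_hot))

print(round(beta))             # roughly 3400 K
print(round(resistance(190)))  # ~37 ohms, vs. 39 ohms measured
```

The model predicts about 37 ohms at 190 deg against the measured 39, which is well inside the 10% tolerance discussed below.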
The gauges are 10 to 73 ohm devices, so any value above 73 ohms is outside the operating range of the gauge. At 73 ohms the gauge must read no higher than right on the empty mark, to about the width of the needle below empty. With no power the gauge should drop lower still.
The idea is that when powered, the gauge moves up to empty, cold, or low oil pressure, and when the power is turned off it drops below that.
All Ford gauges are calibrated at the full position. How they read below that is a mechanical relationship based on degrees of arc of the internal resistor in the fuel and oil pressure gauges and the shape of the curve of the thermistor in the temperature sender.
Senders have slightly different specifications than gauges.
The senders have an operating range of 10 to about 60 ohms. These resistance values are plus or minus about 10%. So at the low end, 10 ohms, 1 ohm matters. At the high end, not so much; anywhere from 54 to 66 ohms is typical.
To get the gauge to read full, hot, or high oil pressure, the sender must measure no higher than 10 ohms. In practice it really needs to be just under 10 ohms, to accommodate the series resistance of the harness and connections between the sender and the gauge. The sender should be very close to 23 ohms at the center position. The equivalent temperature for '65 to '70 Ford products at midpoint is 192 degrees. Oil pressure depends on the sender: the 90 psi senders tend to read 23 ohms at about 40 psi, while the 80 psi senders read 23 ohms at around 35 psi or less.
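As an illustration of the numbers above, here is a rough Python sketch that interpolates a gauge position from sender resistance through the stated calibration points (10 ohms = full, 23 ohms = center, 73 ohms = empty). The real gauge response between those points is set by the bimetallic mechanism and is not truly linear, so treat this as an approximation only:

```python
# Calibration points as (ohms, position), where 0 = empty/cold and 1 = full/hot.
# Linear interpolation between points is an assumption for illustration;
# the actual bimetallic gauge response is nonlinear.
CAL = [(10.0, 1.0), (23.0, 0.5), (73.0, 0.0)]

def gauge_position(ohms):
    """Approximate gauge needle position (0..1) for a sender resistance."""
    if ohms <= CAL[0][0]:
        return 1.0   # at or below 10 ohms the gauge reads full
    if ohms >= CAL[-1][0]:
        return 0.0   # above 73 ohms is outside the gauge's operating range
    for (r1, p1), (r2, p2) in zip(CAL, CAL[1:]):
        if r1 <= ohms <= r2:
            return p1 + (p2 - p1) * (ohms - r1) / (r2 - r1)

print(gauge_position(23))  # 0.5, the center position
```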
All sender readings are subject to the amount of power in the circuit. The circuit is powered by the Instrument Voltage Regulator (IVR). It puts out pulses of 12 to 15 volts, depending on the charging voltage level of the car. These pulses heat a bimetallic strip that opens a set of contacts, breaking the circuit. Because we are converting power into heat, as voltage (and therefore power) goes up, the pulses get shorter, maintaining a steady amount of power going into the gauge circuit. The gauges also operate by heating a bimetallic strip, so they are constantly damping the effect of voltage changes as well. This also means you need to give a gauge about 30 seconds to fully settle after making a change in resistance at the sender. Most of the errors in measurement come from this delay.
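The constant-power behavior can be sketched with a little math: heater power goes as voltage squared, so the fraction of time the contacts stay closed has to fall as the square of the supply voltage. A hypothetical illustration, assuming the commonly quoted effective output of about 5 volts (that figure is an assumption here, not a Ford spec):

```python
# Heater power is V^2/R. Averaged over a pulse cycle with on-time fraction d,
# P_avg = d * V^2 / R, which equals V_eff^2 / R when d = (V_eff / V)^2.
V_EFFECTIVE = 5.0  # assumed nominal effective IVR output voltage

def duty_cycle(v_supply):
    """On-time fraction needed to hold average heater power constant."""
    return (V_EFFECTIVE / v_supply) ** 2

for v in (12.0, 13.5, 15.0):
    print(f"{v:4.1f} V supply -> {duty_cycle(v):.0%} duty cycle")
# e.g. at 12 V the contacts are closed about 17% of the time,
# but only about 11% of the time at 15 V
```

This is why a higher charging voltage produces shorter pulses, and why an ordinary meter, which reads something between peak and average, can't tell you much about the actual power delivered.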
It is also worth noting that your digital meter (unless it has a very advanced averaging function) cannot tell you how much power the IVR is putting out. An analog meter will give you a nice wagging pointer deflection, but that meter is damped as well, so it will be highly inaccurate.
If the IVR is putting out too much power, all the gauges will read higher than expected. Original IVRs can be adjusted. Either way, the practical way to test and adjust is with the gauge tester.
Bill, I'm curious: what are you seeing on your temp sensor test jig? The reason I ask is that Royce, 70XR7Tom, and I have all measured NOS Autolite sensors with similar results, and none of them measure 23 ohms until above 200 degrees.
C6DZ-B (Royce)
240 ohms @ ambient
50 ohms @ 170 deg
22 ohms @ 210 deg
D0ZZ-A (70XR7Tom)
22 ohms @ 212 deg
D0WY-A (Calicat)
300 ohms @ ambient
54 ohms @ 170 deg
30 ohms @ 205 deg (this is halfway on my temp gauge, and about where it runs on a summer day)
27 ohms @ 212 deg
My temp gauge reads exactly at C and H on the L and H settings of your gauge tester, and at 9/16 on the M setting. So I don't think my IVR is off by much.
Sounds like your gauge is slightly out of calibration.
Ford originally offered dealers the Burroughs gauge tester and included the correct value for testing Ford gauges. Later they supplied a Rotunda gauge tester and the literature included with it restated the same values.
So the numbers I am sharing are not the result of measurement but of design engineering. And keep in mind these are all 10% tolerance devices. I was doing a lot of hand-wringing over this, and my MIT PhD EE genius cousin put it instantly in perspective: not rated for space flight.
My gauge being slightly out of cal wouldn't surprise me. But it still puzzles me why 3 separate NOS sensors are 30% off from that 23-ohms-at-192-deg spec. Could it be that the sensors are designed to read mid range at over 200 degrees rather than where the thermostat opens? This seems possible given that 50% antifreeze at 13 psi doesn't boil until 265 deg, and these cooling systems did not have enough cooling capacity to keep the coolant at 192 deg on a hot day.
Edit - I have found several other examples of Ford temp sensors from this time frame that read 22-23 ohms at around 210 degrees. Knowing 23 ohms is intended to be 1/2 on the gauge, 210-degree coolant temp is close to where Ford intended the gauges to read 1/2.
I'm using that "not rated for space flight" from now on. Brilliant!
