You can achieve accurate resistance-temperature-detector sensor measurements without resorting to a precision current source.
Traditionally, resistance-temperature-detector (RTD) sensor resistance is measured by applying a precision current source and measuring the developed voltage. This approach usually requires a precision voltage reference to create the current source, followed by a high-quality analog-to-digital converter (ADC) to measure the voltage.
This isn’t difficult to achieve at room temperature, but when you consider that the temperature of your measuring system can be in the range of −40 to +55 °C, the task becomes more daunting.
A brute-force approach to this problem would be to combine an expensive temperature-stable voltage reference, ADC, and other components with software calibration to compensate for the temperature drift of parameters. This approach is complex and still fails to achieve precision approaching the sensor's own accuracy.
A better approach uses ultra-stable 5-ppm/°C resistors with 0.1% accuracy as the reference for RTD measurements. It requires two on-board ultra-stable calibration resistors (1 kΩ and 2 kΩ) to achieve high RTD precision; these resistors are used to calibrate the RTD reading and compensate for temperature-drift errors.
The design uses the Q1–Q3 transistors in combination with the R1 resistor to form a constant-current source that sources about 1 mA from a 2.5-V ADC voltage reference (see Figure 1). Calibration resistors R4 and R5, along with the RT1 and RT2 RTD sensors, sink this current when the corresponding GPIO pin is driven low; when not in use, the GPIO pins are tri-stated. The developed voltage is measured by the ADC.
Figure 1.  The Q1–Q3 transistors and R1 resistor form a constant-current source that sources about 1 mA from a 2.5-V ADC voltage reference.
For calibration, we read the two calibration resistors and calculate the constant-current source value and the combined errors, which we call V_{OFFSET}. The calibrated I_{CC} and V_{OFFSET} values are then used to convert RTD readings into temperature.
Table 1.  Calibration results at different board temperatures.


The calibration results (see Table 1) are applied using the following formula:
(1) R_{RTD} = (V_{ADC} − V_{OFFSET}) / I_{CC}
where:
R_{RTD} is the measured RTD resistance;
V_{ADC} is the ADC voltage reading;
I_{CC} is the approximately constant current-source value;
V_{OFFSET} is the voltage offset of cumulative errors.
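As a sanity check, Equation 1 is a one-line conversion. The sketch below uses illustrative numbers (a roughly 1-mA source and a 10-mV cumulative offset), not values from Table 1:

```python
def rtd_resistance(v_adc, v_offset, i_cc):
    """Equation 1: convert an ADC voltage reading to RTD resistance (ohms)."""
    return (v_adc - v_offset) / i_cc

# Illustrative values only: 1-mA source, 10-mV offset,
# 1.11 V read across the RTD.
r = rtd_resistance(1.11, 0.010, 1.0e-3)
print(r)  # 1100.0 ohms
```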
Note that V_{OFFSET} is a combination of multiple error voltage sources. Therefore, it might be beneficial (though not necessary) to split it into its components for better precision.
To calculate V_{OFFSET} and I_{CC}, we need to make a few assumptions to derive the formula below:
 First assumption: Calibration resistors are ideal, with values of exactly 1,000 and 2,000 Ω, respectively.
 Second assumption: The I_{CC} current source is stable for the duration of measurements.
 Third assumption: ADC conversion results are perfect.
Following those assumptions, we can write that:
(2) VCAL_{1K} = I_{CC} × 1,000 Ω + V_{OFFSET}
    VCAL_{2K} = I_{CC} × 2,000 Ω + V_{OFFSET}
In Equation 2, VCAL_{1K} and VCAL_{2K} represent the voltages developed across the calibration resistors when the I_{CC} current is applied.
By solving those equations for V_{OFFSET} and I_{CC}, we get the following equations:
(3) V_{OFFSET} = 2 × VCAL_{1K} − VCAL_{2K}
(4) I_{CC} = (VCAL_{2K} − VCAL_{1K}) / 1,000 Ω
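Under the same three assumptions, the calibration math takes only a few lines. The readings in the sanity check below are synthetic, chosen to represent an ideal 1-mA source with a 5-mV offset; they are not taken from Table 1:

```python
def calibrate(vcal_1k, vcal_2k):
    """Recover V_OFFSET and I_CC from the two calibration-resistor
    readings, assuming ideal 1,000- and 2,000-ohm values."""
    v_offset = 2.0 * vcal_1k - vcal_2k   # Equation 3
    i_cc = (vcal_2k - vcal_1k) / 1000.0  # Equation 4
    return v_offset, i_cc

# Synthetic check: VCAL_1K = 1.005 V, VCAL_2K = 2.005 V
# should yield about 5 mV of offset and 1 mA of current.
v_offset, i_cc = calibrate(1.005, 2.005)
```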
Experimental Setup Measurements
The experimental setup has two calibration resistors and two RTDs mounted at different locations. We used an ADC with 10-bit resolution and surface-mount RTDs rated 1 kΩ at room temperature. Notice how the calibration values change as the board temperature changes between samples 2 and 3 in Table 1.
To gather the data, the software followed these steps:
 Read ADC voltage levels on calibration resistors and RTDs.
 Calculate V_{OFFSET} using Equation 3.
 Calculate I_{CC} with Equation 4.
 Determine RTD resistance utilizing Equation 1.
 Convert RTD resistance values to temperature using table lookup and piecewise interpolation.
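The steps above can be sketched as follows. The lookup table, function names, and voltage readings are illustrative assumptions, not the article's actual firmware or the sensor's full resistance-temperature table:

```python
import bisect

# Hypothetical fragment of an RTD resistance-to-temperature table
# (ohms -> degrees C); a real design would use the sensor's full table.
RTD_TABLE = [(920.0, -20.0), (1000.0, 0.0), (1039.0, 10.0), (1078.0, 20.0)]

def rtd_temperature(r_rtd):
    """Step 5: table lookup with piecewise-linear interpolation."""
    ohms = [r for r, _ in RTD_TABLE]
    i = bisect.bisect_left(ohms, r_rtd)
    i = min(max(i, 1), len(RTD_TABLE) - 1)  # clamp to the nearest segment
    (r0, t0), (r1, t1) = RTD_TABLE[i - 1], RTD_TABLE[i]
    return t0 + (t1 - t0) * (r_rtd - r0) / (r1 - r0)

def read_rtds(v_cal_1k, v_cal_2k, v_rtd_readings):
    """Steps 1-5 applied to one set of ADC voltage readings."""
    v_offset = 2.0 * v_cal_1k - v_cal_2k            # Equation 3
    i_cc = (v_cal_2k - v_cal_1k) / 1000.0           # Equation 4
    return [rtd_temperature((v - v_offset) / i_cc)  # Equation 1
            for v in v_rtd_readings]
```

With the synthetic calibration readings from before (1.005 V and 2.005 V), an RTD reading of 1.005 V resolves to 1,000 Ω and therefore 0 °C in this hypothetical table.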