Choosing a temperature sensor for a specific application can feel overwhelming. Engineers have many options to weigh, and anyone unfamiliar with sensor calibration can quickly feel lost. Learning the details of the two most popular sensor types makes the pros and cons of each much easier to judge. So what are the main differences between thermistors and RTD sensors?
An RTD, or resistance temperature detector, measures temperature based on the predictable change in resistance of a metal element. Of all RTDs on the market, the PT100 is the most popular. Its element is made of platinum and has a resistance of 100 ohms at 0°C. These sensors are typically used to measure temperature in both industrial and laboratory processes, and they are widely known for their accuracy, stability, and high repeatability. In fact, the PT100 is considered one of the most accurate temperature sensors available.
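To illustrate how an RTD's resistance maps back to a temperature, here is a minimal Python sketch. It assumes the standard IEC 60751 temperature coefficient (about 0.00385 ohm/ohm/°C for platinum) and a simple linear approximation; real instrumentation typically applies the full Callendar-Van Dusen equation for better accuracy across wide ranges.

```python
# Minimal sketch: converting a PT100 resistance reading to temperature.
# Assumes the standard IEC 60751 coefficient and a linear approximation,
# which is adequate over modest temperature ranges.

R0 = 100.0       # PT100 resistance at 0 °C, in ohms
ALPHA = 0.00385  # standard temperature coefficient (IEC 60751)

def pt100_temperature(resistance_ohms: float) -> float:
    """Approximate temperature in °C from a PT100 resistance reading."""
    return (resistance_ohms / R0 - 1.0) / ALPHA

# Example: a reading of 138.5 ohms corresponds to roughly 100 °C.
print(pt100_temperature(138.5))  # ~100.0
```

Note how gradual the response is: a full 100°C swing moves the resistance by only about 38.5 ohms, which is part of why RTDs are prized for stability rather than sensitivity.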
Thermistors share similarities with RTDs, but they are different enough that the two should not be used interchangeably. A thermistor is constructed from sintered semiconductor materials that exhibit a large change in resistance in response to even small changes in temperature. Unlike an RTD, a thermistor's resistance changes non-linearly, and in the most common type, the NTC thermistor, resistance decreases as the temperature increases.
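Because the response is non-linear, converting a thermistor reading to temperature requires a different model than the RTD's. A common choice is the Beta equation, sketched below in Python; the R0, T0, and BETA values here are illustrative figures for a hypothetical 10 kΩ NTC part, and real values would come from the manufacturer's datasheet.

```python
import math

# Minimal sketch: the non-linear NTC thermistor response via the Beta
# equation: 1/T = 1/T0 + (1/B) * ln(R/R0), with T in kelvin.

R0 = 10_000.0  # resistance at the reference temperature, in ohms (assumed)
T0 = 298.15    # reference temperature (25 °C) in kelvin
BETA = 3950.0  # Beta constant, in kelvin (assumed; part-specific)

def ntc_temperature(resistance_ohms: float) -> float:
    """Approximate temperature in °C from an NTC thermistor resistance."""
    inv_t = 1.0 / T0 + math.log(resistance_ohms / R0) / BETA
    return 1.0 / inv_t - 273.15

# Resistance falls steeply as temperature rises:
print(ntc_temperature(10_000.0))  # 25.0 °C
print(ntc_temperature(5_000.0))   # roughly 41 °C
```

Halving the resistance corresponds to only about a 16°C rise, which shows just how sensitive these devices are compared to an RTD. The main reasons to choose a thermistor over an RTD are: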