
Development of temperature sensors

May 26, 2023

The perception of heat and cold is fundamental to the human experience, yet a reliable way to measure temperature eluded great minds for centuries. It is unclear whether the ancient Greeks or the Chinese first found a way to measure temperature, but it is well documented that the history of temperature sensors began in the Renaissance.

This article begins with the challenges of temperature measurement and then traces the history of temperature sensors from several angles (information drawn from an OMEGA Industrial Measurement white paper):




1. The challenge of measurement

Temperature indicates the amount of thermal energy contained in a system or object: the greater the energy, the higher the temperature. However, unlike physical properties such as mass and length, temperature is difficult to measure directly, so most measurements are indirect, inferring the temperature from some observable effect that heat produces in the object. Establishing a standard for such measurements has therefore been a long-standing challenge.

In 1664, Robert Hooke proposed using the freezing point of water as a reference point for temperature. Ole Rømer, for his part, saw the need for two fixed points, choosing Hooke's freezing point and the boiling point of water. How to measure temperatures between and beyond such points, however, remained a problem. In the early 19th century, Joseph Louis Gay-Lussac and other scientists studying the gas laws found that when a gas is heated at constant pressure, each 1 degree Celsius rise in temperature increases its volume by 1/267 of its volume at 0 °C (later refined to 1/273.15), leading to the concept of absolute zero at -273.15 °C.
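
To make that extrapolation concrete, here is a minimal Python sketch (not from the original article) of Gay-Lussac's relation: at constant pressure the volume grows linearly with Celsius temperature, and solving for the temperature at which the extrapolated volume vanishes yields absolute zero.

```python
# Gay-Lussac's result: at constant pressure, V(t) = V0 * (1 + t / 273.15),
# with t in degrees Celsius. Setting V(t) = 0 gives t = -273.15 C.
V0 = 1.0  # litres at 0 C; arbitrary reference volume

def volume(t_celsius: float) -> float:
    """Volume of an ideal gas at constant pressure, relative to V0 at 0 C."""
    return V0 * (1 + t_celsius / 273.15)

# Setting volume(t) = 0  =>  1 + t/273.15 = 0  =>  t = -273.15 C.
absolute_zero = -273.15
assert abs(volume(absolute_zero)) < 1e-12
print(volume(100.0))  # a 100 C rise grows the volume by 100/273.15 ~ 36.6%
```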


2. Observation of expansion: liquid and bimetal

Galileo is believed to have built a device that displayed temperature changes around 1592. In it, the expansion and contraction of air inside a container moved a column of water, whose height indicated the degree of heating or cooling. However, since the device was also sensitive to changes in air pressure, it could only be regarded as a novelty.

The thermometer as we know it today was invented in Italy in 1612 by Santorio Santorio, who sealed a liquid inside a glass tube and observed it move as it expanded.

Adding a scale to the tube makes changes easier to read, but such a system still lacked a precise unit. Daniel Gabriel Fahrenheit, who worked with Rømer, began producing thermometers using alcohol and later mercury as the liquid. Mercury was ideal because its response to temperature changes is linear over a wide range, but it is highly toxic and is now used less and less; alternative liquids have taken its place, and liquid-in-glass thermometers remain widely used.

The bimetal temperature sensor was invented in the late 19th century. It exploits the unequal expansion of two bonded metal strips: a temperature change makes the strip bend, and that motion can be used to actuate thermostats or dial gauges like those on oven grills. Such sensors are not very accurate, perhaps plus or minus two degrees, but they are very widely used because they are cheap.
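
As a rough illustration of the principle, the sketch below applies Timoshenko's classic bimetal formula in its simplest form (equal layer thicknesses and similar stiffness); the materials, dimensions, and coefficients are illustrative assumptions, not values from the article.

```python
# Simplified bimetal cantilever model: for two layers of equal thickness and
# similar stiffness, the curvature is k = 1.5 * d_alpha * dT / t, and a
# cantilever of length L deflects by roughly k * L**2 / 2 at its tip.
ALPHA_BRASS = 19e-6   # 1/K, thermal expansion coefficient (approximate)
ALPHA_STEEL = 12e-6   # 1/K

def tip_deflection_m(length_m: float, thickness_m: float, delta_t_k: float) -> float:
    """Approximate tip deflection of an equal-thickness bimetal cantilever."""
    curvature = 1.5 * (ALPHA_BRASS - ALPHA_STEEL) * delta_t_k / thickness_m
    return curvature * length_m**2 / 2

# A 50 mm strip, 1 mm thick, heated by 50 K bends by well under a millimetre,
# which is why thermostats use coils or snap discs to amplify the motion.
print(f"{tip_deflection_m(0.05, 1e-3, 50) * 1e3:.2f} mm")  # ~0.66 mm
```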


3. Thermoelectric effect

In the early 19th century, electricity was an exciting new field. Scientists discovered that different metals have different resistances and conductivities, and in 1821 Thomas Johann Seebeck discovered the thermoelectric effect: two dissimilar metals joined together, with the junctions held at different temperatures, produce a voltage. Humphry Davy demonstrated the correlation between a metal's resistivity and its temperature. Becquerel proposed using platinum-based thermocouples for temperature measurement, and a practical device was built by Leopoldo Nobili in 1829. Platinum is also used in resistance temperature detectors (RTDs), introduced by C.H. Meyers in 1932, which are among the most accurate temperature sensors available.
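
A minimal sketch (not from the article) of how these two sensor families are read out in practice: a thermocouple's Seebeck voltage is roughly linear over a narrow span, and a platinum RTD follows the Callendar-Van Dusen polynomial standardized in IEC 60751. The type-K sensitivity used here is an approximate textbook figure.

```python
# 1) Thermocouple: over a narrow span the Seebeck voltage is roughly linear,
#    V = S * (T_hot - T_cold); S ~ 41 uV/K is an approximate type-K figure.
# 2) Platinum RTD: Callendar-Van Dusen polynomial for 0 C <= T <= 850 C,
#    R(T) = R0 * (1 + A*T + B*T**2), with the IEC 60751 Pt100 coefficients.

S_TYPE_K = 41e-6                          # V/K, approximate type-K sensitivity
R0, A, B = 100.0, 3.9083e-3, -5.775e-7    # Pt100 coefficients (IEC 60751)

def thermocouple_voltage(t_hot_c: float, t_cold_c: float) -> float:
    """Approximate Seebeck voltage (volts) for a small temperature span."""
    return S_TYPE_K * (t_hot_c - t_cold_c)

def pt100_resistance(t_c: float) -> float:
    """Pt100 resistance (ohms) for temperatures at or above 0 C."""
    return R0 * (1 + A * t_c + B * t_c**2)

print(f"{thermocouple_voltage(100.0, 0.0) * 1e3:.2f} mV")   # ~4.10 mV
print(f"{pt100_resistance(100.0):.2f} ohm")                 # 138.51 ohm
```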

Wire-wound RTDs, however, are fragile, which makes them poorly suited to rough industrial applications. The 20th century also saw the invention of semiconductor temperature-measurement devices, which respond to temperature changes with high sensitivity but, until recently, lacked linearity.
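
One common way that nonlinearity is handled in software is the Steinhart-Hart equation, sketched below for an NTC thermistor; the coefficients are typical published values for a generic 10 kilo-ohm part and are assumptions for illustration, since every real part ships with its own constants.

```python
import math

# Steinhart-Hart equation, the standard linearization for an NTC thermistor's
# strongly nonlinear resistance-temperature curve:
#   1/T = A + B*ln(R) + C*ln(R)**3   (T in kelvin)
A, B, C = 1.129148e-3, 2.34125e-4, 8.76741e-8  # typical 10 kohm NTC values

def thermistor_temp_c(r_ohm: float) -> float:
    """Convert a measured thermistor resistance to degrees Celsius."""
    ln_r = math.log(r_ohm)
    inv_t = A + B * ln_r + C * ln_r**3   # 1/T in 1/K
    return 1.0 / inv_t - 273.15

print(f"{thermistor_temp_c(10_000):.1f} C")  # ~25.0 C at the nominal resistance
```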

4. Heat radiation

Very hot objects, such as molten metal, give off both heat and visible light. At lower temperatures, objects still radiate heat, but at longer wavelengths. The British astronomer William Herschel first showed, in 1800, that this invisible infrared light produces heat.

Working with his compatriot Macedonio Melloni, Nobili found a way to detect this radiant energy by connecting thermocouples in series to form a thermopile. This was closely followed by the bolometer, a radiant-heat detector invented by Samuel Langley in the United States in 1878. It used two platinum strips, one of them blackened, in a bridge arrangement; infrared radiation heating the blackened strip produced a measurable change in its electrical resistance. Such thermal detectors are sensitive to a wide range of infrared wavelengths.
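
The sketch below (not from the article) illustrates why the bridge arrangement matters: platinum's resistance rises almost linearly with temperature, and a bridge converts even a tiny radiant heating of the blackened strip into a measurable offset voltage. The supply voltage and resistance values are illustrative assumptions.

```python
# Platinum's resistance rises as R = R0 * (1 + alpha * dT), and a Wheatstone
# bridge turns that small change into an offset voltage for a galvanometer.
ALPHA_PT = 3.85e-3  # 1/K, temperature coefficient of platinum resistance

def bridge_output_v(v_supply: float, r_ref: float, dT: float) -> float:
    """Output of an initially balanced bridge when one arm heats by dT."""
    r_hot = r_ref * (1 + ALPHA_PT * dT)   # irradiated, blackened strip
    return v_supply * (r_hot / (r_hot + r_ref) - 0.5)

# A mere 0.01 K of radiant heating already gives a microvolt-scale signal.
print(f"{bridge_output_v(1.0, 100.0, 0.01) * 1e6:.1f} uV")  # ~9.6 uV
```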

In contrast, the quantum infrared detectors developed since the 1940s respond only to infrared light within a limited wavelength band. Today, inexpensive pyrometers are widely used, and as the price of thermal imaging cameras continues to fall, applications will become even more widespread.
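
As a simple model of what a total-radiation pyrometer does, the sketch below uses the Stefan-Boltzmann law to relate radiated power to temperature and then inverts it. Real instruments work over limited wavelength bands and are calibrated for emissivity; the emissivity and geometry here are illustrative assumptions.

```python
# Stefan-Boltzmann law: P = eps * sigma * A * T**4, so a measured radiant
# power can be inverted to recover the surface temperature.
SIGMA = 5.670374419e-8  # W / (m^2 K^4), Stefan-Boltzmann constant

def radiant_power_w(temp_k: float, area_m2: float, emissivity: float) -> float:
    """Total power radiated by a surface at temp_k."""
    return emissivity * SIGMA * area_m2 * temp_k**4

def infer_temp_k(power_w: float, area_m2: float, emissivity: float) -> float:
    """Invert the Stefan-Boltzmann law to recover the surface temperature."""
    return (power_w / (emissivity * SIGMA * area_m2)) ** 0.25

p = radiant_power_w(1800.0, 1e-4, 0.85)   # ~1 cm^2 of glowing-hot metal
print(f"{infer_temp_k(p, 1e-4, 0.85):.0f} K")  # recovers 1800 K
```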

5. Temperature standard

When Fahrenheit made his thermometers, he realized he needed a temperature scale. He set zero at the freezing point of a salt-water mixture, initially put the freezing point of pure water at 30 degrees (later adjusted to 32), and placed the boiling point of water 180 degrees above it. About two decades later, Anders Celsius proposed a 0-100 scale, and today's Celsius scale is named after him.

Later, William Thomson, better known as Lord Kelvin, saw the benefit of fixing one end of the scale at a physical limit and proposed taking absolute zero as the starting point while keeping Celsius-sized degrees. This led to the Kelvin scale used in science today.
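
The three scales discussed in this section are related by fixed offsets and a single ratio, as the short sketch below shows.

```python
# A Fahrenheit degree is 100/180 = 5/9 the size of a Celsius degree, and the
# kelvin uses Celsius-sized degrees counted from absolute zero.
def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32       # 0 C -> 32 F, 100 C -> 212 F

def c_to_k(c: float) -> float:
    return c + 273.15           # -273.15 C -> 0 K

print(c_to_f(100.0), c_to_k(0.0))  # 212.0 273.15
```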


