A Brief History of Temperature Measurement

Temperature measurement has been crucial in human history, science, medicine, and many other fields. What began with crude comparisons and simple expansion devices eventually produced the mercury-in-glass thermometer, and later today's sophisticated electronic thermocouples and infrared sensors. This article is a concise review of these developments; much detail is omitted, though some of the references explore it in depth. It is a fascinating journey spanning roughly two millennia, from the earliest attempts at measurement to the present day, in which thermodynamics plays an ever more important role in the science and technology of our world.

Early Attempts at Temperature Measurement

Ancient Civilizations’ Understanding of Temperature

Ancient civilizations understood temperature only qualitatively. The Egyptians, for example, distinguished hot from cold: they regarded the sun as the source of fire and warmth and considered metals and water to be cold. The Greeks likewise recognized the concepts of hot and cold and described temperature in relative terms such as "hotter" and "colder."

Initial Techniques and Tools Used

The earliest attempts at temperature measurement relied on simple tools and techniques. One method used the human body as a reference: people compared the temperature of an object to that of their own skin. This subjective approach varied greatly from person to person and was far from precise. Another early method used the expansion and contraction of materials to gauge temperature. The Chinese, for example, used the expansion of liquids in calibrated vessels to indicate temperature changes. These methods were crude, however, and did not evolve into a standardized temperature scale.

Galileo’s Thermoscope

Invention and Functioning of the Thermoscope

The Italian scientist and mathematician Galileo Galilei is credited with inventing the thermoscope in the late 16th century. The device exploited the expansion and contraction of a fluid with temperature: a glass bulb was fitted to a tube partly filled with water, and as the contents of the bulb warmed or cooled they expanded or contracted, raising or lowering the liquid level in the tube and making the temperature change visible. The thermoscope was a genuine advance in the much larger field of temperature measurement, providing a means of quantifying temperature that earlier methods lacked.

Impact and Limitations of Galileo’s Thermoscope

Galileo’s thermoscope significantly advanced temperature measurement, allowing for a more quantitative approach than previous methods. However, it still had limitations. The lack of a standardized scale made comparing readings between different thermoscopes and locations difficult. Additionally, the thermoscope was sensitive to changes in atmospheric pressure, which affected the accuracy of temperature measurements.

The Invention of the Liquid-in-Glass Thermometer

Role of Ferdinand II

In the mid-17th century, Ferdinand II, Grand Duke of Tuscany, made a significant contribution to temperature measurement by developing the liquid-in-glass thermometer. Ferdinand II used spirit of wine (ethanol) as the working liquid; its low freezing point gave it a broader measurement range than water. This invention marked a major step forward in the field and laid the foundation for future advances.

Dependency on Atmospheric Pressure

Liquid-in-glass thermometers did have some drawbacks, however. One of the most significant was that open designs were sensitive to changes in atmospheric pressure, which affected the accuracy of their readings. To address this, sealed thermometers were developed, greatly reducing the effect of pressure changes.

Development and Improvements

As the technology evolved, further improvements were made to the liquid-in-glass thermometer. Using mercury as the working liquid, for instance, significantly improved accuracy, measurement range, and ease of use. Another critical advance was marking a calibrated temperature scale on the glass tube, which standardized measurements: with scales reading the same temperature under the same conditions, readings could be compared between instruments and between locations.

Invention of the Mercury Thermometer

Daniel Fahrenheit’s Contribution

Daniel Gabriel Fahrenheit made a contribution to temperature measurement in the early 18th century that remains highly influential. Fahrenheit introduced the mercury-in-glass thermometer, which exploited mercury's uniform expansion and wide liquid range to measure temperature accurately. These properties allowed a larger measurement range and significantly improved accuracy. The instrument represented a major technological step that laid the foundation for standardized temperature measurement and enabled a further breakthrough shortly afterwards.

The Concept of Freezing and Boiling Points

The mercury thermometer also made it possible to create standardized temperature scales by defining stable reference points, such as water's freezing and boiling points. By assigning a value of 32°F to water's freezing point and 212°F to its boiling point, Fahrenheit created a consistent scale that allowed accurate, reproducible temperature measurements.
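The idea of fixing a scale by two reference points can be sketched in a few lines of code. The helper below is hypothetical, not anything Fahrenheit used: it derives the familiar Celsius-to-Fahrenheit mapping purely from the two water fixed points.

```python
def linear_scale(p1, p2):
    """Build a conversion function for a linear scale fixed by two points.

    p1, p2: (reference_temperature, reading_on_new_scale) pairs,
    e.g. water's freezing and boiling points in Celsius vs Fahrenheit.
    """
    (x1, y1), (x2, y2) = p1, p2
    slope = (y2 - y1) / (x2 - x1)  # degrees on the new scale per input degree
    return lambda x: y1 + slope * (x - x1)

# Fahrenheit's two water fixed points, expressed against the Celsius scale:
# freezing point (0 C -> 32 F) and boiling point (100 C -> 212 F).
c_to_f = linear_scale((0.0, 32.0), (100.0, 212.0))

print(c_to_f(37.0))  # human body temperature reference, 98.6 F
```

Any two distinct fixed points suffice; the choice only sets the zero and the size of the degree.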

Creation of the Celsius Scale

Anders Celsius and the Centigrade Scale

Swedish astronomer Anders Celsius developed the Celsius scale in the mid-18th century. He designed a centigrade scale, dividing the range between water's freezing and boiling points into 100 equal intervals. Its decimal division made it easier for most people to understand and compute with.

Adoption and Adaptation Over Time

The centigrade scale popularized by Celsius underwent further adaptation and was eventually renamed the "Celsius" scale in honour of its developer. It became the international standard for temperature measurement, particularly in scientific and academic circles, and its wide adoption brought greater consistency and comparability to temperature readings worldwide.

Comparative Overview of Temperature Scales

The Fahrenheit, Celsius, and Kelvin temperature scales can be compared directly at reference temperatures that are important to know. These reference points, drawn from both scientific and everyday life, illustrate the level of precision appropriate to each scale and the roles temperature measurement plays in science, industry, and daily life.

The table below compares the values of the Fahrenheit, Celsius, and Kelvin temperatures for some common reference temperatures.

Fahrenheit      Celsius        Kelvin        Significance of Temperature
9,944.45°F 5,506.92°C 5,780.07 K Black body temperature of the visible surface of the Sun
6,169.76°F 3,409.87°C 3,683.02 K Freezing point of tungsten
3,034.26°F 1,667.92°C 1,941.07 K Freezing point of titanium
1,984.32°F 1,084.62°C 1,357.77 K Standard freezing point of copper
1,947.53°F 1,064.18°C 1,337.33 K Standard freezing point of gold
1,763.20°F 961.78°C 1,234.93 K Standard freezing point of silver
1,220.58°F 660.32°C 933.47 K Standard freezing point of aluminum
787.15°F 419.53°C 692.68 K Standard freezing point of zinc
449.47°F 231.93°C 505.08 K Standard freezing point of tin
313.88°F 156.60°C 429.75 K Standard freezing point of indium
212°F 100°C 373.15 K Standard boiling point of water
136°F 57.78°C 330.93 K World record high air temperature
98.60°F 37°C 310.15 K Human body temperature reference
85.58°F 29.76°C 302.91 K Standard melting point of gallium
68°F 20°C 293.15 K Room temperature reference
39.15°F 3.97°C 277.12 K Temperature of maximum water density
32.02°F 0.01°C 273.16 K Triple point of water
32°F 0°C 273.15 K Standard freezing point of water
0°F -17.78°C 255.37 K Fahrenheit’s zero
-37.90°F -38.83°C 234.32 K Triple point of mercury
-128.56°F -89.20°C 183.95 K World record low air temperature
-308.82°F -189.34°C 83.81 K Triple point of argon
-361.82°F -218.79°C 54.36 K Triple point of molecular oxygen
-415.47°F -248.59°C 24.56 K Triple point of neon
-434.82°F -259.35°C 13.80 K Triple point of molecular hydrogen
-459.67°F -273.15°C 0 K Thermodynamic absolute zero

The formulas below give the relationships among Fahrenheit (tF), Celsius (tC), and Kelvin (TK) temperatures.

From Celsius:    tF = (tC×9/5)+32         TK = tC+273.15
From Fahrenheit: tC = (tF-32)×5/9         TK = (tF+459.67)×5/9
From Kelvin:     tF = (TK×9/5)-459.67     tC = TK-273.15

The Kelvin Scale and Absolute Zero

Lord Kelvin and the Need for an Absolute Scale

William Thomson, later Lord Kelvin, recognized the need for an absolute temperature scale in the mid-19th century. His research led him to the idea of a point at which a system's energy is minimal and molecular motion effectively ceases ("absolute zero"), and he built an absolute temperature scale partly on this concept.

Understanding Absolute Zero

Absolute zero itself, 0 kelvin, represents the lowest possible temperature. At this point molecular motion reaches its minimum, and no further thermal energy can be extracted from the system. Because the Kelvin scale expresses temperature uniformly and unambiguously, it is critical in scientific and research fields where precise temperature calculations are necessary.

Impact on Scientific and Research Fields

The Kelvin scale and the concept of absolute zero have impacted varying scientific and research fields. The ability to measure temperature on an absolute scale has allowed for more precise calculations and predictions in physics, chemistry, engineering, and other disciplines. It has provided scientists with a fundamental reference point for understanding the behaviour of matter and energy at extremely low temperatures.

Modern Thermometers and Thermostats

Development of Bi-Metallic Strip Thermometers

The development of bi-metallic strip thermometers in the late 19th century transformed temperature measurement. A bi-metallic strip comprises two bonded metals with different thermal expansion coefficients. As the temperature changes, the metals expand or contract at different rates, causing the strip to bend; this bending motion provides a visual indication of the temperature change.
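The differential expansion that drives the bending can be sketched numerically. The expansion coefficients below are typical handbook values for brass and steel, assumed for illustration rather than taken from any particular instrument.

```python
# Differential thermal expansion of the two layers in a bi-metallic strip.
# Linear expansion: delta_L = alpha * L * delta_T.
# Coefficients are typical handbook values (an assumption, not exact).
ALPHA_BRASS = 19e-6   # per kelvin
ALPHA_STEEL = 12e-6   # per kelvin

def expansion_mm(alpha, length_mm, delta_t):
    """Length change of a layer of the given length for a temperature change."""
    return alpha * length_mm * delta_t

length = 50.0   # mm
dT = 100.0      # kelvin
mismatch = expansion_mm(ALPHA_BRASS, length, dT) - expansion_mm(ALPHA_STEEL, length, dT)
print(f"length mismatch: {mismatch:.4f} mm")  # this mismatch is what bends the strip
```

Because the two layers are bonded, the brass layer cannot slide past the steel; the 0.035 mm mismatch over 50 mm is instead taken up by curvature, which a pointer or dial magnifies into a readable deflection.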

Invention and Use of Electronic Thermometers

The invention of electronic thermometers in the mid-20th century marked a significant leap forward in temperature measurement. Electronic thermometers use solid-state temperature sensors, such as thermistors or thermocouples, to detect temperature changes. These sensors convert temperature changes into electrical signals, which can be more readily measured and displayed digitally. The result is enhanced accuracy, speed, and precision compared to traditional mercury or liquid-in-glass thermometers.
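As an illustration of how a sensor signal becomes a temperature, the sketch below converts a thermistor resistance reading using the standard Beta-parameter equation, 1/T = 1/T0 + (1/B)·ln(R/R0). The values of R0, T0, and B are typical datasheet figures assumed for the example, not those of any particular sensor.

```python
import math

# Convert an NTC thermistor resistance to temperature with the Beta
# (B-parameter) equation. Constants are typical datasheet values (assumed).
R0 = 10_000.0   # ohms at reference temperature T0
T0 = 298.15     # kelvin (25 C)
B = 3950.0      # Beta constant, kelvin

def thermistor_kelvin(r_ohms):
    """Temperature in kelvin implied by a measured thermistor resistance."""
    inv_t = 1.0 / T0 + math.log(r_ohms / R0) / B
    return 1.0 / inv_t

print(thermistor_kelvin(10_000.0) - 273.15)  # at R0 the reading is 25 C
```

In a real electronic thermometer the resistance itself is obtained from a voltage-divider measurement, and the firmware applies a calibration like this one before displaying the result.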

Transition to Digital Temperature Reading

Advancements in electronic technology have enabled digital temperature readings and remote monitoring systems. A digital thermometer gives an instantaneous, easy-to-read measurement, with no risk of mercury exposure or broken glass. Adding wireless capability permits remote temperature monitoring, which is useful in laboratories, industrial manufacturing and storage, food service, and medical settings where a patient must be kept under constant observation.

Infrared Thermometers and Non-Contact Temperature Measurement

Evolution and Principle of IR Thermometers

Infrared (IR) thermometers represent a significant innovation in temperature measurement and are used in a wide range of applications. An IR thermometer works by detecting the infrared radiation an object emits to determine its surface temperature: a lens focuses the IR radiation onto a detector, which converts it into an electrical signal that the instrument's electronics then turn into a temperature reading. This allows non-contact temperature measurement, offering convenience, safety, and ease of use in many applications.
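The radiation-to-temperature step can be sketched with the Stefan-Boltzmann law, P = ε·σ·A·T⁴. This whole-spectrum model is a deliberate simplification: real instruments measure over a limited wavelength band and apply more involved calibration, so the numbers here are illustrative only.

```python
# Simplified IR-thermometer readout: infer surface temperature from total
# radiated power via the Stefan-Boltzmann law, P = emissivity * sigma * A * T^4.
# Whole-spectrum model, assumed emissivity -- illustrative, not an instrument spec.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def surface_temp_kelvin(power_w, area_m2, emissivity=0.95):
    """Surface temperature implied by the measured radiated power."""
    return (power_w / (emissivity * SIGMA * area_m2)) ** 0.25

# A 1 cm^2 patch radiating about 46 mW at emissivity 0.95 sits near room temperature.
print(surface_temp_kelvin(0.046, 1e-4))
```

The fourth-power dependence is why small errors in the assumed emissivity translate into only modest errors in the inferred temperature.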

Use cases: Industrial and Medical Applications

IR thermometers have found extensive use in industrial and medical settings. In industrial applications, they enable non-contact temperature measurement of hot, moving objects located in hazardous environments. IR thermometers allow for efficient and accurate monitoring of equipment, machinery, and processes without physical contact.

In the medical field, IR thermometers provide non-invasive temperature measurement, eliminating the discomfort associated with traditional methods such as oral or rectal thermometers. They enable quick and hygienic temperature readings, making them suitable for hospitals, clinics, and home healthcare.

Advances in Micro and Nanoscale Temperature Measurement

Impact of Nanotechnology on Temperature Measurement

The field of nanotechnology has brought revolutionary advances to temperature measurement at the micro- and nanoscale. By exploiting nanoscale materials and structures, researchers have developed highly sensitive sensors capable of detecting minute temperature changes. These nanosensors allow precise temperature measurements in small, confined spaces, opening new possibilities in microelectronics, biotechnology, and materials science.

Use in Medicine and Industry

Micro and nanoscale temperature measurement techniques have found applications in medicine and industry. In medicine, they enable precise temperature monitoring at the cellular and subcellular level, facilitating advancements in diagnostics, drug delivery, and tissue engineering. In industry, these techniques play a crucial role in controlling and optimizing manufacturing processes, ensuring the quality and performance of products, and enhancing energy efficiency.

The Future of Temperature Measurement

Predicted Advancements in Technology

In the near future, temperature measurement is expected to be shaped by ongoing technological development and rising demands from many industries. Predicted advances include further development of wireless and remote temperature sensors, continued miniaturization of sensors for wearable devices, and the incorporation of artificial intelligence for better data analysis and prediction.

Potential Impact on Science, Industry, and Daily Life

The future of temperature measurement could dramatically alter scientific research, industrial processes, and our everyday lives. More accurate and precise measurement methods will improve our ability to understand and control ever more complex systems, leading to significant advances in materials science, energy, and healthcare. Integrating temperature sensors into a wide range of formats and devices will also bring greater energy efficiency, safety, and convenience to applications from automobiles to home automation.

In Summary

From the rudimentary attempts of ancient civilizations to today's state of the art, temperature measurement has come a long way. From Galileo's thermoscope to the mercury thermometer, from the Celsius and Kelvin scales to infrared thermometers that measure thousands of degrees without contact and sensors that work at the nanoscale, each advance has opened up scientific research, higher-temperature processes, and device applications that let us measure temperature in ways Galileo could never have imagined. The future promises no less: ubiquitous wireless temperature sensors, and machine learning to give them analytical intelligence, suggest that in their own way our inventive ancestors' visions may yet come to pass.
