{{PageConstructionNotice}}
Intuitively, we have an inkling of what heat is. It's whatever goes into a cuppa coffee that makes it hot. It's what an ice cube lacks. It's the "stuff" that leaks out of the cuppa to cool it off, and into the ice cube to make it melt.
We have all probably also learned that heat has "energy", that not-exactly-substance quantity that can't be generated <i>ex nihilo</i> nor gotten rid of, but can be moved around and is both useful and necessary for getting stuff done.
So what separates heat and its energy from other stuff that has energy, like electricity or light or compressed springs?
Technically, in the science of thermodynamics, heat is defined as a flow of entropy along with its associated energy. In this sense, the coffee in the cup doesn't have heat as such – it has energy and entropy. But it can transfer that energy and entropy via a flow of heat to its environment.
Like energy, entropy can't be gotten rid of, but it can be moved somewhere else. Unlike energy, you can make more entropy. In fact, this tends to happen spontaneously whenever you try to do things. So you end up with this stuff – entropy – building up in your system. If too much builds up, your system will break, so you need to get rid of it. In order to get rid of it, you need to move it as heat. And heat unavoidably also takes with it some of that energy that you would prefer to use for doing other things. For many applications, you want to figure out how to move as much entropy away as you can while giving up as little energy as possible.
For other systems, some of that entropy might actually be desired – melting steel or driving chemical reactions, for example. And then the job of a thermal management engineer might be figuring out how to manage the flow of energy and entropy into the system.
On this page, we will explore heat: how to move it around, and how to manage flows of energy and entropy.
=Heat=
==Energy==
The most basic definition of energy is the "ability to do work", where work is defined as applying a force over a distance. When you delve deeper into physics, the concept of energy becomes ever more fundamental – and trippy. Energy ends up being fundamentally intertwined with the flow of time, and the mere fact that the laws of physics stay the same over time means that energy is conserved; you can never make it anew nor can you make it go away, but you can change it from one form into another and move it around. Energy also ends up distorting space and time around it, mass turns out to be just one form of energy, and other bizarre ideas follow.
But for the purposes of this article, the "ability to do work" is what we need, as well as the concept of conservation of energy. We need energy to do the stuff we want to do ("do work"), and we can't just make it up on the spot. So we either need to collect it from someplace else or [[Energy_Storage|bring it with us]] (as nuclear fuel, for example).
==Entropy==
Entropy is sometimes described as the amount of "disorder" in a system, which is a useful if not exact way of looking at it. Fundamentally, entropy comes about because we see only large-scale averaged properties of things, and there are a very large number of ways of arranging all the individual microscopic particles that can give the same macroscopic observed properties. The more ways there are that you can put together all the atoms and electrons and quantum field fluctuations to give the same observed macroscopic state, the higher the entropy of that state.
For example, consider a million gas atoms in a box. You can measure the pressure they exert on the sides of the box, you know how many atoms are there, and you know their total energy. But you don't know exactly where each atom is, nor how that energy is distributed among the atoms (at any given time, one specific atom may have more energy than another specific atom, for example). The bigger the box, the more possible positions there are. The more total energy, the more ways the atoms have of distributing that energy. And if you add more atoms, you have even more ways of arranging things. All of these lead to an increase of entropy.
One of the things you can show from thermodynamics is that if you have entropy, you can't just make it go away. The best you can do is shuffle it off someplace else, where hopefully it won't get in the way of what you are trying to do. However, you can make entropy from nothing! If you merge two macroscopic things, or let them come into contact and exchange material or energy or other macroscopic stuff, you increase the possible number of ways that the stuff can be arranged between them compared to when the things were isolated, and their total entropy goes up.
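To make the "number of ways" idea concrete, here is a minimal Python sketch (a toy model with made-up numbers, not anything rigorous) that counts the ways of distributing indivisible energy quanta among a handful of atoms and shows that bringing two isolated systems into contact increases the total entropy.
<syntaxhighlight lang="python">
from math import comb, log

def multiplicity(quanta, atoms):
    """Number of ways to distribute `quanta` indistinguishable energy
    quanta among `atoms` distinguishable atoms (stars-and-bars count)."""
    return comb(quanta + atoms - 1, atoms - 1)

def entropy(quanta, atoms, k_B=1.380649e-23):
    """Boltzmann entropy S = k_B * ln(number of microstates), in J/K."""
    return k_B * log(multiplicity(quanta, atoms))

# Two small, isolated toy systems, each with a fixed share of the energy.
S_a = entropy(quanta=30, atoms=20)
S_b = entropy(quanta=10, atoms=20)

# Bring them into contact: the 40 quanta can now spread over all 40 atoms.
S_combined = entropy(quanta=40, atoms=40)

print(f"separate: {S_a + S_b:.3e} J/K, combined: {S_combined:.3e} J/K")
# The combined entropy is larger - letting the systems exchange energy
# creates entropy, even though no energy was added.
</syntaxhighlight>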
=Heat transfer and temperature=
So if you have two systems and you bring them into contact, you get an exchange of energy and entropy between them. On a microscopic scale, the particles bump into each other, allowing them to exchange energy and re-distribute the "disorder" or "randomness" (entropy) of the system.
But which way does the net flow of energy and entropy occur? We define the temperature based on the direction of heat flow – energy and entropy always flow from high temperature to low temperature. If there is no net flow of heat, the temperatures of the two systems are equal and they are said to be in thermal equilibrium.
If you go and do the physics with this definition, it turns out that the temperature will be proportional to the average energy of each way a microscopic particle in the system can continuously store energy. For example, in a gas of free atoms, the atoms can move independently along each of the three Cartesian axes, so the average kinetic energy of a gas atom works out to three times the proportionality constant times the temperature – each independent direction of motion holds, on average, <math>\tfrac{1}{2} k_B T</math>, where <math>k_B</math> is Boltzmann's constant. If the gas is made up of molecules that can rotate, the independent ways of rotating also hold the same amount of energy; if the atoms in the molecules can vibrate back and forth, these can also take up energy. Note that this only works if you measure temperature in absolute units such as kelvins. Degrees Celsius or Fahrenheit will not give correct results!
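As a quick numeric illustration of the paragraph above, here is a hedged Python sketch (the gas choice and temperature are just illustrative) that uses this result to estimate the average kinetic energy and typical speed of a gas atom at room temperature.
<syntaxhighlight lang="python">
from math import sqrt

k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # absolute temperature in kelvins (not Celsius!)
m_helium = 6.646e-27    # mass of a helium atom, kg

# Three independent directions of motion, each holding (1/2) k_B T on average.
avg_kinetic_energy = 1.5 * k_B * T                  # joules per atom
v_rms = sqrt(2.0 * avg_kinetic_energy / m_helium)   # typical (rms) speed, m/s

print(f"average kinetic energy per atom: {avg_kinetic_energy:.2e} J")
print(f"rms speed of helium at 300 K: {v_rms:.0f} m/s")   # roughly 1370 m/s
</syntaxhighlight>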
If heat flows into or out of a system at (approximately) constant temperature, the energy <math>Q</math> and entropy <math>S</math> of the heat are related to the temperature <math>T</math> by
<div class="center" style="width: auto; margin-left: auto; margin-right: auto;"><math> Q = T \, S </math>.</div>
So to move a given amount of entropy out of a system, the lower the temperature at which the heat is rejected, the less energy has to be carried away with it – equivalently, heat at a higher temperature carries less entropy per joule. Also note that if a given amount of heat energy flows from a system at one temperature to a system at a lower temperature, the entropy will increase, as it must.
Energy that flows into a system that is not associated with heat flow can be used to do work on that system.
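As a concrete illustration of <math>Q = T \, S</math>, here is a minimal Python sketch (illustrative numbers only) comparing how much energy must accompany a fixed amount of entropy at two different temperatures, and how much entropy rides along with a fixed amount of heat.
<syntaxhighlight lang="python">
S = 1.0   # entropy to be carried away by a flow of heat, in J/K (made-up value)

# Energy that must leave along with that entropy: Q = T * S.
for T in (300.0, 1000.0):                     # rejection temperature, kelvins
    print(f"rejecting {S} J/K at {T:.0f} K carries away {T * S:.0f} J")

Q = 1000.0   # a fixed parcel of heat energy, in joules
# Entropy carried by that heat: S = Q / T.
for T in (300.0, 1000.0):
    print(f"{Q:.0f} J of heat at {T:.0f} K carries {Q / T:.2f} J/K of entropy")
</syntaxhighlight>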
==Processes==
===Adiabatic (no heat flow, isentropic)===
If you have a well-insulated system that you do work on, or if you just do the work fast enough that very little heat has a chance to escape, you approach the limit of zero heat transfer – this is called adiabatic (which I personally think is one of the most awesome obscure words in the English language). If you do adiabatic work on a gas to compress it, for example, energy will flow into the system. Where is this energy stored? You have the same number of gas molecules, but more energy in them. So each molecule must have more energy on average. In other words, the temperature of the gas has gone up, even though no heat has flowed into the system! When you allow the gas to expand adiabatically, it will do work on the surrounding environment and its temperature will drop again, even though there is still no heat flow into or out of the gas. This is why in thermodynamics we don't say that a system "has" heat, only that heat can flow between two systems. The total energy of a system can come from a combination of heat flow and work, and either can suffice to bring the system to the same state.
Once you know about adiabatic expansion and contraction, it explains a lot of stuff. Ever wonder why the high mountains are colder than down on the plains? The part of Earth's atmosphere where weather occurs is well mixed, with turbulent plumes of sun-heated gas rising up as far as the stratosphere (where weather stops) and cooler currents sinking again. Heat transfer between air masses is slow compared to the motion of the air, so this mixing is adiabatic. As a plume of air rises, there is less weight on it from all the air above it, so it undergoes adiabatic expansion and its temperature drops. Plunging air masses are likewise adiabatically compressed, and thus gain temperature as they descend. So the higher up you go, the colder it gets.
Or suppose you want to start a fire, and all you have is a long piston. If you put a bit of tinder in the piston, seal it up with the plunger, and rapidly push the plunger down, the air can get so hot it ignites the tinder. In addition to starting campfires, this is how Diesel engines work – air in the cylinder is compressed until it is so hot that fuel squirted into it spontaneously ignites. No spark plug needed!
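A minimal Python sketch of this effect (the compression ratio and starting conditions are illustrative, and it assumes air behaves as an ideal diatomic gas):
<syntaxhighlight lang="python">
gamma = 1.4                # ratio of specific heats for air (diatomic ideal gas)
T_start = 300.0            # starting air temperature, K (about room temperature)
compression_ratio = 20.0   # V_initial / V_final, typical of a Diesel engine

# For a reversible adiabatic compression of an ideal gas:
#   T_final = T_start * (V_initial / V_final) ** (gamma - 1)
T_final = T_start * compression_ratio ** (gamma - 1)
print(f"air temperature after compression: {T_final:.0f} K")   # roughly 990 K
# Hot enough to ignite Diesel fuel or a scrap of tinder - no heat was added,
# only work.
</syntaxhighlight>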
Because there is no heat flow during an adiabatic process, its entropy does not change (provided the process is done reversibly). So adiabatic processes are sometimes called isentropic processes. In a quantum system, the term adiabatic is extended to mean that the system evolves continuously along with its current quantum state, without any transitions to other states.
===Isothermal===
An isothermal process is in some ways the opposite of an adiabatic process. Heat flow occurs so rapidly between the system and a very large external system (called a "bath" or "heat sink") that the system you are working on keeps a nearly constant temperature.
===Isobaric (constant pressure)===
In an isobaric process, the pressure on the system is kept nearly the same throughout the process. The work done on a system is the force exerted on it times the distance over which the force is exerted, which is equivalent to the pressure on the system times its change in volume. So for a system kept at constant pressure, it is easy to find the work done on the system: just multiply that pressure by the change in volume of the system.
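A one-line worked example may help; this Python sketch (with made-up volumes) computes the work done on a gas compressed at constant atmospheric pressure.
<syntaxhighlight lang="python">
P = 101_325.0          # constant pressure on the system, Pa (1 atmosphere)
V_initial = 0.010      # initial volume, m^3 (made-up value)
V_final = 0.004        # final volume after compression, m^3

# Work done ON the system at constant pressure: W = -P * (change in volume).
# Sign convention: compressing the gas (volume decreases) does positive
# work on it; letting it expand means it does work on its surroundings.
W_on_system = -P * (V_final - V_initial)
print(f"work done on the gas: {W_on_system:.0f} J")   # about 608 J
</syntaxhighlight>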
===Constant volume===
A system kept at constant volume can do no pressure–volume work, nor have any done on it. So (neglecting other kinds of work, such as stirring or electrical work) all energy flows, in or out, must be as heat.
==Methods of heat flow==
===Conduction===
When an atom with a lot of energy bumps into an atom with less energy, on average the more energetic atom will lose energy in the collision and the less energetic atom will gain energy. When this happens on a large scale, you get a net flow of heat from the hot atoms bumping into the cold atoms and making the cold atoms move around more. This process is called conduction.
For a material with a temperature difference <math>\Delta T</math> across it, with cross sectional area <math>A</math>, length <math>L</math>, and thermal conductivity <math>k</math>, the amount of heat flow <math>Q</math> in a given time <math>t</math> is
<div class="center" style="width: auto; margin-left: auto; margin-right: auto;"><math>
Q = - \frac{k \, A \, t \, \Delta T}{L}.
</math></div>
The minus sign just says that heat flows from hot to cold: if the temperature rises along the direction in which you measure <math>\Delta T</math>, the heat flows the other way.
This can be generalized to continuous materials with continuous changes across them by defining a <i>heat flux</i> <math>\vec{q}</math> as the heat flow per unit area per unit time in the direction of the flow, and the <i>temperature gradient</i> <math>\vec{\nabla} T</math> which is the rate at which temperature changes with distance in the direction of the fastest change.
<div class="center" style="width: auto; margin-left: auto; margin-right: auto;"><math>
\vec{q} = - k \, \vec{\nabla} T
</math></div>
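For a feel of the numbers, here is a minimal Python sketch (the dimensions and conductivity value are only illustrative) applying the slab formula above to heat leaking through a single window pane.
<syntaxhighlight lang="python">
k = 1.0             # thermal conductivity of glass, W/(m K) (rough value)
A = 1.5             # area of the pane, m^2
L = 0.004           # thickness of the pane, m
T_inside = 293.0    # K
T_outside = 273.0   # K

# Heat flow rate through the slab, from hot side to cold side:
# P = k * A * (T_hot - T_cold) / L, in watts (joules per second).
P = k * A * (T_inside - T_outside) / L
print(f"heat conducted through the pane: {P:.0f} W")   # about 7500 W

# Heat moved over one hour:
Q = P * 3600.0
print(f"energy lost in an hour: {Q/1e6:.1f} MJ")       # about 27 MJ
</syntaxhighlight>
In practice the still air films clinging to either side of the glass conduct far worse than the glass itself, so a real window loses much less than this – but the slab formula is how you would estimate each layer.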
===Convection===
If you have a substance with a high temperature and physically move it out of your system, you will have a flow of heat out of your system. If the high temperature substance enters your system, heat will flow in. Transfer of heat by moving stuff around is known as convection. Because convection usually happens with fluid flows, it is messy and complicated and not terribly easy to describe, with many special cases, strong dependence on geometry, and dependence on whether the flow is laminar or turbulent. When it happens with flows of granular materials, it gets even more complicated. In this review, we'll content ourselves with noting that it happens, although you can usually approximate the rate of heat flow as proportional to a temperature difference.
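That proportionality is the usual engineering shortcut, often written as Newton's law of cooling. A minimal Python sketch (the heat-transfer coefficient here is just an assumed, illustrative value) looks like this:
<syntaxhighlight lang="python">
h = 10.0            # convective heat-transfer coefficient, W/(m^2 K); assumed
A = 2.0             # surface area exposed to the moving fluid, m^2
T_surface = 350.0   # K
T_fluid = 300.0     # K

# Newton's law of cooling: heat flow rate is proportional to the
# temperature difference between the surface and the fluid.
P = h * A * (T_surface - T_fluid)
print(f"convective heat loss: {P:.0f} W")   # 1000 W

# All of the messy fluid dynamics is hidden inside h, which in practice
# comes from correlations, tables, or experiment.
</syntaxhighlight>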
===Radiation===
The various [[Lasers_and_the_electromagnetic_spectrum|electromagnetic waves]] in the ambient environment (such as visible light, infrared light, microwaves, or x-rays) can interact with atoms, and thus you can transfer heat between the electromagnetic field and a system of atoms and other things. So hot things can radiate their heat away as electromagnetic waves, and cold things can be heated up by the ambient radiation around them. You have likely experienced this – go outside on a sunny day and you will feel the heat from the radiation of the approximately 6000 K temperature sun.
When a system is in thermal equilibrium with the electromagnetic field, it is said to behave as a black body. It gets this name because an object that absorbs all light that hits it will be in equilibrium with the electromagnetic field. A black body radiates across a wide range of wavelengths, but the peak wavelength will be at
<div class="center" style="width: auto; margin-left: auto; margin-right: auto;"><math>
\lambda_{\rm peak} = \frac{2898 \, \mu{\rm m \, K}}{T}.
</math></div>
So as temperature goes up the radiation is primarily emitted at shorter and shorter wavelengths – infrared at room temperature, but shifting to dull red as temperature increases, then orange, yellow, white, and blue-white.
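A quick Python sketch of the displacement law above (the temperatures are chosen just for illustration):
<syntaxhighlight lang="python">
def peak_wavelength_um(T_kelvin):
    """Wien's displacement law: peak wavelength in micrometres for a
    black body at absolute temperature T (in kelvins)."""
    return 2898.0 / T_kelvin

for T in (300.0, 1000.0, 6000.0):   # room temperature, red-hot metal, sun-like
    print(f"{T:>6.0f} K  ->  peak at {peak_wavelength_um(T):5.2f} um")
# 300 K peaks near 10 um (thermal infrared); 6000 K peaks near 0.5 um (visible).
</syntaxhighlight>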
A black body at a temperature <math>T</math> will radiate at an intensity of
<div class="center" style="width: auto; margin-left: auto; margin-right: auto;"><math>
I = \sigma_{SB} \, T^4
</math></div>
where <math>\sigma_{SB} = 5.670373 \times 10^{-8}</math> W/m<sup>2</sup>/K<sup>4</sup> is the Stefan–Boltzmann constant. That fourth power of temperature in there means that the power radiated per unit area goes up <i>very</i> quickly with increasing temperature – doubling the temperature gives sixteen times as much radiation!
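And a matching Python sketch for the Stefan–Boltzmann law, here used to size a perfectly black radiating panel (the waste-heat figure is a made-up example):
<syntaxhighlight lang="python">
sigma_SB = 5.670e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)

def blackbody_intensity(T_kelvin):
    """Power radiated per unit area by a black body at temperature T."""
    return sigma_SB * T_kelvin**4

waste_heat = 100_000.0   # W of waste heat to get rid of (illustrative)

for T in (300.0, 600.0, 1200.0):
    intensity = blackbody_intensity(T)   # W per square metre
    area = waste_heat / intensity        # radiating area needed
    print(f"{T:>6.0f} K radiator: {intensity:8.0f} W/m^2, needs {area:7.2f} m^2")
# Doubling the radiator temperature cuts the required area by a factor of 16.
</syntaxhighlight>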
Objects that are not black will radiate at a reduced power level. In general, at a specific frequency of electromagnetic radiation <math>\omega</math>, an object will reflect away a fraction <math>r(\omega)</math>, transmit through a fraction <math>t(\omega)</math>, and absorb a fraction <math>a(\omega)</math>. Together, these must all add up to one
<div class="center" style="width: auto; margin-left: auto; margin-right: auto;"><math>
r(\omega) + t(\omega) + a(\omega) = 1
</math></div>
and individually they are all between 0 and 1. The quantity <math>a(\omega)</math> is called the absorptivity, <math>r(\omega)</math> the reflectivity, and <math>t(\omega)</math> the transmissivity.
For a black body, <math>a(\omega)=1</math> for all <math>\omega</math>. If this is not the case, the power radiated away at that frequency is decreased by a factor of <math>a(\omega)</math>. If the absorptivity is roughly the same over the spectral range where the object is radiating, its radiated power is overall decreased by a factor of <math>a</math>.
=Thermal properties=
==Enthalpy==
Enthalpy is the sum of a system's internal energy and the product of its pressure and volume. When we consider phases of matter (solid, liquid, gas, et cetera) as thermodynamic systems, we find that transitioning from one phase to another requires adding energy to or removing energy from the material. This energy is referred to as the enthalpy of that particular phase transition (e.g. the enthalpy of vaporization, for boiling).
When we consider the process in reverse (e.g. the enthalpy of condensation), the enthalpy is equal and opposite. These enthalpies also change with ambient conditions such as temperature and pressure; we often measure enthalpies at what's called "standard temperature and pressure" (STP). Of course, there are differing standards. In the United States, one common standard comes from the National Institute of Standards and Technology (NIST), which specifies STP to be 293.15 K (20 °C) and 1 (Earth) atmosphere of pressure. Other common terms for these enthalpies are "heat of ..." or "latent heat of ...".
Enthalpy can be measured per unit volume, per unit mass, or per amount of substance; the corresponding units are joules per cubic meter (J/m<sup>3</sup>), joules per kilogram (J/kg) – though tabulated values are often given in J/g – or joules per mole (J/mol).
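As a worked example of using these numbers (the latent-heat value for water is the commonly quoted one, rounded; the mass is illustrative):
<syntaxhighlight lang="python">
latent_heat_vaporization = 2.26e6   # J/kg, enthalpy of vaporization of water
                                    # near 100 C and 1 atm (rounded)
mass = 0.5                          # kg of water to boil away (illustrative)

# Energy that must be supplied to turn the liquid into vapour at the boiling
# point (on top of the energy needed to heat the water to 100 C first).
Q = mass * latent_heat_vaporization
print(f"boiling off {mass} kg of water takes about {Q/1e6:.2f} MJ")   # ~1.13 MJ
</syntaxhighlight>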
==Heat capacity==
Heat capacity is the amount of heat that must be added to or removed from a material to change its temperature by a given amount. The SI unit for measuring heat capacity is the joule per kelvin (J/K). When we consider it per unit volume, mass, or amount of substance, it is measured using these units (a short worked sketch follows the list):
* volumetric heat capacity (SI: J/K/m<sup>3</sup>)
* specific heat [capacity] (SI: J/K/kg)
* molar heat capacity (SI: J/K/mol)
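A minimal Python sketch using the specific heat capacity of water (the commonly quoted value, with an illustrative mass and temperature change):
<syntaxhighlight lang="python">
c_water = 4186.0   # specific heat capacity of liquid water, J/(kg K) (approx.)
mass = 1.0         # kg of water (illustrative)
dT = 80.0          # temperature rise, K (e.g. from 20 C to 100 C)

# Heat needed: Q = mass * specific heat * temperature change.
Q = mass * c_water * dT
print(f"heating {mass} kg of water by {dT:.0f} K takes about {Q/1e3:.0f} kJ")
# ~335 kJ - still much less than the ~2260 kJ needed to then boil that same
# kilogram away (see the enthalpy example above).
</syntaxhighlight>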
=Heat engines=
'''[[Heat Engines]]'''
=Onwards to practice=
Nothing is perfectly efficient, not even thermal devices that operate on heat. The apparent exceptions are devices meant to maximize heat generation, like resistive heating (as with home radiators), and devices that merely move heat around, like air conditioners and heat pumps, which can deliver more heat than the work they consume and so exceed 100% "efficiency". From an engineering perspective, device inefficiencies result in heat generation. To avoid breaking those devices, as we mentioned in the introduction, a variety of heat management methods (such as spacecraft radiators, heat sinks, and heat pumps) are discussed in the following article:
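To make the "more than 100%" point concrete, here is a hedged Python sketch of the ideal (Carnot) limit on a heat pump's coefficient of performance; real machines fall well short of this limit, and the temperatures are just illustrative.
<syntaxhighlight lang="python">
def carnot_cop_heating(T_hot, T_cold):
    """Ideal coefficient of performance of a heat pump delivering heat at
    T_hot while drawing it from a reservoir at T_cold (temperatures in K).
    COP = heat delivered / work consumed."""
    return T_hot / (T_hot - T_cold)

T_indoors = 293.0    # K (20 C)
T_outdoors = 273.0   # K (0 C)

cop = carnot_cop_heating(T_indoors, T_outdoors)
print(f"ideal heating COP: {cop:.1f}")   # about 14.7
# Even a real heat pump managing a COP of 3 delivers 3 J of heat per 1 J of
# electricity - "efficiency" above 100% is possible because the device is
# moving heat around rather than creating it.
</syntaxhighlight>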
==Heat Management==
'''[[Heat Management]]'''
=Additional reading=
=References=
=Credit=
Authors: Luke Campbell, Rocketman1999, and Tshhmon
[[Category:Physics & Engineering]]