The four laws of thermodynamics define fundamental physical quantities (temperature, energy, and entropy) that characterize thermodynamic systems. The laws describe how these quantities behave under various circumstances, and forbid certain phenomena (such as perpetual motion).
The four laws of thermodynamics are:^{[1]}^{[2]}^{[3]}^{[4]}^{[5]}^{[6]}

Zeroth law: If two systems are each in thermal equilibrium with a third system, they are in thermal equilibrium with each other. This law helps define the notion of temperature.

First law: Heat and work are forms of energy transfer. The internal energy of an isolated system is constant; energy can be transformed from one form to another but neither created nor destroyed.

Second law: The entropy of an isolated system never decreases; isolated systems spontaneously evolve towards thermodynamic equilibrium, the state of maximum entropy.

Third law: The entropy of a system approaches a constant value as its temperature approaches absolute zero.
There have been suggestions of additional laws, but none of them achieve the generality of the four accepted laws, and they are not mentioned in standard textbooks.^{[1]}^{[2]}^{[3]}^{[4]}^{[5]}^{[7]}^{[8]}
The laws of thermodynamics are fundamental laws of physics and are applicable in the other natural sciences as well.
The zeroth law of thermodynamics may be stated in the following form:
If two systems are both in thermal equilibrium with a third then they are in thermal equilibrium with each other.^{[1]}
The law is intended to allow the existence of an empirical parameter, the temperature, as a property of a system such that systems in thermal equilibrium with each other have the same temperature. The law as stated here is compatible with the use of a particular physical body, for example a mass of gas, to match temperatures of other bodies, but does not justify regarding temperature as a quantity that can be measured on a scale of real numbers.
Though this version of the law is one of the more commonly stated, it is only one of a diversity of statements that are labeled as "the zeroth law" by competent writers. Some statements go further and supply the important physical fact that temperature is one-dimensional: that one can conceptually arrange bodies in a real-number sequence from colder to hotter.^{[9]}^{[10]}^{[11]} Perhaps there exists no unique "best possible statement" of the "zeroth law", because the literature contains a range of formulations of the principles of thermodynamics, each of which calls for its own appropriate version of the law.
Although these concepts of temperature and of thermal equilibrium are fundamental to thermodynamics and were clearly stated in the nineteenth century, the desire to explicitly number the above law was not widely felt until Fowler and Guggenheim did so in the 1930s, long after the first, second, and third law were already widely understood and recognized. Hence it was numbered the zeroth law. The importance of the law as a foundation to the earlier laws is that it allows the definition of temperature in a non-circular way without reference to entropy, its conjugate variable. Such a temperature definition is said to be 'empirical'.^{[12]}^{[13]}^{[14]}^{[15]}^{[16]}^{[17]}
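The equivalence-relation character of thermal equilibrium that the zeroth law supplies can be sketched in code. The following is an illustrative example, not from the article: pairwise equilibrium observations partition systems into classes, and an empirical temperature is simply a label shared by each class (the class structure is implemented here with a standard union-find, a choice of this sketch).

```python
# Sketch (not from the article): the zeroth law makes "is in thermal
# equilibrium with" an equivalence relation, so pairwise observations
# partition systems into classes, each of which can carry one empirical
# temperature label. Union-find maintains that partition.

class ThermalClasses:
    def __init__(self):
        self.parent = {}

    def _find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def observe_equilibrium(self, a, b):
        """Record the observation that systems a and b are in thermal equilibrium."""
        self.parent[self._find(a)] = self._find(b)

    def same_temperature(self, a, b):
        """By the zeroth law, all members of one class share a temperature."""
        return self._find(a) == self._find(b)

t = ThermalClasses()
t.observe_equilibrium("A", "C")      # A ~ C
t.observe_equilibrium("B", "C")      # B ~ C (C plays the thermometer)
print(t.same_temperature("A", "B"))  # True -- the zeroth law's conclusion
```

The third body "C" acts exactly as the thermometer in the statement of the law: it is never compared with both A and B simultaneously, yet the transitivity the law asserts lets us conclude that A and B share a temperature.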
The first law of thermodynamics may be stated in several ways. One common statement: the increase in internal energy of a closed system is equal to the heat supplied to the system minus the work done by the system on its surroundings,

ΔU = Q − W.
More specifically, the First Law encompasses several principles:
For a process in a closed system in which energy is transferred only as heat,

ΔU = Q,

where Q denotes the amount of energy transferred into the system as heat.
When matter is transferred into a system, the internal energy carried with that matter contributes

u_{external} ΔM

to the change in the system's internal energy, where u_{external} denotes the internal energy per unit mass of the transferred matter, measured when it is still in the surroundings, before transfer; and ΔM denotes the transferred mass.
Combining these principles leads to one traditional statement of the first law of thermodynamics: it is not possible to construct a machine which will perpetually output work without an equal amount of energy input to that machine. Or more briefly, a perpetual motion machine of the first kind is impossible.
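The bookkeeping behind these principles can be made concrete. Below is a minimal sketch (the function name and the sign conventions are assumptions of this example, with Q counted into the system and W counted as work done by the system, matching the statements above):

```python
# Sketch of first-law bookkeeping for an open system (sign conventions
# are an assumption of this example):
#   dU = Q - W + u_external * dM
# Q          : heat transferred INTO the system
# W          : work done BY the system on the surroundings
# u_external : internal energy per unit mass of transferred matter
# dM         : mass transferred into the system

def delta_internal_energy(Q, W, u_external=0.0, dM=0.0):
    return Q - W + u_external * dM

# Closed system: 100 J of heat in, 40 J of work out.
print(delta_internal_energy(Q=100.0, W=40.0))  # 60.0

# A "perpetual motion machine" would output work (W > 0) with no input
# (Q = 0, dM = 0), so its internal energy would decrease without bound:
print(delta_internal_energy(Q=0.0, W=25.0))  # -25.0
```

The second call shows why the machine is impossible: with no energy input, every unit of work output depletes the internal energy, which cannot decrease forever.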
The second law of thermodynamics asserts the irreversibility of natural processes, and the tendency of natural processes to lead towards spatial homogeneity of matter and energy, and especially of temperature. It can be formulated in a variety of interesting and important ways.
It implies the existence of a quantity called the entropy of a thermodynamic system. In terms of this quantity it implies that
When two initially isolated systems in separate but nearby regions of space, each in thermodynamic equilibrium with itself but not necessarily with each other, are then allowed to interact, they will eventually reach a mutual thermodynamic equilibrium. The sum of the entropies of the initially isolated systems is less than or equal to the total entropy of the final combination. Equality occurs just when the two original systems have all their respective intensive variables (temperature, pressure) equal; then the final system also has the same values.
This statement of the law recognizes that in classical thermodynamics, the entropy of a system is defined only when it has reached its own internal thermodynamic equilibrium.
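The statement about combining two systems can be checked numerically in a simple special case. The following sketch (numbers and the constant-heat-capacity model are illustrative assumptions, not from the article) takes two identical bodies with constant heat capacity C at temperatures T1 and T2; on contact they reach Tf = (T1 + T2)/2, and integrating dS = C dT / T for each body gives the total entropy change:

```python
import math

# Sketch (illustrative model): two identical bodies of constant heat
# capacity C, initially at T1 and T2 (kelvin), are brought into contact
# and reach Tf = (T1 + T2) / 2. Each body's entropy change is the
# integral of C dT / T, i.e. C * ln(T_final / T_initial).

def entropy_change_on_mixing(C, T1, T2):
    Tf = (T1 + T2) / 2.0
    return C * math.log(Tf / T1) + C * math.log(Tf / T2)

print(entropy_change_on_mixing(C=10.0, T1=300.0, T2=400.0))  # positive
print(entropy_change_on_mixing(C=10.0, T1=350.0, T2=350.0))  # 0.0
```

This reproduces the law's equality condition: the total entropy change is strictly positive whenever the intensive variable (temperature) differs between the two systems, and zero exactly when it is already equal.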
The second law refers to a wide variety of processes, reversible and irreversible. All natural processes are irreversible. Reversible processes are a convenient theoretical fiction and do not occur in nature.
A prime example of irreversibility is in the transfer of heat by conduction or radiation. It was known long before the discovery of the notion of entropy that when two bodies initially of different temperatures come into thermal connection, then heat always flows from the hotter body to the colder one.
The second law tells also about kinds of irreversibility other than heat transfer, for example those of friction and viscosity, and those of chemical reactions. The notion of entropy is needed to provide that wider scope of the law.
According to the second law of thermodynamics, in a theoretical and fictional reversible heat transfer, an element of heat transferred, δQ, is the product of the temperature (T), both of the system and of the source or destination of the heat, with the increment (dS) of the system's conjugate variable, its entropy (S):

δQ = T dS.
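Applying dS = δQ/T to the conduction example makes the irreversibility quantitative. In this sketch (values illustrative, reservoirs assumed large enough that their temperatures stay fixed), heat Q leaves a hot reservoir at T_hot and enters a cold one at T_cold:

```python
# Sketch (illustrative values): entropy produced when heat Q flows by
# conduction from a hot reservoir at T_hot to a cold one at T_cold.
# Each reservoir is assumed large enough that its temperature is fixed.

def conduction_entropy_production(Q, T_hot, T_cold):
    dS_hot = -Q / T_hot    # hot reservoir loses heat Q at T_hot
    dS_cold = Q / T_cold   # cold reservoir gains the same heat at T_cold
    return dS_hot + dS_cold

print(conduction_entropy_production(Q=1000.0, T_hot=500.0, T_cold=300.0))
# positive: about 1.33 J/K
```

The total is positive whenever T_hot > T_cold, which is the second law's way of saying the process only runs in that direction; reversing it would require the total entropy to decrease.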
Entropy may also be viewed as a physical measure of the lack of physical information about the microscopic details of the motion and configuration of a system, when only the macroscopic states are known. The law asserts that for two given macroscopically specified states of a system, there is a quantity called the difference of information entropy between them. This information entropy difference defines how much additional microscopic physical information is needed to specify one of the macroscopically specified states, given the macroscopic specification of the other - often a conveniently chosen reference state which may be presupposed to exist rather than explicitly stated. A final condition of a natural process always contains microscopically specifiable effects which are not fully and exactly predictable from the macroscopic specification of the initial condition of the process. This is why entropy increases in natural processes - the increase tells how much extra microscopic information is needed to distinguish the final macroscopically specified state from the initial macroscopically specified state.^{[18]}
The third law of thermodynamics is sometimes stated as follows:

The entropy of a perfect crystal of any pure substance approaches zero as the temperature approaches absolute zero.
At zero temperature the system must be in the state with the minimum thermal energy. This statement holds true if the perfect crystal has only one state with minimum energy. Entropy is related to the number of possible microstates according to

S = k_{B} ln Ω,

where S is the entropy of the system, k_{B} Boltzmann's constant, and Ω the number of microstates (e.g. possible configurations of atoms). At absolute zero there is only one possible microstate (Ω = 1, since for a pure substance all the atoms are identical and a perfect crystal admits only one arrangement), and ln(1) = 0, so S = 0.
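The Boltzmann relation between entropy and microstate count is easy to evaluate directly. A minimal sketch, using the exact SI value of the Boltzmann constant:

```python
import math

# Sketch: the Boltzmann relation S = k_B * ln(Omega), with k_B in J/K.
# A perfect crystal at absolute zero has exactly one microstate, so S = 0.

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(omega):
    """Entropy in J/K of a system with `omega` equally likely microstates."""
    return K_B * math.log(omega)

print(boltzmann_entropy(1))      # 0.0  -- perfect crystal at T = 0
print(boltzmann_entropy(2) > 0)  # True -- any degeneracy gives S > 0
```

The second line previews the more general form of the law below: a system that retains more than one accessible microstate at absolute zero keeps a nonzero entropy.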
A more general form of the third law applies to systems, such as a glass, that may have more than one microscopically distinct minimum-energy state, or that may have a microscopically distinct state "frozen in" which is not strictly a minimum-energy state and is not, strictly speaking, a state of thermodynamic equilibrium at absolute zero temperature:

The entropy of a system approaches a constant value as its temperature approaches absolute zero.
The constant value (not necessarily zero) is called the residual entropy of the system.
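A classic worked example of residual entropy, not discussed in the article but well established, is Pauling's estimate for ordinary ice: proton disorder leaves roughly (3/2)^N microstates frozen in as T → 0, giving a residual molar entropy of R ln(3/2), close to the measured value.

```python
import math

# Sketch (example not in the article): Pauling's estimate of the residual
# entropy of ice. Proton disorder leaves Omega ~ (3/2)^N microstates
# frozen in at T -> 0, so per mole of H2O:
#   S_residual = R * ln(3/2)

R = 8.314462618  # molar gas constant, J/(mol K)

residual_entropy_ice = R * math.log(3 / 2)
print(round(residual_entropy_ice, 2))  # 3.37  (J/(mol K))
```

This is precisely the "constant value (not necessarily zero)" of the general statement: the entropy of ice approaches about 3.37 J/(mol K) per mole, not zero, as the temperature approaches absolute zero.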
Circa 1797, Count Rumford (born Benjamin Thompson) showed that endless mechanical action can generate indefinitely large amounts of heat from a fixed amount of working substance, thus challenging the caloric theory of the time, which held that a fixed amount of working substance could contain only a finite amount of caloric heat/energy. The first established thermodynamic principle, which eventually became the second law of thermodynamics, was formulated by Sadi Carnot in 1824. By 1860, as formalized in the works of Rudolf Clausius and William Thomson, two established principles of thermodynamics had evolved, the first principle and the second principle, later restated as thermodynamic laws. By 1873, for example, the thermodynamicist Josiah Willard Gibbs, in his memoir Graphical Methods in the Thermodynamics of Fluids, clearly stated the first two absolute laws of thermodynamics.

Some textbooks throughout the 20th century numbered the laws differently. In some fields removed from chemistry, the second law was considered to deal only with the efficiency of heat engines, whereas what was called the third law dealt with entropy increases; directly defining zero points for entropy calculations was not itself considered a law. Gradually, this separation was merged into the second law, and the modern third law became widely adopted.