# Statistical Mechanics Lecture 2 | Summary and Q&A

April 24, 2013
by Stanford

## Summary

In this video, the speaker discusses the Boltzmann constant and temperature, explaining their definitions and connections. He describes how temperature is a derived quantity and introduces the concept of entropy. The speaker also explains the relationship between energy and temperature, noting that for ordinary systems entropy is a monotonically increasing function of energy. He then defines temperature in terms of energy and entropy, showing how they are related.

### Q: What is the connection between the Boltzmann constant and temperature?

The Boltzmann constant, like other constants in physics, is a conversion factor. It helps us convert between human units and more fundamental units. In the case of temperature, the Boltzmann constant acts as a conversion factor from human units to energy units. Temperature is related to the kinetic energy of molecules in a gas, and the Boltzmann constant allows us to measure and compare temperature in terms of energy.
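The conversion the answer describes can be made concrete with a minimal sketch. The constant's value is the exact SI definition; the choice of 300 K as an example temperature is ours:

```python
# Boltzmann's constant as a conversion factor: kelvin -> joules.
K_B = 1.380649e-23  # J/K (exact by the 2019 SI definition)

def thermal_energy_joules(temp_kelvin):
    """Characteristic thermal energy k_B * T for a temperature in kelvin."""
    return K_B * temp_kelvin

room = thermal_energy_joules(300.0)
print(f"k_B * 300 K = {room:.3e} J")  # roughly 4.14e-21 J, about 1/40 eV
```

The tiny number in joules is exactly the point: human temperature units sit at an awkward scale relative to molecular energies, and k_B bridges the two.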

### Q: Why is temperature typically measured in Kelvin units or centigrade?

Temperature is measured in Kelvin units because the Kelvin scale starts at absolute zero. Centigrade is another name for Celsius, a scale anchored to the freezing and boiling points of water, which are arbitrary choices. The two scales differ only by a fixed offset, so a change of one kelvin is the same as a change of one degree Celsius. The choice of water as the reference substance for temperature is convenient, but any other substance could have served.
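The fixed-offset relationship between the two scales is just:

```python
def celsius_to_kelvin(t_c):
    """Celsius and kelvin differ only by a fixed offset of 273.15."""
    return t_c + 273.15

print(celsius_to_kelvin(0.0))    # freezing point of water: 273.15 K
print(celsius_to_kelvin(100.0))  # boiling point of water: 373.15 K
```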

### Q: What are human units of temperature?

Human units of temperature are units that were invented for convenience in measuring and understanding temperature. They are easy to measure with human-scale thermometers, which are themselves made of enormous numbers of atoms. The kelvin is such a human construct, invented for convenience in measuring temperature changes: a practical reference scale whose increments can be easily measured and compared.

### Q: Why is temperature a measure of energy?

Temperature is a measure of energy because the temperature of a gas, for example, determines the kinetic energy of its molecules. The temperature of a gas can be viewed as the average kinetic energy of its molecules. Although individual molecules move at different speeds, in thermal equilibrium at a fixed temperature every molecule has the same average kinetic energy, regardless of its mass. Thus, temperature is a measure of the amount of energy present in a system.
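A small sketch of this point, using the standard equipartition result for translational motion (the particular masses and the 300 K temperature are our illustrative choices):

```python
import math

K_B = 1.380649e-23  # J/K

def mean_kinetic_energy(temp_kelvin):
    """Equipartition: average translational KE per molecule = (3/2) k_B T,
    independent of the molecule's mass."""
    return 1.5 * K_B * temp_kelvin

def rms_speed(temp_kelvin, mass_kg):
    """Speed does depend on mass: (1/2) m v_rms^2 = (3/2) k_B T."""
    return math.sqrt(3.0 * K_B * temp_kelvin / mass_kg)

T = 300.0
m_he = 6.646e-27   # helium atom mass, kg
m_xe = 2.180e-25   # xenon atom mass, kg
print(mean_kinetic_energy(T))                  # same for both gases
print(rms_speed(T, m_he), rms_speed(T, m_xe))  # light helium moves much faster
```

At the same temperature the two gases share one average kinetic energy, which is exactly why temperature, not speed, is the universal quantity.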

### Q: How is temperature defined in terms of energy and entropy?

Temperature is defined as the rate of change of energy with respect to entropy. It represents the amount of energy required to change the entropy of a system by a certain amount. This relationship can be expressed as dE = T dS, where dE is the change in energy, dS is the change in entropy, and T is the temperature. Alternatively, it can be written as dS = (1/T) dE. This form shows that the inverse temperature 1/T is the rate of change of entropy with respect to energy.

### Q: Why does the Boltzmann constant cancel out in the definition of temperature?

The Boltzmann constant drops out of the definition of temperature when consistent units are used. If entropy is defined as a pure number (the logarithm of the number of states) and temperature is measured in energy units, the product in dE = T dS is the same as in conventional units, where entropy carries a factor of the Boltzmann constant and temperature is measured in kelvin: the factor of k in the conventional entropy is exactly absorbed by converting temperature from kelvin to energy units. This cancellation is why the Boltzmann constant often does not appear explicitly in temperature-related formulas in standard thermodynamics textbooks.

### Q: How does the entropy of a system change with the temperature?

The entropy of a system changes with temperature because it is directly related to the energy and the probability distribution of the system's states. When the temperature increases, the probability distribution of the states spreads out, leading to a higher entropy. Conversely, if the temperature decreases, the probability distribution becomes more focused, resulting in a lower entropy. This relationship between temperature and entropy allows us to understand how energy and randomness are connected.
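This spreading of the probability distribution can be sketched with a small Boltzmann-distributed system. The four energy levels and the three temperatures are illustrative choices, in units where k_B = 1:

```python
import math

def gibbs_entropy(probs):
    """Dimensionless Gibbs entropy S = -sum p ln p."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def boltzmann_probs(energies, temp):
    """Boltzmann distribution p_i proportional to exp(-E_i / T),
    with energies and T in the same arbitrary units (k_B = 1)."""
    weights = [math.exp(-e / temp) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

levels = [0.0, 1.0, 2.0, 3.0]  # hypothetical energy levels
for T in (0.5, 1.0, 5.0):
    print(T, gibbs_entropy(boltzmann_probs(levels, T)))
# the printed entropy grows with T as probability spreads over more states
```

At low temperature nearly all probability sits in the ground state (low entropy); at high temperature the distribution approaches uniform over the levels (maximum entropy), matching the qualitative picture above.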

### Q: What happens when two systems of different temperatures are connected?

When two systems at different temperatures are connected, energy flows from the hotter system to the cooler one. This is a fundamental property of temperature and is known as heat transfer. In thermal equilibrium the net flow of energy stops: the temperatures become equal and the two systems reach a stable, balanced state. The direction of energy transfer is set by the temperature difference, with energy flowing from the higher-temperature system to the lower-temperature system.
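The direction of flow follows from the entropy bookkeeping. For a small heat transfer dq between two reservoirs whose temperatures stay effectively fixed (the particular temperatures below are illustrative), the total entropy change is:

```python
def total_entropy_change(dq, t_hot, t_cold):
    """Entropy change when heat dq flows from a reservoir at t_hot to one
    at t_cold: the hot side loses dq/t_hot, the cold side gains dq/t_cold."""
    return -dq / t_hot + dq / t_cold

print(total_entropy_change(1.0, 400.0, 300.0))   # positive: hot-to-cold is allowed
print(total_entropy_change(-1.0, 400.0, 300.0))  # negative: cold-to-hot is forbidden
```

Because 1/t_cold exceeds 1/t_hot, total entropy rises only when heat moves from hot to cold; the reverse flow would decrease entropy, which is why it does not happen spontaneously.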

### Q: What does it mean for entropy to be additive?

Entropy is additive because it is a logarithmic measure of probabilities, and the probabilities of independent systems multiply. When two systems are combined, the total entropy of the combined system is the sum of the entropies of the individual systems. This additivity property allows us to analyze the entropy of complex systems by adding up the entropies of their components. It is also part of the bookkeeping behind the statement that the total entropy of an isolated system, which exchanges no energy or matter with its surroundings, tends to increase over time.
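The log-of-a-product argument can be verified directly on two small, independent systems (the particular probability values are arbitrary examples):

```python
import math

def gibbs_entropy(probs):
    """Dimensionless entropy S = -sum p ln p."""
    return -sum(p * math.log(p) for p in probs if p > 0)

p_a = [0.5, 0.5]        # system A: two equally likely states
p_b = [0.7, 0.2, 0.1]   # system B: three states, unequal probabilities

# Joint distribution of the combined, non-interacting system:
# independent probabilities multiply, so their logarithms add.
p_ab = [pa * pb for pa in p_a for pb in p_b]

print(gibbs_entropy(p_a) + gibbs_entropy(p_b))
print(gibbs_entropy(p_ab))  # the same number: entropy is additive
```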

### Q: How does the change in energy relate to temperature and entropy when two systems equilibrate?

When two systems equilibrate, the change in energy is related to the temperature and entropy changes of each system. In the case of system A, the change in energy is equal to the temperature of system A times the change in entropy of system A. Similarly, for system B, the change in energy is equal to the temperature of system B times the change in entropy of system B. This relationship shows the close connection between energy, temperature, and entropy in describing the behavior of systems in thermal equilibrium.
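The bookkeeping above can be written compactly. If a small amount of energy dE_A flows into system A from system B, then applying dE = T dS to each system separately gives:

```latex
% Energy conservation: what A gains, B loses.
dE_A = -dE_B

% Each system's entropy change follows from dE = T\,dS:
dS_A = \frac{dE_A}{T_A}, \qquad dS_B = \frac{dE_B}{T_B} = -\frac{dE_A}{T_B}

% Total entropy change of the combined system:
dS_{\mathrm{total}} = dS_A + dS_B
                    = dE_A\left(\frac{1}{T_A} - \frac{1}{T_B}\right)
```

The total entropy change is positive exactly when energy flows into the cooler system (dE_A > 0 when T_A < T_B), and it vanishes when T_A = T_B, which is the condition for thermal equilibrium.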

## Takeaways

Temperature is a derived quantity that measures the rate of change of energy with respect to entropy. It provides a measure of how much energy needs to be exchanged to change the entropy of a system by a certain amount. The Boltzmann constant is involved in the conversion between human units and more fundamental energy units. The relationship between temperature and entropy allows us to understand heat transfer and the equilibration of systems at different temperatures. Additionally, entropy is additive, and the change in energy during equilibration depends on the temperatures and entropy changes of the individual systems.