A Macroscopic View of Chemical Processes: Deciding on
the Feasibility of a Chemical Process.
The Second Law of Thermodynamics:
In any spontaneous change the entropy of the universe
increases.
ΔS(universe) > 0 for a spontaneous change.
ΔS(universe) = 0 for no change (i.e. equilibrium).
Note: ΔS = S2 − S1 = S(final state) − S(initial state)
Other statements of the Second Law:
All systems tend to approach a state of equilibrium.
Entropy is time's arrow.
Every system which is left to itself will, on average,
change toward a condition of maximum probability.
Things are getting more screwed up every day.
You can't break even.
Universe: everything.
System: The little bit of the universe we are
interested in.
Open System: A system which allows both matter
and energy to flow across its boundaries.
Isolated System: A system which allows nothing
to cross its boundaries.
Closed System: A system which allows energy
but not matter to cross its boundaries.
State Function: A measurement of the state of
the system which is independent of the history of the system (i.e. the
path by which the current state of the system was attained).
Examples of State Functions: E (internal energy); P (pressure);
V (volume); T (temperature); n (amount of matter); S (entropy).
Functions which are not State Functions: t (time).
Extensive Functions (dependent on how much material
is present): E, V, n, S
Intensive Functions (independent of the amount of material present): P, T.
Physical significance of entropy: randomness; disorder; chaos.
Second Law of Thermodynamics (The driving law for our
universe):
To determine if a chemical process will occur or not, evaluate ΔS(universe):
If ΔS(universe) is positive, the process may occur.
If it is negative, the process may occur in the opposite direction.
If it is 0, the process is at equilibrium.
Randomness can be measured in terms of the number of ways
of organizing a system: the more ways, the more random.
Boltzmann (1896) produced a relationship for measuring entropy in terms of the number of ways of organizing a system, w:
S = k ln(w)
Unfortunately, the number of ways of organizing the very large
number of particles of matter and the quanta of energy present in a measurable
sample is impossible to estimate.
In practical terms we must find a measurable function
that is related to the ways of organizing a system.
However, the definition using a logarithmic expression
means that entropy is additive. If two systems are brought together their
total entropy will be the sum of their individual entropies.
Try it with two systems each with two ways of organizing:
Put them together and there are 4 ways of organizing them:
ΔS(both) = ΔS(one) + ΔS(two)
         = k ln(2) + k ln(2)
         = k ln(2 × 2)
         = k ln(4)
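The additivity follows directly from S = k ln(w), since the number of ways multiplies when systems are combined. A quick numerical check in Python (the helper function name is just for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w):
    """Entropy from the number of ways of organizing a system: S = k ln(w)."""
    return k_B * math.log(w)

# Two systems, each with 2 ways of organizing.
s_one = boltzmann_entropy(2)
s_two = boltzmann_entropy(2)

# Brought together there are 2 * 2 = 4 ways of organizing them,
# and because ln(2 * 2) = ln(2) + ln(2), the entropies add.
s_both = boltzmann_entropy(2 * 2)
```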
Entropy is additive, so consider the system and the rest of the universe:
Consider a closed system.
Important: all discussions of systems from here on are for closed systems only.
Only energy can cross its boundary. Thus, when the system
changes, only ΔE can affect ΔS(surroundings).
Some measurement involving E, the internal energy, might
give a measurement of S, the entropy.
Randomness of potential energy is possible because of
the quantization of energy. The effect of successively adding 1 unit of
energy to a system's randomness can be calculated in a simple case. (See
diagram on next sheet)
This example shows that the magnitude of the entropy change
when a constant amount of energy is added depends on the initial energy
content such that the higher the energy content, the lower the change
in S when more energy is added.
Another way of seeing the same effect is to consider the experimental observation:
"heat always flows from the hotter to the colder body."
Consider two identical pieces of matter, one at 290 K,
the other at 300 K. Bring these together and there is a spontaneous flow
of heat from the hotter to the colder until the two temperatures become
equal at 295 K.
Since this was spontaneous, ΔS(universe) > 0.
Using the additivity of entropy: ΔS(universe) = ΔS(hot) + ΔS(cold) > 0.
But energy conservation rules (First Law of Thermodynamics)
indicate that the energy change for each must be equal (though opposite
in sign).
Thus the same amount of energy change has caused a greater
change in entropy in the colder body (lower initial energy content) than
in the hotter (higher initial energy content).
The initial energy content depends on the temperature.
Thus the entropy change depends on ΔE and on 1/T.
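The 290 K / 300 K example can be checked numerically. A sketch, assuming each body has a constant heat capacity C (the value 100 J/K is made up for illustration) and using ΔS = C ln(T_final/T_initial) for heating or cooling at constant heat capacity:

```python
import math

# Two identical bodies brought into thermal contact.
C = 100.0        # J/K, an assumed heat capacity for illustration
T_cold = 290.0   # K
T_hot = 300.0    # K
T_final = (T_cold + T_hot) / 2   # 295 K for identical bodies

# For a body with constant heat capacity, dS = C dT / T,
# so ΔS = C ln(T_final / T_initial).
dS_cold = C * math.log(T_final / T_cold)   # positive: gains energy
dS_hot = C * math.log(T_final / T_hot)     # negative: loses energy

# The same magnitude of energy flowed out of the hot body as into the
# cold one, but the colder (lower energy content) body gains more
# entropy than the hotter one loses, so ΔS(universe) > 0.
dS_univ = dS_cold + dS_hot
```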
For a practical measurement of entropy change we define:
ΔS(surroundings) = −ΔE(system) / T
Substituting into ΔS(universe) = ΔS(system) + ΔS(surroundings):
ΔS(universe) = ΔS(system) − ΔE(system) / T
(Why −ΔE/T? Energy lost by the system is gained by the surroundings, so ΔE(surroundings) = −ΔE(system).)
Simplify the way this is written by removing the subscript
"system": whenever a change is written without a subscript, it is the change
in the system's function that is intended.
Focus for a while on the ΔE term.
Measuring Energy Changes:
First Law of Thermodynamics:
Energy cannot be created or destroyed; it may be changed
from one form to another.
ΔE = q + w
where q is the heat added to the system
and w is the work done on the system.
Exercises:
1. Calculate ΔE if 500 J of
heat are added as the system does 1000 J of work.
2. A system loses 25.5 kJ of heat as it does 55.6 kJ of
work. What is ΔE?
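The two exercises come down to sign bookkeeping with ΔE = q + w (q = heat added to the system, w = work done on the system). A worked sketch:

```python
# First Law: ΔE = q + w, with q = heat added TO the system
# and w = work done ON the system.

def delta_E(q, w):
    return q + w

# Exercise 1: 500 J of heat added; the system DOES 1000 J of work,
# so the work done ON the system is -1000 J.
dE1 = delta_E(q=500.0, w=-1000.0)   # -500 J

# Exercise 2: the system loses 25.5 kJ of heat and does 55.6 kJ of work.
dE2 = delta_E(q=-25.5, w=-55.6)     # -81.1 kJ
```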
Other ways of stating the First Law:
The energy of the universe is constant.
You can't get something for nothing.
There is no such thing as a free lunch.
Note that q and w are not state functions. When a system changes its energy, the q and the w depend on the path, though the total change in energy does not.
See the text for this.
In chemical systems we ignore work due to gravitational
and macroscale magnetic and electrical changes.
The only work we include is that due to expansion/contraction of a gas. (Ignore solids and liquids.)
work done by system = −w
                    = force × distance moved
                    = (pressure × area) × distance moved
                    = pressure × (area × distance moved)
                    = pressure × (volume change)
                    = P(ext) × (V2 − V1)
                    = P(ext) × ΔV
So: w = −P(ext) ΔV
Note: If P is not constant, w = −∫P dV
Measuring DE for a Chemical
System.
1. For a system at constant volume (ΔV = 0): ΔE = q(v)
A measure of the heat change at constant volume directly
gives ΔE.
2. For a system at constant pressure: ΔE = q(p) − PΔV
Rearranging this equation: q(p) = ΔE + PΔV
Thus a measure of the heat change at constant P gives ΔE + PΔV.
Definition: ΔH = ΔE + PΔV = q(p).
H is termed the enthalpy.
Enthalpy is the constant pressure energy term for a chemical
system.
Constant pressure experiments are much easier and considerably
less costly and dangerous to conduct than are constant volume experiments.
Thus H is used extensively for chemical systems.
Remember:
H is a constant pressure term which includes any pressure/volume (PV) work: H = E + PV.
For q(v) (ΔE): use a bomb calorimeter.
For q(p) (ΔH): use a "coffee cup" calorimeter.
Example 1.
The liquid in a bomb calorimeter was heated for 568 s with a 15.00-watt heater; the temperature in the calorimeter rose 2.00 °C. (1 watt = 1 J/s)
A 0.455-g sample of sucrose, C12H22O11, was burned in excess oxygen in the calorimeter; the temperature increased from 24.49 °C to 26.25 °C.
Calculate ΔE per mole for sucrose.
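A sketch of the arithmetic for Example 1 (the molar mass of sucrose, 342.3 g/mol, is supplied here; it is not in the problem statement):

```python
# Calibration: electrical heating determines the calorimeter constant.
power = 15.00    # W (J/s)
time_s = 568.0   # s
dT_cal = 2.00    # °C
C_cal = power * time_s / dT_cal   # J/°C

# Combustion of the sucrose sample (C12H22O11, M = 342.3 g/mol).
mass = 0.455        # g
M_sucrose = 342.3   # g/mol
dT = 26.25 - 24.49  # °C
q_released = C_cal * dT   # J absorbed by the calorimeter

# Constant volume, so q_v = ΔE; heat released makes ΔE negative.
dE_per_mol = -q_released / (mass / M_sucrose) / 1000   # about -5640 kJ/mol
```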
Example 2.
A 1.000-g sample of benzoic acid, 122 g/mol, ΔH(combustion) = −3227 kJ/mol, was burned in a "coffee-cup" calorimeter; a temperature change from 23.00 °C to 25.57 °C was observed.
A 2.000-g sample of citric acid, C6H8O7, was burned in the same calorimeter and a temperature change from 23.00 °C to 24.66 °C was observed. Calculate ΔH and ΔE for the combustion of one mole of citric acid:
2 C6H8O7 + 9 O2 → 12 CO2 + 8 H2O.
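A sketch of the arithmetic for Example 2. The molar mass of citric acid (192.1 g/mol), the gas constant, and the Δn(gas) correction linking ΔH to ΔE are supplied here as standard values, not taken from the problem statement:

```python
# Calibration with benzoic acid (M = 122 g/mol, ΔH_comb = -3227 kJ/mol).
n_benzoic = 1.000 / 122.0        # mol burned
q_benzoic = n_benzoic * 3227.0   # kJ released
dT_benzoic = 25.57 - 23.00       # °C
C_cal = q_benzoic / dT_benzoic   # kJ/°C

# Citric acid sample (C6H8O7, M = 192.1 g/mol).
M_citric = 192.1
n_citric = 2.000 / M_citric
dT_citric = 24.66 - 23.00        # °C
q_citric = C_cal * dT_citric     # kJ released

# Constant pressure, so the heat released is -ΔH.
dH_per_mol = -q_citric / n_citric   # about -1641 kJ/mol

# ΔH = ΔE + Δn(gas)·R·T. Per mole: C6H8O7(s) + 4.5 O2 -> 6 CO2 + 4 H2O(l),
# so Δn(gas) = 6 - 4.5 = 1.5.
R = 8.314e-3   # kJ/(mol·K)
T = 298.0      # K
dE_per_mol = dH_per_mol - 1.5 * R * T
```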
There is no known zero of energy. Absolute values for
energy cannot be determined.
Energy is a state function, so ΔE is independent of the path by which a process is carried out.
ΔE(a→b) = ΔE(a→c) + ΔE(c→b)
ΔH(a→b) = ΔH(a→c) + ΔH(c→b)
(Hess's Law of Heat Summation)
If the state C is defined to be the same for all chemical
reactions, then it becomes possible to devise a method to tabulate values
of enthalpy.
C is best defined as the elements in their standard states.
(The standard state is defined as the state at 298 K and 100 kPa.)
C may also be defined as the atomic elements in the gas phase at 298 K.
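With the elements in their standard states as the common state C, tabulated standard enthalpies of formation let Hess's Law give any reaction enthalpy. A sketch for the combustion of methane (the ΔH(f) values are standard literature numbers in kJ/mol, not from this handout):

```python
# Hess's Law via standard enthalpies of formation (kJ/mol, 298 K):
# ΔH(rxn) = Σ ΔH(f, products) - Σ ΔH(f, reactants).
# Elements in their standard states (here O2) have ΔH(f) = 0 by definition.
dHf = {
    "CH4(g)": -74.8,
    "O2(g)": 0.0,
    "CO2(g)": -393.5,
    "H2O(l)": -285.8,
}

# CH4(g) + 2 O2(g) -> CO2(g) + 2 H2O(l)
reactants = [("CH4(g)", 1), ("O2(g)", 2)]
products = [("CO2(g)", 1), ("H2O(l)", 2)]

dH_rxn = (sum(n * dHf[s] for s, n in products)
          - sum(n * dHf[s] for s, n in reactants))   # about -890 kJ/mol
```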