Práctica 3 – Thermodynamics (2015)
Entropy and free energy
Concepción Fernández Blanco
Grau en Enginyeria Biomèdica
INTRODUCTION TO THE LABWORK
Before explaining how this labwork was carried out, it is convenient to introduce some concepts related to it.
First of all, our experiment is deeply connected with the Gibbs free energy (given by ΔG = ΔH − TΔS) and with entropy
(S), which is a “measure of the disorder of the system”, or a “measure of how far the system has progressed towards equilibrium”.
(An important fact is that the entropy change is the same in reversible and irreversible processes between the same states, since entropy is a state function and does not depend on the intermediate states.)
Moreover, according to the second law of thermodynamics, an isolated system at constant conditions (of pressure, temperature, …) that starts with all its particles on one side will tend towards a disordered final state (since all systems tend to equilibrium), thereby increasing its entropy.
In our case, we will use the statistical-mechanics perspective to determine the physical state of our system* and to study its macrostates and microstates.
*It consists of a box with 6 dice inside, simulating ideal-gas particles in only two dimensions (that is, we assume the dice do not float or rise from the bottom of the box as a gas would), in which we will induce a rise in entropy by increasing the velocity of the box.
In order to calculate the exact value of the entropy of our system, we will use Boltzmann’s formula, S = k_B·ln(M), which establishes the relation between entropy and microstates*. We define a microstate as a detailed specification of a configuration of a microscopic thermodynamic system. The number of microstates is easily calculated with the formula M = N!/(n₁!·n₂!·…·n₆!), where N = 6 is the number of dice and each nᵢ is the number of dice showing the same value i (which, in our system, is the occupation of each macrostate).
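The two formulas above can be sketched in code. This is a minimal illustration, not part of the original labwork: a dice configuration is given as a list of face values, and `microstates` and `entropy` are hypothetical helper names.

```python
from math import factorial, log
from collections import Counter

K_B = 1.380649e-23  # Boltzmann constant, J/K

def microstates(dice):
    """M = N!/(n1!*n2!*...), where n_i counts the dice showing face i."""
    m = factorial(len(dice))
    for n in Counter(dice).values():
        m //= factorial(n)
    return m

def entropy(dice):
    """Boltzmann entropy S = k_B * ln(M)."""
    return K_B * log(microstates(dice))

# All six dice showing 1: M = 6!/6! = 1, so S = 0
print(microstates([1, 1, 1, 1, 1, 1]))  # → 1
# Fully disordered, all faces different: M = 6! = 720
print(microstates([1, 2, 3, 4, 5, 6]))  # → 720
```

Integer division (`//=`) is used because the multinomial count is always a whole number.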
Another formula we use is U = k_B·T, which helps us calculate the temperature of the system from its internal energy. That internal energy is the kinetic energy of the system, defined by E_k = ½·m·v². *We consider that the system has no potential energy at first, so the only energy we consider is the kinetic energy, which is therefore equal to the total internal energy.
As mentioned, calculating the Gibbs free energy is one of the required tasks of this labwork. We can define this energy as a thermodynamic potential that measures the “usefulness”, or process-initiating work, obtainable from a thermodynamic system at constant temperature and pressure.
ΔG = ΔH − TΔS, where ΔH is the change in enthalpy, T is the absolute temperature and ΔS represents the change in entropy.

METHOD

According to the tasks required, we place all the dice in the box, all of them showing the same number (1). After that, we move the system 20 times across a paper sheet (297 mm) and repeat this several times at increasing velocity (given by the displacement of 0.297 m divided by the time needed to move the system 20 times).
So, from the velocity (which is the same for every die, since they move with the box) we can calculate the kinetic energy of each die and then apply the formula shown above to calculate its temperature.
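The chain velocity → kinetic energy → temperature described above can be sketched as follows. The 1 g die mass matches the value used later for the potential energy; the 5 s timing is a hypothetical example, not a measured value from the report.

```python
K_B = 1.380649e-23    # Boltzmann constant, J/K
SHEET_LENGTH = 0.297  # m, length of the paper sheet
M_DIE = 0.001         # kg, mass of one die

def velocity(t_seconds):
    """Velocity of the box: sheet length over the time for the 20 movements."""
    return SHEET_LENGTH / t_seconds

def kinetic_energy(v):
    """E_k = (1/2) m v^2 for one die."""
    return 0.5 * M_DIE * v**2

def temperature(v):
    """From U = k_B*T with U taken as the kinetic energy: T = E_k / k_B."""
    return kinetic_energy(v) / K_B

v = velocity(5.0)  # hypothetical timing: 5 s for the 20 movements → 0.0594 m/s
print(v, kinetic_energy(v), temperature(v))
```

Each trial at a higher velocity simply repeats this chain with a smaller `t_seconds`.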
Moreover, we calculate the entropy. As shown, the entropy is 0 at first, since we did not apply enough energy (we did not move the system fast enough) for any die to gain the potential energy needed to turn over and change the number it shows (its macrostate).
* As we can see, when no die has turned, M = 6!/6! = 1.
Thus, there’s no change in entropy because ln1=0.
Moreover, we are asked to calculate the Gibbs free energy of the system, which is given by ΔG = ΔH − TΔS. However, we do not know ΔH, which is ΔH = ΔU + PΔV (with ΔU = k_B·ΔT, as said before).
As explained in the introduction, we consider our system to be perfectly isolated at constant conditions, which means that temperature and volume are always constant; therefore, the variation of temperature and the variation of volume are both equal to 0:

ΔT = 0 ⇒ ΔU = 0, and ΔV = 0, so ΔH = ΔU + PΔV = 0, and therefore ΔG = −TΔS.

So, in order to answer the final questions, we first present the table with all the data. [Table: all measured data, including the Gibbs free energy of each trial.] As shown there, the Gibbs free energy has been calculated using the formula explained before, so it is easily noticed that in the first cases the Gibbs free energy equals 0, since S = 0, because of the relation between macrostates and microstates.
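The relation ΔG = ΔH − TΔS with ΔH = 0 can be sketched as a small helper. The function name and the example values are illustrative, not measured data from the experiment.

```python
def gibbs_free_energy_change(T, delta_S):
    """ΔG = ΔH − T·ΔS; with ΔH = 0 for this isolated system, ΔG = −T·ΔS."""
    delta_H = 0.0
    return delta_H - T * delta_S

# Any positive entropy change gives ΔG < 0 at any positive temperature,
# while ΔS = 0 (no die has turned) gives ΔG = 0.
print(gibbs_free_energy_change(300.0, 1e-22))  # negative
print(gibbs_free_energy_change(300.0, 0.0))    # → 0.0
```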
An important fact is that these processes will always be spontaneous, because the Gibbs free energy is always negative (ΔS > 0 implies ΔG = −TΔS < 0).
Notice that the maximum level of entropy is reached when the system is completely disordered (no die shows the same number as any other). (Question 1) The activation energy represents the energy needed to go from the ordered state to the disordered state. In the case of a die, the energy required is the potential energy (Ep) needed to go from state a) to state b).
This activation energy is also shown in this graphic. It is important to emphasize that the system finally had enough energy to turn, thanks to the temperature reached, and finally shows a non-zero value of entropy (since the number shown on one die has finally changed, also thanks to the velocity). And since the kinetic energy is the only energy considered (as said before), its value is exactly the activation energy. The next graph required by the tasks shows entropy versus temperature: the entropy suddenly increases to a very high level compared to the previous values, which were equal to 0. [Graph: entropy vs. temperature.] The green point shows the entropy reached when the dice finally acquired their activation energy, evolving towards disordered states. Notice that this exact point is where the entropy suddenly increases.
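The maximum-entropy claim above can be checked with Boltzmann's formula: with all six dice showing different faces, M = 6! = 720. A minimal sketch:

```python
from math import factorial, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Fully disordered system: every die shows a different face, so M = 6! = 720
M_max = factorial(6)
S_max = K_B * log(M_max)  # maximum entropy of the six-dice system
print(M_max, S_max)       # 720 and about 9.08e-23 J/K
```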
(Question 2) In order to calculate the potential energy, it is convenient to use the formula Ep = m·g·Δh, where Δh can easily be obtained with trigonometry: when the 9 mm die tips onto an edge, the height of its centre rises from 4.5 mm to h = 9 mm · sin 45° = 6.36 mm, so Δh = h − 4.5 mm = 1.86 mm. According to the formula, Ep = 0.001 kg · 9.81 m/s² · 1.86·10⁻³ m = 1.82·10⁻⁵ J. As we calculated before, the activation energy is equal to the kinetic energy, which is equal to 1.2065· J. This proves that the kinetic and potential energies do not match each other (Ep < Ek).
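The Δh and Ep computation above can be verified numerically, using the same values as in the text (a 9 mm die of mass 1 g):

```python
from math import sin, radians

a = 9e-3    # m, edge length of the die
m = 0.001   # kg, mass of the die
g = 9.81    # m/s^2

h = a * sin(radians(45))  # centre height when tipped onto an edge: ~6.36 mm
delta_h = h - a / 2       # rise of the centre of mass: ~1.86 mm
E_p = m * g * delta_h     # potential barrier: ~1.82e-5 J
print(delta_h, E_p)
```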
This could be explained because the potential energy was calculated assuming an ideal, purely theoretical case.
On the other hand, the kinetic energy is much closer to reality than the potential energy, because it was measured in a non-ideal setting: the table where the experiment was done has a friction coefficient, so we had to apply more force than if there were no friction.
According to this fact, the kinetic energy needed to change between states is larger than the needed potential energy because it includes the work done against the friction force, while the potential energy was calculated using only the formula (a totally ideal case).
Even if friction were taken into account in both cases, the potential energy alone could not overcome the friction force, so an additional force (an extra acceleration) would have to be considered.
The sum of the potential energy and the work done by this additional force should then equal the kinetic energy.
Consequently, applying more force makes the change between states easier: the friction force is overcome more easily, leading to a faster way of reaching the activation energy.
(Context question) Before explaining the birth of the concept of entropy, we will introduce some people who gave thermodynamics a solid base from which to start studying this measure of disorder. The analysis that led to the concept of entropy began with the work of Lazare Carnot (1803), who proposed that in any machine the accelerations and shocks of the moving parts represent losses of “moment of activity”, which means that in any natural process there exists a dissipation of energy.
Twenty years later, Sadi Carnot (Lazare’s son) drew an analogy with how water falls in a water wheel, explaining that a relation exists between work and heat. This was an early insight into the second law of thermodynamics.
In 1843, James Joule deduced the first law of thermodynamics from his experiments on heat and friction, in which he developed the concept of energy.
However, his framework could not quantify the effects of friction or dissipation.
So it was in the 1850s and 1860s that Rudolf Clausius first spoke of entropy. He chose this word from the Greek en + tropē, meaning “transformation content” (“Verwandlungsinhalt”).
He proposed a summary of the first and second laws of thermodynamics, thereby introducing the first concept of entropy, proposing that: - The energy of the universe is constant.
- The entropy of the universe tends to a maximum.
But before arriving at the best-known name, Clausius presented for the first time the mathematical formulation of entropy, which he then called “equivalence-value”.
The description Clausius finally gave for entropy (1865) was: the dissipative energy use of a thermodynamic system during a change of state.
Ludwig Boltzmann expanded this concept in 1877 thanks to his definition, where entropy is a measure of the number of possible microscopic states (or microstates*) of a system in thermodynamic equilibrium, consistent with its macroscopic thermodynamic properties (or macrostate*).
*(A microstate is a specific microscopic configuration of a thermodynamic system that the system may occupy, with a certain probability, in the course of its thermal fluctuations. In contrast, the macrostate of a system refers to its macroscopic properties, such as its temperature and pressure.) So, according to his formula, the entropy of a system can be measured by S = k_B·ln(M), where M is the number of microstates, which is the formula used in this labwork.
Moreover, Josiah Willard Gibbs (together with Boltzmann and Maxwell) created statistical mechanics (a term he coined), explaining the laws of thermodynamics as consequences of the statistical properties of large ensembles of particles.
He also introduced the idea of expressing the internal energy U of a system in terms of the entropy S, in addition to the usual state-variables of volume V, pressure p, and temperature T.
Finally, it is important to mention that it was Gibbs who first combined the first and second laws of thermodynamics by expressing the infinitesimal change in the energy of a system in the form dU = T·dS − p·dV.