# The Entropy in the Ensemble of Coupled Pigments

As briefly mentioned in the last chapter, the rate equation formalism, e.g. as given by eq. 4 and 38 (see chap. 1.3, 1.4 and 2.2), delivers a thermodynamic approach to excited states that migrate in systems of coupled pigments. In the equilibrated case the probabilities of the excited state populations follow the Boltzmann distribution:

$$N_i = \frac{e^{-E_i/k_B T}}{\sum_j e^{-E_j/k_B T}}$$

In the dissipative (nonequilibrium) situation the solution of eq. 38 for the time dependent excited state population can be used to estimate the entropy dynamics:

$$S(t) = -k_B \sum_i N_i(t) \ln N_i(t) \qquad (72)$$

In the following, eq. 72 will be briefly evaluated. For that purpose we choose the simplest system in this context, i.e. two coupled states as shown in Figure 72. In the initial moment only the energetically higher state is excited, and after excitation the system relaxes. For the sake of simplicity we assume that the excited states cannot decay into the ground state. This is a good approximation for strongly coupled systems where the energy transfer processes and the thermal equilibration occur much faster than the excited state relaxation.

**Figure 72.** Two coupled excited states which are separated by ΔE = 10 meV. The energy transfer from state one to state two has a probability of (2 ps)^{-1}. The back transfer probability follows the Boltzmann distribution.

With the formalism described in eq. 39 in chap. 2.2 one can calculate the excited state population of the system shown in Figure 72. Then eq. 72 can be used to calculate the time dependent entropy of this system. While a numerical solution of the problem is possible independently of the complexity of the coupled system, the problem given in Figure 72 can be solved analytically. With the forward rate k_{12} = (2 ps)^{-1}, the back transfer rate k_{21} = k_{12} e^{-ΔE/k_B T} and the initial condition N_1(0) = 1 the result reads:

$$N_1(t) = \frac{k_{21} + k_{12}\,e^{-(k_{12}+k_{21})t}}{k_{12}+k_{21}}, \qquad N_2(t) = 1 - N_1(t)$$
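This analytic solution can be spot-checked numerically. The sketch below is a minimal illustration (constant names such as `K12` and the helper `populations` are ours); it takes ΔE = 10 meV and the (2 ps)^{-1} transfer rate from Figure 72 and verifies the initial condition N_1(0) = 1 as well as the Boltzmann ratio of the equilibrated populations:

```python
import math

K_B = 8.617333e-5   # Boltzmann constant in eV/K
DE = 0.010          # energy gap Delta E = 10 meV (Figure 72)
T = 300.0           # temperature in K
K12 = 0.5           # forward transfer rate (2 ps)^-1, in 1/ps
K21 = K12 * math.exp(-DE / (K_B * T))  # detailed-balance back transfer

def populations(t_ps):
    """Analytic populations of the two-state system with N1(0) = 1."""
    k = K12 + K21
    n1 = (K21 + K12 * math.exp(-k * t_ps)) / k
    return n1, 1.0 - n1

# initial condition: all population in the upper state
n1_0, n2_0 = populations(0.0)
# long-time limit: ratio N1/N2 should equal the Boltzmann factor
n1_inf, n2_inf = populations(1000.0)
print(n1_0, n2_0)
print(n1_inf / n2_inf, math.exp(-DE / (K_B * T)))
```

At 300 K the equilibrated populations are close to each other (ΔE is comparable to k_B T), which is why the entropy loss after the maximum is small at physiological temperatures.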

For that system the entropy according to eq. 72 takes the form

$$S(t) = -k_B\left[N_1(t)\ln N_1(t) + N_2(t)\ln N_2(t)\right]$$

This entropy function has a maximum of S^{max} = k_B ln 2 for the equal population probability N_1(t) = N_2(t) = 1/2. The time t^{max} when this maximum is reached follows from N_1(t^{max}) = 1/2:

$$t^{max} = \frac{1}{k_{12}+k_{21}}\,\ln\!\left(\frac{2\,k_{12}}{k_{12}-k_{21}}\right)$$
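The position of the entropy maximum can be verified numerically. The following sketch (variable names are ours; it assumes the analytic two-state populations with a detailed-balance back transfer rate) scans S(t)/k_B on a fine time grid and compares the numerical maximum with t^{max} obtained from the condition N_1(t^{max}) = 1/2:

```python
import math

K_B = 8.617333e-5          # Boltzmann constant, eV/K
DE, T = 0.010, 300.0       # 10 meV gap, room temperature
K12 = 0.5                  # forward rate (2 ps)^-1, in 1/ps
K21 = K12 * math.exp(-DE / (K_B * T))
K = K12 + K21

def entropy(t_ps):
    """Gibbs entropy S(t)/k_B of the analytic two-state populations."""
    n1 = (K21 + K12 * math.exp(-K * t_ps)) / K
    n2 = 1.0 - n1
    return -(n1 * math.log(n1) + n2 * math.log(n2))

# analytic time of maximal entropy from N1(t_max) = 1/2
t_max = math.log(2.0 * K12 / (K12 - K21)) / K

# scan S(t) on a fine grid and locate the numerical maximum
grid = [i * 0.001 for i in range(1, 20000)]   # 0.001 .. 20 ps
t_num = max(grid, key=entropy)

print(t_max, t_num, entropy(t_max))
```

At 300 K the maximum sits at roughly 2 ps, i.e. on the time scale of the inverse transfer probability, and S(t^{max})/k_B equals ln 2.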

For ΔE → ∞ or T = 0 we get

$$t^{max} = \frac{\ln 2}{k_{12}}$$

While for ΔE = 0 or T → ∞ the entropy maximum is coincident with the equilibrium (N_1(t) = N_2(t) for t → t^{max} → ∞). That means that S(t) exhibits a local maximum for all cases where ΔE > 0 ∧ T < ∞, which is the case for all relaxations that correspond to time-directed (dissipative) processes.

The time dependent excited state populations N_1(t) and N_2(t) calculated according to eq. 39 for 300 K and the entropy curves S(t) at different temperatures are shown in Figure 73.

Figure 73. Time dependent population at 300 K of N_1(t) (red curve, left side) and N_2(t) (green curve, left side), and the entropy of the system shown in Figure 72 calculated according to eq. 72 (right side) at 10,000 K (red curve), 300 K (green curve), 100 K (yellow curve), 70 K (light blue curve), 30 K (dark blue curve) and 1 K (magenta curve).

As expected for the high temperature limit, the entropy rises monotonically to the maximum, which is reached at the equilibrium of the system (Figure 73, right side, red curve for 10,000 K).

The situation is different at lower temperatures. At room temperature (Figure 73, right side, green curve for 300 K) the system entropy quickly reaches its maximum on a time scale close to the inverse transition probability (about 2 ps) but afterwards decays to a somewhat lower level.

For very low temperatures the entropy approaches $\lim_{t \to \infty} S(t) = 0$.
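These limits of the relaxed entropy S^{inf} = S(t → ∞) follow directly from the equilibrium populations. A minimal sketch (function and variable names are ours) evaluates S^{inf}/k_B at the temperatures used in Figure 73; it approaches 0 for k_B T << ΔE and ln 2 for k_B T >> ΔE:

```python
import math

K_B = 8.617333e-5   # Boltzmann constant, eV/K
DE = 0.010          # energy gap, 10 meV (Figure 72)

def s_inf(temp):
    """Equilibrium (t -> infinity) entropy S/k_B of the two-state system."""
    x = math.exp(-DE / (K_B * temp))     # Boltzmann factor
    n1 = x / (1.0 + x)                   # upper state population
    n2 = 1.0 / (1.0 + x)                 # lower state population
    return -(n1 * math.log(n1) + n2 * math.log(n2))

for temp in (1.0, 30.0, 70.0, 100.0, 300.0, 10000.0):
    print(temp, s_inf(temp))
```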

The difference between the maximal entropy S^{max} and the entropy S(t → ∞) := S^{inf} in dependence on the temperature is shown in Figure 74, left side. While this difference is rather small at physiological temperatures, the relaxation to the thermal equilibrium leads to a strong reduction of the system entropy in comparison to S^{max} at lower temperatures. The system generally goes through an entropy maximum, and afterwards the entropy decays significantly to S^{inf} < S^{max} if k_B T < ΔE. Therefore a local entropy maximum is observed if ΔE > k_B T.

Figure 74. Left side: temperature dependent difference of the maximal entropy S^{max} = k_B ln 2 and the entropy after full relaxation S^{inf} calculated for the system shown in Figure 72. Right side: schematic cartoon of how photons and phonons transfer entropy to the local environment during relaxation of the system shown in Figure 72.

At first glance the local maximum of the entropy function at temperatures ΔE > k_B T might look as if the system given in Figure 72 could violate the second law of thermodynamics. But such a violation is surely not the case. In fact it should be kept in mind that a relaxation of the pure isolated system as given in Figure 72 could not occur if there did not exist surrounding states that are able to dissipate ΔE.

The environment effectively takes up the energy

$$\Delta E \cdot N_2(t \to \infty) = \frac{\Delta E}{1 + e^{-\Delta E / k_B T}}$$

during the dissipation process. This leads to a rise of environmental entropy that is bigger than the reduction of the isolated system's entropy during the transition S^{max} → S^{inf}. This preserves the second law of thermodynamics, and it becomes clear that the system complexity could never arise if the system were not able to interact with its environment. There is no violation of the laws of thermodynamics. Jennings mentioned that if we analyze a photosynthetic system from the lowest energy limit, we could observe a violation of the second law of thermodynamics (Jennings et al., 2005, 2006, 2007).
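This entropy bookkeeping can be illustrated numerically. The sketch below is our own construction: it assumes that the dissipated energy per system is ΔE · N_2(t → ∞) and that the environment absorbs it as heat at temperature T, and it compares the resulting environmental entropy gain with the system entropy drop S^{max} - S^{inf}:

```python
import math

K_B = 8.617333e-5   # Boltzmann constant, eV/K
DE = 0.010          # energy gap, 10 meV (Figure 72)
S_MAX = math.log(2) # S^max / k_B

def balance(temp):
    """Return (environmental entropy gain, system entropy drop), in k_B."""
    x = math.exp(-DE / (K_B * temp))
    # energy handed to the environment during full relaxation (assumption)
    e_diss = DE / (1.0 + x)          # Delta E * N2(infinity)
    ds_env = e_diss / (K_B * temp)   # heat / T, in units of k_B
    n1, n2 = x / (1.0 + x), 1.0 / (1.0 + x)
    s_inf = -(n1 * math.log(n1) + n2 * math.log(n2))
    return ds_env, S_MAX - s_inf

for temp in (30.0, 100.0, 300.0):
    gain, drop = balance(temp)
    print(temp, gain, drop, gain > drop)
```

Under these assumptions the environmental gain exceeds the system's entropy reduction at every temperature checked, consistent with the second law.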

This might be true for a single photon of 680 nm wavelength that is absorbed by a plant and drives a single quantum process in the photosynthetic nanomachine. But this is neither the continuous reality of a growing plant, nor is it a process that is forbidden by the thermodynamic laws, as long as it counts for a single absorbed photon only. Such a single event can transiently violate the second law of thermodynamics according to the Jarzynski equality, which denotes the probability for a trajectory violating the second law of thermodynamics, similar to eq. 8 (see e.g. Crooks, 1998, and references therein for details). The second law of thermodynamics is a purely statistical interpretation of ensembles. It is not a law that can be applied to a single quantum process. As denoted by the Jarzynski equality, a single molecular process is allowed to violate the second law of thermodynamics in a transient way. In the time average or the ensemble average the laws of thermodynamics hold.
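The Jarzynski equality itself can be illustrated with a toy model that is entirely our own construction (not taken from the text): a two-state system equilibrated at 300 K whose level energies (arbitrary example values) are switched instantaneously. Individual trajectories may perform work W < ΔF, i.e. transiently "violate" the second law, yet the exponential work average reproduces e^{-ΔF/k_B T} exactly:

```python
import math

K_B = 8.617333e-5       # Boltzmann constant, eV/K
T = 300.0
BETA = 1.0 / (K_B * T)

E_OLD = [0.010, 0.0]    # initial level energies in eV (toy values)
E_NEW = [0.0, 0.015]    # energies after an instantaneous switch

def partition(levels):
    return sum(math.exp(-BETA * e) for e in levels)

z_old, z_new = partition(E_OLD), partition(E_NEW)
delta_f = -math.log(z_new / z_old) / BETA   # free energy difference

# exact exponential work average over the equilibrium start distribution;
# a trajectory starting in state i performs work W_i = E_new[i] - E_old[i]
avg = sum(math.exp(-BETA * eo) / z_old * math.exp(-BETA * (en - eo))
          for eo, en in zip(E_OLD, E_NEW))

print(avg, math.exp(-BETA * delta_f))   # the two values agree
```

Note that the trajectory starting in the upper state has W = -10 meV < ΔF, so second-law-violating single trajectories do occur, while the ensemble average satisfies the equality.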

The calculation of the entropy as given in eq. 72 for the nonequilibrium ensemble suggests a nonequilibrium partition function that would enable the calculation of all thermodynamic variables for a full nonequilibrium situation in systems that can generally be described by excited state probabilities. With eq. 73 all the nonequilibrium observables of the ensemble, like the time dependent temperature T(t), the time dependent Gibbs energy G(t) etc., can be calculated. However, we used the definition of the entropy as given in eq. 71, which is only valid in equilibrium. To achieve a nonequilibrium description, the expectation value of the energy and the "momentary" temperature T(t) would have to be treated time dependently: