
Entropy



date of creation: 20240724 - date of update: 20240724 - generation date: 20240908_053307

1.Entropy

Entropy characterizes the complexity of a system, i.e. the amount of information that is necessary to describe it (to resolve its state).

We consider to begin with a closed system, that is to say a system which exchanges neither energy (mechanical, thermal, radiation) nor matter with its environment.

Entropy is linked to the notion of disorder: the more information is needed to describe a system, the more disordered it may appear to us.

In fact, entropy is a concept of physics linked to the precise measurement of the notions of order or disorder.

Entropy also measures the capacity of a closed system to evolve: a system with usable energy can evolve precisely because its entropy is low.

If you have a team of horses that pull in all directions, you cannot move forward because there is no order.

If you discipline your horses then their ordered energy becomes usable to advance the team.

This example shows that energy (the strength that the horses can mobilize to perform a certain amount of work) is not enough, and that there is also a notion of order which is very important.

It is the concept of entropy that is linked to this order or disorder.

Energy is what allows evolution, change.

If a car's fuel tank is empty, it cannot go any further, unless it uses the gravitational potential energy it possesses or the kinetic energy it has accumulated on a slope, which will allow you, if you are lucky, to reach the next fuel pump.

If the entropy is low (orderly system) the system can evolve.

As it evolves, entropy increases, i.e. the energy is degraded (its order decreases) and the system becomes less able to evolve.

For example, the horses will consume the chemical energy contained in the food they have eaten, using oxygen from the air through respiration to produce ATP (Adénosine_triphosphate) molecules, which are the fuel of physiology.

By pulling the carriage, they consume this fuel to make the muscles (the engine) work.

When the energy of all the food has been consumed, along with the reserves stored in their muscles, their liver and their fat, the horses will be tired and will have to rest; if they do not eat again (closed system), they will not be able to continue pulling the carriage for long, that is, only until they run out of energy reserves.

To "consume" energy is in fact to increase entropy: strictly speaking, we never consume energy, because of the law of conservation of energy, which is the first principle of thermodynamics: in a closed system the energy is constant.

When energy is dissipated or degraded, its entropy, and therefore its disorder, increases (this is the second principle of thermodynamics).

In the example of the horses, the dung resulting from digestion is less ordered than the grass from which it is derived.

It is the energy of the low-entropy sun which, through photosynthesis, allows the grass to grow back using the organic matter of the dung.

Entropy is a concept initially introduced in macroscopic thermodynamics by Clausius (Rudolf_Clausius), and its deep meaning in terms of information was clarified much later in statistical mechanics by Boltzmann.

The second principle of thermodynamics says that in a closed system, entropy can only increase or, at the limit, remain constant.

Order and disorder are of fundamental importance in the branch of physics which deals with the laws governing physical systems composed of a very large number of entities (a gas, formed by all of its molecules, for example).

This branch of physics is called thermodynamics (Thermodynamique).

The large numbers make new properties, new concepts, new realities and experiences appear.

Entropy was then redefined by Shannon in the framework of the information theory where entropy is identified with the quantity of information.

(Information theory is the basis of computer science, so entropy must play an important role in this field; cf. entropie et informatique, Shannon nweb.)

1.1.Entropy and information

Entropy and information are strongly related concepts and can be considered as identical in mécanique_statistique_en.

Indeed, the more complex a system is, the greater its entropy and the more information is needed to describe it.
For example, the same quantity of matter in the form of gas or in the form of a crystal is not described with the same amount of information.

If the crystal is perfect (without vacancies, dislocations, etc.), then it is enough to specify the position of one atom of the crystal and the structure of the crystal lattice to know where all the atoms of the crystal are.

We therefore need very little information to describe the system.

In such a system the entropy is very low.
On the other hand, for the gas, as there is no bond between the atoms, they must be described individually if we want to know the exact state of the system.
The amount of information is enormous, linked to the Avogadro number (6.022 × 10^23), and the entropy is very large.

It was then shown that Shannon's definition and the thermodynamic definition were equivalent.
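As a minimal sketch of this equivalence (in Python; not part of the original text): for W equiprobable microstates, Shannon's formula H = -Σ p·log(p) reduces to log(W), which is Boltzmann's S = k_B·ln(W) up to the constant factor k_B·ln(2).

    import math

    def shannon_entropy(probabilities):
        # Shannon entropy H = -sum(p * log2(p)), in bits
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    W = 1024                                # number of equiprobable microstates
    H = shannon_entropy([1.0 / W] * W)      # Shannon: H = log2(W) = 10 bits
    S_over_kB = math.log(W)                 # Boltzmann: S / k_B = ln(W)

    print(H)                                # 10.0
    print(S_over_kB, H * math.log(2))       # equal: same quantity up to the k_B ln(2) factor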

To complete the picture, just as a remark, it should be mentioned that following the work of physicists Bekenstein and Hawking a new form of entropy has appeared in the dynamics of black holes.

This entropy led to the holographic principle of 't Hooft.

This principle considers that the smallest unit of physical information is an area the size of the Planck length squared (the Planck area).

The Planck length is the smallest physical length, below which the notion of length loses its meaning (quantum uncertainty).

One could thus say that the Planck length is the archetypal length and that two points of the physical space immediately adjacent are separated by this length.

There can be no physical space points between these two points.

One can of course conceive of lengths smaller than the Planck length, but they are no longer physical lengths; they are abstract, mathematical lengths.

1.2.Entropy is related to the observer

In the mathematical formula of entropy there are two terms that tend to cancel each other out; we can say that entropy is the logarithme_en of a ratio between two opposing notions. These two terms represent, on the one hand, the capacity of perception of the observer and, on the other hand, the complexity of the perceived system.

  • The complexity of the system is expressed by the quantity of what there is to perceive: it is the number of possible states of the system (equiprobable states: the system can be in any of these states).

  • The capacity of perception is the resolution of our perception, the precision of our observation.

    Its finiteness implies a relative indistinguishability between states that are not too different.

    The greater the number of indistinguishable states, the greater the disorder: we will not be able to "resolve" the real state, since our capacity of perception is limited.

    The greater our capacity of perception, the more we will be able to discriminate the possible states of the system.

    So we will have more information about the system and we will consider it more organized.

    The important point is that the observer must be taken into account.

    The order of the system is therefore dependent on the observer.

    In physics we always consider the position of the observer in relation to the observed phenomenon.

    In mechanics this position is defined by the distinction between the observer's frame of reference and the frame of reference of the system itself and their relationship.

    1.2.1.Statistical thermodynamics or mécanique_statistique_en

    In statistical thermodynamics the position of the observer is not a position in space but rather a position in the scales of magnitude.

    The human observer is positioned at the scale of the meter, while for a gas, which is the main object of statistical thermodynamics, the system is composed of atoms whose quantum structure is located at the atomic scale, that is to say a little below the nanometric scale (10^-10 m, ten to the power of minus ten meters).

    Let's consider a simple example that is on a scale close to the human observer, i.e. between the centimeter and the decimeter: objects on a desk.

    The person working on this desk has a given level of information about the state of his or her desk.

    The office can be modeled by a set of boxes that allow an object to be located.

    The objects are stored in boxes.

    If there are N boxes and N objects with one object per box, then there are N! (factorial of N) possible states for the system, each object being able to occupy each of the boxes.

    In fact, to store the first object we have N possible boxes, but for the second only N-1 boxes, etc.

    So the number of ways of arranging the N objects in the N boxes is N × (N-1) × (N-2) × ... × 2 × 1 = N!, i.e. the factorial of N.
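    A small illustrative sketch of this counting argument (in Python; the function name is only for illustration):

        import math

        def arrangements(n):
            # n choices for the first object, n-1 for the second, ... 1 for the last
            count = 1
            for choices in range(n, 0, -1):
                count *= choices
            return count

        for n in (3, 5, 10):
            print(n, arrangements(n), math.factorial(n))   # both columns give n!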

    Let's consider a particular state of office storage.

    Let's suppose that the person knows perfectly the state of the office, that is to say that he knows where to find each object by a direct access to the box which contains it.

    So in this case we can say that for this person the office is totally tidy, the order is perfect and therefore the disorder (or entropy) is zero.

    Suppose another person has less knowledge of the state of the office.

    This person knows "about" where an object is but will have to make several attempts to actually find a given object.

    That is, she will have to spend some energy (degrade energy) and time to "reconstitute" the information that is missing.

    If she can find a given object in two or three tries, we can say that she does not have a perfect knowledge of the state of the system, or that for her the system is slightly disordered.

    For this person, the disorder or entropy is not zero.

    Let's assume a third person is a complete stranger to this office and therefore has no information about its status.

    To find an object, this person will have to open all the boxes one after another until the sought object is found.

    For this person the disorder or entropy is maximum.

    The entropy is then given by the factorial of N, N! which is a good representation of the complexity of the system, i.e. the complexity of understanding the system.

    The factorial is constructed as follows: at the first choice of a box (draw) there are N possibilities, at the second there are N-1 possibilities, at the third N-2, etc.

    So the complexity of the system can well be represented by the number N × (N-1) × (N-2) × ... × 3 × 2 × 1, which is precisely the factorial of N.

    And again in this example we assume that the observer is intelligent enough to be able to memorize (capitalize on its experience with the system) and not to search in a box that he has already opened and that did not contain the desired object.

    Otherwise the complexity of the system would be perceived by the observer as much greater.

    One could say that, for such an observer, the complete system would be in a state of confusion.

    To be exact, entropy is associated with the mathematical logarithm of the factorial.

    S = log(N!)

    This is because of the property of the logarithme_en: "The logarithm of a product is the sum of the logarithms of the two factors of the product." In terms of mathematical formulas we have: Log(a * b) = Log(a) + Log(b). If we double the size of a system, M = N * 2 (we take two offices, for example), then the entropies add linearly but the complexity increases exponentially.

    (See also: Modèle_des_urnes_d_Ehrenfest, Formule_de_Boltzmann.) We can notice, in the example of the office, that as soon as the person starts to open boxes, that is to say to interact with the system, she will acquire information about it which she will then be able to memorize.

    Consequently the entropy of the system will evolve for her but she will have to degrade energy elsewhere to be able to decrease this entropy.

    So it is a displacement of entropy because in the global system considered as closed the entropy can only increase (it is the second law of thermodynamics).

    In physical systems such as gases, observers who are beings characterized by a given type of perceptive organs (human beings for example) do not differ in their ability to perceive.

    This is what gives the laws of thermodynamics their objectivity.

    There is an illustration of this through the paradox of Maxwell's demon (démon de Maxwell).

    Maxwell imagined the following experiment: two boxes are placed against each other with a possible communication between them.

    At the beginning the communication is closed and one of the boxes is filled with a gas while the other is empty.

    If we open the communication, molecules will cross and end up in the other box.

    Statistical thermodynamics tells us that the most probable state for this system is the one where there is about the same number of molecules in both compartments.

    This is the state of thermodynamic equilibrium.

    It should be noted that the molecules are not subjected to any force that causes them to pass into the second box.

    A molecule can as easily go from the first box to the second as the opposite and this is what happens at any time.

    This is why the molecules are evenly distributed between the two compartments.

    If at a given moment there are more molecules in one compartment, then the probability that molecules move to the other box becomes greater too, hence the equilibrium.
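    A minimal simulation sketch of this equilibration (an Ehrenfest-urn-style toy model in Python, not part of the original text): at each step a randomly chosen molecule changes box, so the fuller box is the more likely to lose one, and the counts drift toward half and half.

        import random

        def simulate(n_molecules=1000, n_steps=20000, seed=0):
            # Ehrenfest-urn style: at each step a molecule chosen at random changes box.
            random.seed(seed)
            in_left = n_molecules                    # start with every molecule in the left box
            for _ in range(n_steps):
                # The fuller box is the more likely to lose a molecule, hence the drift to balance.
                if random.random() < in_left / n_molecules:
                    in_left -= 1
                else:
                    in_left += 1
            return in_left

        print(simulate())                            # ends near 500, i.e. about half the molecules per box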

    Maxwell's demon paradox is the idea that a small and extremely fast demon would have the ability to close the communication and would choose to open it only when there are more molecules coming to cross in one direction than in the other.

    This demon would create a dissymmetry in the distribution of molecules and would be able to empty one box to fill the other.

    This demon knows how to discriminate the exact state of the system because it works at the microscopic level of the molecules.

    The resolution of such a paradox is that Maxwell's demon cannot exist.

    In the case of thermodynamic physics this is indeed the case, but for much less complex systems one can very well imagine different capacities of perception, as in the example of the office.

    Because of the above, in thermodynamics we neglect the "information perception capacity" aspect and focus mainly on the "quantity of information".

    But it is an approximation.

    This approximation is no longer valid if we are dealing with a system that can present differences in perception for human beings.

    The notion of entropy appeared around the particular context of a physical system that dissipates energy to reach the thermodynamic equilibrium where entropy is at its maximum.

    For example, an enclosure thermally insulated from the outside, containing two compartments of equal size, in contact, filled with water where the temperature of the water is 0 °C for one compartment and 100 °C for the other, will evolve naturally, by virtue of the second principle of thermodynamics, towards a situation where the temperature in both tanks is 50 °C.
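    A quick check of the 50 °C figure (a sketch in Python, assuming equal masses of water, a constant heat capacity and no losses to the outside):

        def equilibrium_temperature(m1, t1, m2, t2):
            # Heat lost by the hot water equals heat gained by the cold water (no losses).
            return (m1 * t1 + m2 * t2) / (m1 + m2)

        print(equilibrium_temperature(1.0, 0.0, 1.0, 100.0))   # 50.0 (°C): equal masses give the average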

    Thermal energy is the mechanical energy of molecules.

    The molecules exchange energy via collisions with the wall which separates the two tanks, with the same mechanism as before, and we thus arrive at macroscopic uniformity.

    This last state is stable and, if no external energy intervenes, there will be no evolution of the system (provided we consider the system perfectly sterile from a biological point of view, because life could exploit the thermal energy contained in the enclosure to organize matter).

    In the 1960s, Ilya Prigogine, who received the Nobel Prize in 1977 for his work, was interested in what happens to a system when it is permanently maintained far from the state of thermodynamic equilibrium by a permanent flow of energy.

    He then observed that the dissipation of energy causes the appearance of order in matter, which he named dissipative structures.

    Dissipative structures are natural structures that possess the property of self-organization: "Self-organization is a phenomenon of increasing order, going in the opposite direction of the increase of entropy (or disorder, symbol S), at the cost of energy dissipation that is used to maintain this structure.

    It is a tendency of physical processes or living organisms, as well as social systems, to organize themselves; we also speak of self-assembly.

    After a critical threshold of complexity, systems can change state, or go from an unstable phase to a stable phase."

    The fundamental law of such systems can be summarized in a simple equation, d²S = 0, established by the Nobel Prize winner in Physics Ilya Prigogine, which means that a self-organizing system evolves by creating a minimum of disorder as its complexity increases.

    For a living organism, failing to respect this law results in a return to thermodynamic equilibrium, which for it means death.

    1.3.analyse perception et entropie: study of perception modeling

    In computer systems the level of complexity is low enough that the perception of the state of the system can differ from one observer to another.

  • for the first person, an observation or interaction with the system results in the direct opening of the box that contains the sought object.

    The probability of finding the object is total, i.e. 1.

    The resolution with which the person solves the state of the system is as fine as possible, i.e. the person is able to discern for sure the box he is looking for.

    We can represent the resolution by the ratio 1 / N .

    1 for the number of trials necessary to find the object to search for, i.e. to solve the state of the system.

    N for system complexity.

    We can represent the precision as the inverse of the resolution.

    The precision will be here N / 1 = N .

  • for the second person who has to open say n=5 boxes on average to find the object, the resolution will be 5 / N and the accuracy N / 5.

  • The accuracy can be defined by the ratio between the complexity of the system itself, which is N, and the complexity of the system as seen by the observer, here n=5, since 5 attempts (measurements) are needed to resolve the state of the system (that is, to make the real state of the system and the observer's mental model of it identical).

  • Accuracy thus appears as the capacity of perception, and it becomes more significant as the system grows: making 5 attempts to find an object among 6 is not very efficient (low probability), while for 1 million objects it is extremely efficient (probability close to 1).

  • the probability of finding the object is the inverse of the average number of attempts to find the object.

    It is 1 for the person who knows the system completely, 1/5 in the case of the second person, and 1/N in the case of the person who is completely unaware of the system status.

    nbr of boxes   precision (nbr/5)   probability ((nbr-5)/nbr)
    6              1.2                 0.166667
    7              1.4                 0.285714
    8              1.6                 0.375000
    9              1.8                 0.444444
    10             2.0                 0.500000
    15             3.0                 0.666667
    20             4.0                 0.750000
    30             6.0                 0.833333
    50             10.0                0.900000
    100            20.0                0.950000
    1000           200.0               0.995000
    10000          2000.0              0.999500
    1000000        200000.0            0.999995

    This table illustrates that the greater the precision, the higher the probability of finding the object.
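    A short Python sketch that regenerates this table, assuming the formulas given in the column headers (precision = nbr/5 and probability = (nbr-5)/nbr, for an observer who needs about 5 attempts):

        n_attempts = 5                               # the second observer needs about 5 attempts

        print("nbr of boxes   precision (nbr/5)   probability ((nbr-5)/nbr)")
        for n_boxes in (6, 7, 8, 9, 10, 15, 20, 30, 50, 100, 1000, 10000, 1000000):
            precision = n_boxes / n_attempts                   # complexity relative to the attempts needed
            probability = (n_boxes - n_attempts) / n_boxes     # how efficient 5 attempts are among n_boxes
            print(f"{n_boxes:<15}{precision:<20}{probability:.6f}")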

    The most general formula defining the knowledge (order or disorder) that we have of a system is:

    Information = Number of states / Accuracy

    Entropy is a logarithme_en of this quantity (I haven't fully worked this out yet):

    Entropy = Log(Disorder) = Log(Number of states) - Log(Accuracy)

    We can see that with a perfect balance between the capacity of perception and what we give ourselves to perceive, the entropy is zero.
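    A minimal numerical sketch of this balance (in Python), assuming the definitions above (entropy as the difference between the logarithm of the number of states and the logarithm of the accuracy):

        import math

        def entropy(n_states, accuracy):
            # Entropy = Log(Number of states) - Log(Accuracy), per the formula above
            return math.log(n_states) - math.log(accuracy)

        N = 1000
        print(entropy(N, N))        # perfect perception: accuracy equals the number of states -> 0
        print(entropy(N, N / 5))    # the observer who needs 5 attempts: log(5)
        print(entropy(N, 1))        # no knowledge at all: log(N), the maximum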

    1.4.Entropy of a text

  • do a search on textual entropy (conference of Raphael_Bousso_en, The_World_as_a_Hologram_en_nweb_en)

  • define the textual entropy from the statistical (Boltzmann) entropy; see the sketch after this list

  • text link vs. chemical bond analogy (entropy of a crystal vs. a gas):

    gas = text without links

    crystal = text with links
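    As a possible starting point for this study (a sketch only; the character-level measure is an assumption, not a definition from the text), the Shannon entropy of a text can be computed in Python from the frequencies of its symbols:

        import math
        from collections import Counter

        def text_entropy(text):
            # character-level Shannon entropy, in bits per character
            counts = Counter(text)
            total = len(text)
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        print(text_entropy("aaaaaaaa"))                # 0.0 : perfectly ordered, "crystal"-like text
        print(text_entropy("abcdefgh"))                # 3.0 : every symbol different, "gas"-like text
        print(text_entropy("entropy and information")) # intermediate value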

    1.5.Negentropy (Neguentropie)

    1.5.1.Steric effect

    The steric effect in chemistry is the fact that the spatial crowding of a molecule can prevent an otherwise expected chemical reaction from taking place, because the molecules cannot get close enough to each other to react (an effect ultimately due to the Pauli exclusion principle).

    1.5.2.Negentropy and the steric effect

    A steric effect discriminating between hydrogen and other, larger chemical compounds could be used to separate hydrogen from its compounds.

    A nanoscale device could be used for this.

    The percolation of a chemical compound through such a nanometric structure could transform macroscopic pressure energy into microscopic chemical energy. Molecular agitation would be used in the same way as the shaking applied to a sieve to make the finer elements pass through.

    The hydrogen atoms could then recombine by covalent bonding to form hydrogen gas.