Entropy and computing

Date of creation: 2024-07-03 · Date of update: 2024-07-03

1. Entropy and computing

Entropy (or disorder) measures the ability of a closed system to evolve.

If entropy is low (orderly system), the system can evolve. As it evolves, its entropy increases and it becomes less and less able to evolve.

"Consuming" energy actually increases entropy, because energy is never truly consumed: it is conserved, by virtue of the law of conservation of energy, which is the first principle of thermodynamics.

When energy is dissipated or degraded, entropy increases (this is the second principle of thermodynamics).

Entropy is a concept first introduced into macroscopic thermodynamics by Clausius; its meaning was clarified much later, in statistical mechanics, by Boltzmann.

The second principle of thermodynamics states that entropy can only increase or, at the limit, remain constant.

Entropy is a physics concept that gives a precise measure of the notions of order and disorder.

Order and disorder are of fundamental importance in physics when it deals with the laws governing a physical system composed of a very large number of entities (a gas and its molecules, for example).

This branch of physics is called thermodynamics.

Large numbers reveal new properties, new concepts, new realities and new experiences.

Entropy was later redefined by Shannon within the framework of information theory, where entropy is identified with the quantity of information.

Information theory is the basis of computer science.

Entropy and information are one and the same concept.

The more complex a system is, the greater its entropy, and the more information is needed to describe it.

For example, the same quantity of matter in gas form or in crystal form is not described by the same amount of information.

If the crystal is perfect (with no vacancies, dislocations, etc.), then you only need to specify the position of one atom and the structure of the crystal lattice to know where every atom in the crystal is located.

We therefore need very little information to describe the system.

In such a system, entropy is very low.

On the other hand, in the case of a gas, since there are no bonds fixing the atoms in place, they must be described individually if the exact state of the system is to be known. The amount of information is enormous, on the order of Avogadro's number, 6.022 × 10^23, and the entropy is very large.
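As a rough illustration of this difference, here is a sketch in Python; the counts of numbers per atom and per molecule are simplifying assumptions, not exact physics:

```python
N_AVOGADRO = 6.022e23   # molecules in one mole of gas

# Perfect crystal: one atom's position plus the lattice vectors are enough,
# whatever the size of the crystal.
crystal_numbers = 3 + 9                 # one (x, y, z) position + three lattice vectors

# Gas: every molecule must be described individually
# (say 3 position coordinates and 3 velocity components per molecule).
gas_numbers = 6 * N_AVOGADRO

print(f"crystal description: ~{crystal_numbers} numbers")
print(f"gas description:     ~{gas_numbers:.2e} numbers")
```

The point is only the ratio between the two descriptions, which is of the order of Avogadro's number itself.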

It was later demonstrated that Shannon's definition and that of thermodynamics are equivalent.

To complete the picture, it is worth mentioning that, following the work of the physicists Bekenstein and Hawking, a new form of entropy appeared in the dynamics of black holes.

This entropy led to 't Hooft's holographic principle.

The smallest unit of physical information is an area the size of the Planck length squared.

The Planck length is the smallest physical length, below which the notion of length loses its meaning (quantum uncertainty).
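For reference, the Bekenstein-Hawking formula behind this statement gives a black hole an entropy proportional to the area A of its event horizon measured in Planck areas: S = k_B A / (4 l_P^2), where l_P is the Planck length.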

In entropy there are two terms that tend to cancel each other out, because entropy is the logarithm of a ratio between two opposing notions: the observer's capacity to perceive, and the complexity of the perceived system.

  • The complexity of the system is expressed by the quantity of what there is to perceive, namely the number of possible states of the system (equiprobable states).

  • The capacity to perceive is the resolution of our perception, the precision of our observation.

    Its finiteness means that states that are not too different remain relatively indistinguishable from one another.

    The greater the number of indistinguishable states, the greater the disorder: we will not be able to "resolve" the actual state, since our capacity for perception is limited.

    The greater our perceptive capacity, the better we can discriminate between the possible states of the system.

    We will then have more "information" about the system, and we will consider it more organized (a small numerical sketch of this idea follows below).

    The order of the system therefore depends on who observes it.
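    One way to read this, sketched in Python below: the entropy left for a given observer is the logarithm of the number of microstates that observer cannot tell apart. The number of microstates and the coarseness of perception are arbitrary illustrative values.

    ```python
    import math

    W = 1_000_000   # number of possible (equiprobable) microstates of the system

    def entropy_for_observer(indistinguishable_states: int) -> float:
        """Residual entropy for an observer who cannot tell apart this many states."""
        return math.log(indistinguishable_states)

    print(entropy_for_observer(1))      # perfect resolution   -> 0: perfect order
    print(entropy_for_observer(1_000))  # coarse perception    -> ~6.9
    print(entropy_for_observer(W))      # no perception at all -> log(W), maximum disorder
    ```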

    Let's look at a simple example: an office.

    The person working in this office has a given level of information about the state of their office.

    The office can be modeled as a set of boxes that can be used to locate an object.

    Objects are stored in boxes.

    If there are N boxes and N objects, with one object per box, then there are N! (factorial of N) possible states for the system, each object being able to occupy each of the boxes.

    In fact, to arrange the first object we have N possible boxes, but for the second N-1 boxes etc.

    So the number of ways of arranging the N objects in the N boxes is N × (N−1) × (N−2) × … × 2 × 1 = N!, i.e. the factorial of N.
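    A minimal Python sketch of this counting argument (N = 10 boxes is an arbitrary choice):

    ```python
    import math

    N = 10  # boxes and objects

    # N choices for the first object, N-1 for the second, ... down to 1.
    arrangements = 1
    for remaining in range(N, 0, -1):
        arrangements *= remaining

    assert arrangements == math.factorial(N)
    print(arrangements)            # 3628800 possible states of the office
    print(math.log(arrangements))  # the associated entropy, log(N!)
    ```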

    Let's consider a particular state of office storage.

    Let's assume that this person knows the state of the office perfectly, i.e. they know where to find each object, with direct access to the box containing it.

    In this case, we can say that for this person the office is completely tidy: the order is perfect and the disorder (or entropy) is therefore zero.

    Suppose another person has less knowledge of the state of the office.

    This person knows roughly where an object is, but will have to make several attempts to actually find a given object.

    In other words, they will have to expend (degrade) some energy and some time to "reconstitute" the information they lack.

    If they can find a given object in two or three tries, we can say that they do not have perfect knowledge of the state of the system, or that, for them, the system is slightly disordered.

    For this person, disorder or entropy is not zero.

    Let's assume that a third person is completely unfamiliar with this office and therefore has no information about its state.

    To find an object, this person will have to open all the boxes one after the other until they find the object they are looking for.

    For this person, disorder or entropy is at its maximum.

    The entropy is then given by the factorial of N, N!, which is a good representation of the system's complexity, i.e. of how hard the system is to apprehend.

    The factorial is constructed as follows: for the first choice of a box (the first draw) there are N possibilities, for the second N−1 possibilities, for the third N−2, and so on.

    So the complexity of the system can indeed be represented by the number N × (N−1) × (N−2) × … × 3 × 2 × 1, which is precisely the factorial of N.

    Here again, this example assumes that the observer is intelligent enough to memorize (to capitalize on experience of the system) and not to search again in a box they have already opened and that did not contain the object they were looking for.

    Otherwise, the observer would perceive the complexity of the system as much greater.

    We could say that the complete system (observed system plus observer) would then be in a state of confusion.
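    A small simulation of this difference between an observer who memorizes and one who does not; the box count and number of trials are arbitrary:

    ```python
    import random

    def average_tries(n_boxes: int, with_memory: bool, trials: int = 10_000) -> float:
        """Average number of boxes opened before finding the object,
        for an observer who starts with no information."""
        total = 0
        for _ in range(trials):
            target = random.randrange(n_boxes)
            opened = set()
            tries = 0
            while True:
                if with_memory:
                    # never re-open a box already known to be empty
                    box = random.choice([b for b in range(n_boxes) if b not in opened])
                else:
                    box = random.randrange(n_boxes)  # "confusion": boxes may be re-opened
                opened.add(box)
                tries += 1
                if box == target:
                    break
            total += tries
        return total / trials

    print(average_tries(20, with_memory=True))   # about (N + 1) / 2 = 10.5 openings
    print(average_tries(20, with_memory=False))  # about N = 20 openings on average
    ```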

    An example of entropy in computer ergonomics: consider the Windows control panel.

    It suffers from significant ergonomic entropy, since its icons are traditionally presented in no obvious order on a rectangular surface.

    If you don't know exactly where the icon you are looking for is located, you will spend (degrade) a certain amount of energy to find it: more eye movement, more gaze time (retinal activity), more visual processing, and ultimately more consumption of ATP (adenosine triphosphate, the energy molecule) and of ionic potential.

    An effort has been made to classify the icons by category, but this is not totally convincing, as it creates an additional level of perception (further from the detail), and a category then represents several possible states of the object being sought: indistinguishability.

    An alphabetical sort on a key letter in the name of the object being sought would make the organization easier to memorize, and also more durable across successive versions (memory makes time less destructive).

    But this is still not available, even after more than twenty years of the product's existence.

    To be exact, entropy is associated with the mathematical logarithm of the factorial.

    S = log(N!)

    This follows from a property of the logarithm: "The logarithm of a product is the sum of the logarithms of the two factors of the product." As a formula: log(a × b) = log(a) + log(b). If we double the size of the system, M = 2 × N, by taking two offices for example, then the entropies add arithmetically (physical entropy is said to be an extensive property of the system), but the complexity increases factorially.

    The logarithm thus makes the link between the factorial notion of a system's complexity and entropy, which is an extensive, and therefore additive, physical quantity.
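    A short numerical check of this additivity; N = 50 boxes per office is an arbitrary choice, and math.lgamma(n + 1) is used to compute log(n!):

    ```python
    import math

    def office_entropy(n: int) -> float:
        """S = log(n!) for an office of n boxes."""
        return math.lgamma(n + 1)   # lgamma(n + 1) == log(n!)

    N = 50

    # Two independent offices of N boxes: complexities multiply, entropies add.
    complexity_two_offices = math.factorial(N) ** 2
    entropy_two_offices = office_entropy(N) + office_entropy(N)
    assert math.isclose(math.log(complexity_two_offices), entropy_two_offices)

    # A single office of 2N boxes is vastly more complex than two separate offices:
    print(math.factorial(2 * N) // math.factorial(N) ** 2)  # ratio (2N)! / (N!)^2, ~1e29
    print(office_entropy(2 * N) - entropy_two_offices)      # yet the entropy gap is only ~67
    ```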

    See also: the Ehrenfest urn model and Boltzmann's entropy formula.

    In the office example, we can see that as soon as the person starts opening boxes, i.e. interacting with the system, they acquire information about it, which they can then memorize.

    As a result, the entropy of the system evolves for them, but they will have had to degrade energy elsewhere in order to reduce this entropy.

    So it is an entropy shift, because in the global system, considered as closed, entropy can only increase (this is the second law of thermodynamics).

    In physical systems such as gases, observers, who are beings characterized by a given type of perceptive organs (human beings, for example), do not differ in their perceptive capacity.

    This is what gives the laws of thermodynamics their objectivity.

    This is illustrated by the paradox of "Maxwell's demon".

    Maxwell devised the following experiment: two boxes are placed side by side, with communication possible between them.

    Initially, the communication is closed and one of the boxes is filled with a gas while the other is empty.

    If we open the communication, molecules will pass through and end up in the other box.

    Statistical thermodynamics tells us that the most likely state for this system is one in which there are roughly the same number of molecules in the two compartments.

    This is the state of thermodynamic equilibrium.

    It should be pointed out that the molecules are not subjected to any force pushing them into the second box.

    A molecule can just as easily move from the first box to the second as the other way round, and that is what happens all the time.

    This is why the molecules end up evenly distributed between the two compartments.

    If at a given moment there are more molecules in one compartment, then the probability that molecules will move towards the other box also becomes greater, hence the equilibrium.
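    A minimal simulation of this relaxation towards equal populations, in the spirit of the Ehrenfest urn model mentioned above (the molecule count and number of steps are arbitrary):

    ```python
    import random

    def two_boxes(n_molecules: int = 1000, steps: int = 20_000) -> list[int]:
        """At each step one molecule, chosen at random, crosses the opening.
        Returns the population of the first box over time."""
        in_first_box = n_molecules     # initially, all the molecules are in the first box
        history = []
        for _ in range(steps):
            if random.randrange(n_molecules) < in_first_box:
                in_first_box -= 1      # the chosen molecule was in the first box
            else:
                in_first_box += 1      # it was in the second box
            history.append(in_first_box)
        return history

    h = two_boxes()
    print(h[0], h[-1])                 # drifts from ~1000 towards ~500
    print(sum(h[-5000:]) / 5000)       # long-run average close to n_molecules / 2
    ```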

    The paradox of Maxwell's demon is the idea that a small, extremely fast demon would have the ability to close the communication, and would choose to open it only when more molecules are about to pass in one direction than in the other.

    This demon would create an asymmetry in the distribution of the molecules and would therefore be able to empty one box in order to fill the other.

    This demon is able to discriminate the exact state of the system because it operates at the microscopic level of the molecules.

    The resolution of such a paradox is that Maxwell's demon cannot exist.

    In the case of thermodynamic physics this is indeed the case, but it is also possible to imagine different perception capabilities for much less complex systems, as in the office example.

    Because of the above, in thermodynamics we neglect the "capacity to perceive information" aspect and focus essentially on the quantity of information.

    But this is just an approximation.

    This approximation is no longer valid if we are dealing with a system that may be perceived differently by different human beings.

    And computer systems are a good example of such systems.

    The notion of entropy emerged in the specific context of a physical system that dissipates energy to reach thermodynamic equilibrium, where entropy is at its maximum.

    For example, an enclosure thermally insulated from the outside, containing two compartments of equal size in contact and filled with water, at 0 °C in one compartment and 100 °C in the other, will naturally evolve, by virtue of the second principle of thermodynamics, towards a situation where the temperature in both compartments is 50 °C.

    This final state is stable, and if no external energy intervenes the system will no longer evolve (assuming the system is perfectly sterile from a biological point of view, since life could exploit the thermal energy contained in the enclosure to organize matter).
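    A back-of-the-envelope check of this evolution, assuming one kilogram of water per compartment and a constant specific heat (values chosen purely for illustration):

    ```python
    import math

    m = 1.0          # kg of water in each compartment
    c = 4186.0       # J/(kg*K), specific heat of water, assumed constant
    T_cold, T_hot = 273.15, 373.15     # 0 degC and 100 degC, in kelvin
    T_final = (T_cold + T_hot) / 2     # 50 degC: energy is conserved

    # Entropy change of each compartment: dS = m * c * ln(T_final / T_initial)
    dS_cold = m * c * math.log(T_final / T_cold)   # > 0: the cold water warms up
    dS_hot = m * c * math.log(T_final / T_hot)     # < 0: the hot water cools down

    print(dS_cold + dS_hot)   # ~ +100 J/K: the total entropy has increased (second law)
    ```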

    In the 1960s, Ilya Prigogine, who was awarded the Nobel Prize in 1977 for this work, became interested in what happens to a system when it is kept permanently far from thermodynamic equilibrium by a continuous flow of energy.

    He then observed that the dissipation of energy led to the appearance of order in matter, which he called dissipative structures..

    Dissipative structures are natural structures that possess the property of self-organization: "Self-organization is a phenomenon of increasing order, in the opposite direction to the increase in entropy (or disorder, symbol S), at the cost of dissipating energy to maintain this structure.

    It is the tendency of physical processes and living organisms, as well as social systems, to organize themselves; we also speak of self-assembly.

    Once a critical complexity threshold has been passed, systems can change state, or move from an unstable to a stable phase."

    The fundamental law of such systems can be summed up in a simple equation, d²S = 0, established by the physics Nobel Prize winner Ilya Prigogine, which means that a self-organizing system creates a minimum of disorder as its complexity increases.

    For a living organism, non-compliance with this law means a return to thermodynamic equilibrium, which for it means death.

    The longevity of an IT system depends essentially on its ability to respond to needs, and therefore to evolve with them.

    If the system becomes too costly to maintain relative to the new services it provides, its existence is called into question and it is soon replaced by a more efficient system.

    The equation is simple: if the cost of maintenance becomes comparable to the cost of replacement, we choose to replace.

    In more atomic terms, we can say that if we spend as much energy searching for a piece of information as it would take to reconstruct it, then we have lost that information.

    As computer systems grow, if they do not respect the law of minimum entropy creation (put simply: if they do not limit redundancy) as their complexity increases, they become more difficult to maintain and eventually disappear, which is the equivalent of a return to thermodynamic equilibrium (the dissipative structure disappears).

2. Computer entropy

    Computer systems are an example of such structures: they are dissipative, since energy (and therefore a financial cost) is expended to establish and maintain them.

    If we look at the evolution of computing, we can see that the permanent trend is to reduce redundancy.

    Developments in the IT field can be cited as examples (a small code sketch follows the list below):

2.1. Reducing redundancy in IT

  • control and data structures in 3GLs (third-generation languages)
  • reusable function module libraries
  • object languages: inheritance, genericity, abstraction.

  • namespaces and isolation to organize redundancy (contexts).

  • the evolution from hierarchical and network database management systems to Codd's relational model, and more recent developments in object databases.

  • less fundamental but very useful: preprocessors (macro-definitions, macro-functions, includes), 4GLs (languages specialized for high-level objects, made less useful by the object approach), application parameterization (which is a form of genericity).

  • hypertext is in itself a means of reducing redundancy in a text, because enriching the text with links reduces the tendency towards repetition.
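    As announced above, here is a toy Python sketch of how inheritance and abstraction remove this kind of redundancy; the classes and their names are purely illustrative:

    ```python
    class Device:
        """Shared behaviour written once, instead of being copy-pasted per device type."""
        def __init__(self, name: str):
            self.name = name

        def describe(self) -> str:
            return f"{self.name}: {self.details()}"

        def details(self) -> str:
            return "generic device"

    class Printer(Device):
        def details(self) -> str:       # only what differs is rewritten
            return "prints documents"

    class Scanner(Device):
        def details(self) -> str:
            return "digitizes documents"

    for device in (Printer("HP"), Scanner("Canon")):
        print(device.describe())        # one implementation of describe(), reused by all
    ```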

2.2. An example of ergonomic entropy in computing

    An example of entropy in computer ergonomics: consider the Windows control panel.

    It suffers from significant ergonomic entropy, since its icons are traditionally presented in no particular order on a rectangular surface.

    If you don't know exactly where the icon you are looking for is located, you will spend (degrade) a certain amount of energy to find it: more eye movement, more gaze time (retinal activity), more visual processing, and ultimately more consumption of ATP (adenosine triphosphate) and of ionic potential.

    An effort has been made to classify the icons by category, but this is not conclusive, as it creates an additional level of perception (further from the detail), and a category then represents several possible states of the object being sought: indistinguishability.

    There is an alphabetical sort on the first letter of the icon's label, but this first letter is not necessarily representative of the function being sought, and may be redundant (for example "Centre de ...").

    Access to the Windows control panel itself also changes depending on the version! If you change versions or work with several versions (your personal computer, the Windows servers you connect to, etc.), this is inherently disruptive: confusion and wasted time.
