Purple Hearts

Thermodynamics disproves Darwin
Staff  

Introduction

Thermodynamics is the branch of physics that studies the laws governing the conversion of energy from one form to another and the availability of energy to do work. Statistical thermodynamics studies the same questions for systems composed of so many elements that they can be modeled only by statistical methods. Biological evolutionary theory is a set of hypotheses that tries to explain the origin of life and the origin of species by natural causes alone. What does thermodynamics have to do with evolutionary theory? How does entropy function in physical systems, and in biological systems? What relations exist between thermodynamics and information theory? Is entropy a concept concerning physical matter only, or information as well? Here we will try to draw some elementary considerations about these issues. Evolutionary theory makes two major claims: first, it supports the hypothesis of abiogenesis (the origin of life from matter alone); second, it supports the hypothesis of Darwinian macroevolution (the transformation of all species from a common ancestor) via random mutation and natural selection. We will try to explain simply how the laws of thermodynamics and entropy deny these major claims of evolutionary theory.

Thermodynamic entropy

According to thermodynamics, in a closed system no real physical process can end with as much useful energy as it started with: some is always wasted as heat. If we consider heat as a form of energy, then the first law of thermodynamics, the "law of conservation of energy", affirms that in a closed system energy remains constant through all the transformations it may undergo. The second law of thermodynamics (SLOT), also called the "principle of thermodynamic entropy" or "principle of growing entropy", states that in a closed system entropy can never decrease on average (it increases or remains constant). In practice entropy, which measures the degree of disorder, always increases globally at all levels of organization of the universe.

It might seem strange, but the prehistory of the discovery of entropic principles is related to probability theory. This branch of mathematics was born from the study of gambling games. A. De Moivre, trying to characterize a gambling game, introduced into probability theory the concept of the "average uncertainty" of a result, defined as the sum of the weighted uncertainties of the single results. The uncertainty of a result is defined as the natural logarithm of the inverse of the probability of that result occurring.
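De Moivre's "average uncertainty" can be computed directly. The sketch below (purely illustrative; the function names are our own) takes a probability distribution and returns the weighted sum of the uncertainties of the single results, in natural-log units:

```python
import math

def self_information(p):
    """Uncertainty of a single result: natural logarithm of 1/p."""
    return math.log(1.0 / p)

def average_uncertainty(probs):
    """De Moivre's 'average uncertainty': the probability-weighted sum
    of the uncertainties of the single results (natural-log units)."""
    return sum(p * self_information(p) for p in probs)

fair = average_uncertainty([0.5, 0.5])    # ln 2 ~ 0.693, the maximum
biased = average_uncertainty([0.9, 0.1])  # ~ 0.325, less uncertain
print(fair, biased)
```

A fair coin carries the maximum uncertainty for two outcomes; any bias lowers it, as the definition predicts.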

In an analogous way, the physicist L. Boltzmann defined in thermodynamics the entropy S of a statistical system (for example a gas) as S = k * log(W), where k is a multiplicative constant (the Boltzmann constant) and W is the number of possible microscopic configurations of the state. Given that W is the inverse of the probability (1/P) of the state occurring, Boltzmann's formula is quite similar to De Moivre's, apart from the multiplicative constant k. The formula is logarithmic because probabilities multiply while entropies add. Hence entropy is a measure of the probability that a given state of a gas occurs. In general, entropy measures the probability of a certain state of a system whose number of elements is so huge that it can be handled only by statistical methods. In the terminology of probability theory, entropy is related to the uncertainty of a result or state. SLOT asserts that a gas (or in general any complex physical system) always moves towards its more probable states, i.e. towards increasing entropy.
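Boltzmann's relation, and the remark that probabilities multiply while entropies add, can be sketched numerically (the configuration counts below are arbitrary):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant k, in joules per kelvin

def boltzmann_entropy(W):
    """Boltzmann's formula S = k * ln(W) for W possible configurations."""
    return K_B * math.log(W)

# Configuration counts multiply while entropies add, because of the log:
W1, W2 = 1000, 2000
print(math.isclose(boltzmann_entropy(W1 * W2),
                   boltzmann_entropy(W1) + boltzmann_entropy(W2)))  # True
```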

 

Consider two tanks, say A and B, containing a gas at two different temperatures. In these conditions the system has a certain degree of order; in fact it has the potential to produce work, thanks to the thermal difference. Information is at its maximum (two temperatures). If we let the two tanks communicate, the temperatures balance to an intermediate value because the slow molecules mix with the fast ones; the degree of disorder increases; it becomes impossible to extract work; information decreases (one single temperature).

At first physicists found it difficult to reconcile the deterministic, reversible viewpoint with the probabilistic, irreversible one. In fact, it was with the discovery of the laws of thermodynamics that the simple worldview of Newtonian physics, characterized by the deterministic and reversible dynamics of stable systems, went into crisis, leading to the conclusion that: «Also at the macroscopic level our prediction of the future merges determinism and probability together»[1]

Let us try to explain simply this transition of scientific conceptions, or better, this coexistence of conceptions, by returning to the example of the two containers of gas. First assume, as a limit case, that the containers hold only one molecule, characterized by an average speed. This molecule initially stays in container A, but since it moves it can pass into container B. Nothing prevents it, after some time, from returning to A, and so on many times. At each instant there is the same probability that the molecule is in A or in B. Its deterministic trajectory is calculable by means of the laws of Newtonian dynamics, and it is reversible. Now imagine instead that tank A contains a very large number of molecules with almost the same speed, and tank B contains none. After some time the molecules will be half in A and half in B. Their trajectories are not (humanly) calculable with classical dynamics; their state is describable only statistically, and it is irreversible. It is irreversible because it is almost impossible that all the molecules will return to the same single tank. We have passed from a deterministic description to a statistical one, to which SLOT applies: the law of irreversibility, the law of the so-called "arrow of time". Note that the phenomenon itself has not ceased to be deterministic. The indeterminism is only apparent: we are simply no longer able to calculate all the innumerable trajectories effectively, and must be satisfied with a probabilistic scenario.
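The two-tank scenario is essentially the classical Ehrenfest urn model, and a short simulation (a toy sketch; the parameters are arbitrary) shows the behaviour described: starting with all molecules in tank A, the population drifts to roughly half and half and, in any practical run, never returns to the all-in-A state:

```python
import random

def simulate(n_molecules, n_steps, seed=0):
    """Ehrenfest-style urn model: at each step pick one molecule at random
    and move it to the other tank. All molecules start in tank A.
    Returns the number of molecules in A after n_steps."""
    rng = random.Random(seed)
    in_a = [True] * n_molecules
    for _ in range(n_steps):
        i = rng.randrange(n_molecules)
        in_a[i] = not in_a[i]
    return sum(in_a)

# With one molecule the motion is trivially reversible (A or B, 50/50);
# with many molecules the count in A settles near n/2, and a spontaneous
# return to "all in A" becomes astronomically improbable.
final = simulate(n_molecules=1000, n_steps=100_000)
print(final)  # close to 500
```

The irreversibility is statistical, exactly as the text argues: no single hop is forbidden, but the half-and-half states overwhelmingly outnumber the ordered one.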

The irreversibility, instead, is real, because it does not depend on our limitations; it is based on the reality of things. Where has it come from? To find out we must ask what has changed. Only the number of molecules has changed: it has passed from one to a huge number. This shows that SLOT, the entropy law, is a direct consequence of a more general cosmological principle: a cosmic process is irreversible because it goes from unity to multiplicity.

 

Every process with growing entropy is irreversible. The consequences are several: heat passes from a warm body to a cold body and not vice versa; energy converts from more valuable forms to less valuable ones and not vice versa; order degrades into disorder irreversibly; everything wears out; information becomes corrupted; systems degenerate; and so on. We can say intuitively that ordered states are few while disordered states are many. In the example above of two communicating tanks, there is only one state in which all molecules are in a given tank, while there are many different states in which the molecules are half in one tank and half in the other. Hence, disorder being more likely, the tendency is unavoidably towards it rather than towards order. In mechanical machines of whatever type, i.e. systems with parts in motion, this produces heat and wear, both irreversible. Everyday life offers many examples of this drift towards disorder. The most banal is the nut that is unscrewed from its bolt by vibrations: once unscrewed, it does not return by itself to its place; human intervention is necessary.
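The claim that ordered states are few and disordered states are many can be counted exactly with binomial coefficients (a minimal sketch; n = 100 is an arbitrary choice):

```python
from math import comb

n = 100  # number of molecules distributed over tanks A and B

# Exactly one arrangement puts every molecule in tank A...
all_in_a = comb(n, 0)
# ...while the half-and-half macrostate is realized by an enormous
# number of distinct arrangements.
half_half = comb(n, n // 2)

print(all_in_a)   # 1
print(half_half)  # about 1.009e29 arrangements
```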

Maxwell's demon

J. C. Maxwell, the discoverer of the equations of electromagnetism, used the following famous thought experiment to illustrate the concept of entropy and its implications. Imagine again two communicating tanks (A and B) containing a gas at the same temperature and pressure. Imagine moreover that the wall between the two tanks is equipped with a small openable door managed by a "demon" whose function is this: when he sees a high-speed molecule arriving from A, he lets it pass into B by opening the gate; if instead a low-speed molecule is arriving from A, he stops it by closing the gate. Obviously, in the long run tank B will contain only high kinetic energy molecules (and therefore will be warmer) than tank A, which will contain only low-speed molecules (and therefore will be colder). Such a temperature difference might be used to produce work. Does Maxwell's demon therefore deny SLOT and manage to decrease entropy?

The answer is surely "it doesn't", for this reason: it is easy to see that such a demon is an ideal device carrying out a very complex job. To realize such a device in practice would mean designing a system that consumes energy and processes information. In fact it would be necessary to introduce between the two tanks a complex computerized system. If we calculated the algebraic sum of input and output energies for the system considered in its totality (tanks + device), we would see that the first and second laws of thermodynamics are not denied and that entropy does not decrease at all.
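A minimal sketch of the demon (our own toy model; the one-bit-per-molecule bookkeeping is a deliberate simplification) shows both sides of the argument: the sorting does produce a temperature difference, but only at the price of measuring, i.e. processing information about, every single molecule:

```python
import random

def demon_sort(speeds, threshold):
    """Toy Maxwell's demon: route fast molecules to tank B and slow ones
    to tank A. Each routing decision costs one bit of measurement."""
    tank_a, tank_b = [], []
    bits_used = 0
    for v in speeds:
        bits_used += 1  # the demon must measure every arriving molecule
        (tank_b if v > threshold else tank_a).append(v)
    return tank_a, tank_b, bits_used

rng = random.Random(42)
speeds = [rng.uniform(0.0, 2.0) for _ in range(10_000)]
a, b, bits = demon_sort(speeds, threshold=1.0)

mean = lambda xs: sum(xs) / len(xs)
print(mean(a) < mean(b), bits)  # the tanks end at different mean speeds,
                                # but only at a cost of 10,000 bits
```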

This thought experiment is worth citing because it allows us to introduce the concept of information into thermodynamics. The main function of the device would be to acquire and use information about the speeds of the molecules. Only in this way would it be possible to counterbalance entropy in the system of tanks. Maxwell's demon teaches us that only by processing information is it possible "to organize" a statistical system such as a gas. The "moral" of Maxwell's demon is therefore very important, because it links thermodynamics with information theory. As W. Dembski rightly says: «It is CSI that enables Maxwell's demon to outsmart a thermodynamic system tending toward thermal equilibrium»[2]

That can be generalized tout court: only information can organize any system. One then understands how deep the relations among all these concepts are. In fact both thermodynamic entropy and information are related to the probabilities of the states that a statistical system (or in general a system composed of many elements) can assume. Probability lies at the very base of both concepts, and this observation leads us directly to treat information as related to entropy.

Information and entropy

It is useful now to make a digression in order to outline the relations between physical entropy and information. Intuitively, information is what decreases uncertainty. Remember that entropy is related to growing uncertainty. Lack of information corresponds to disorder, and entropy expresses the disorder of a system; thus entropy can measure lack of information. Mathematically, the digital information content I, in bits, of a pattern (or event, or state, or sequence of symbols) is defined as I = log(1/P), where P is the probability of the pattern and "log" is the logarithm in base 2 (C. Shannon[3]). This formula is evidently similar to the formula of thermodynamic entropy, for both concepts are related to the probability of states in large event spaces. L. Brillouin expresses the relation between information and entropy by the formula ΔS − k · ln 2 · ΔI ≥ 0 (i.e. ΔS ≥ k · ln 2 · ΔI), a sort of generalized SLOT[4]. According to Algorithmic Information Theory (AIT), the algorithmic randomness K(d) of a string of data 'd' is the size of the most concise description of 'd' (or the minimal algorithm generating 'd'). So thermodynamics meets AIT in the following definition:

 

«Physical entropy S(d) = H(d) + K(d) is a sum of two contributions: (i) The algorithmic randomness K(d), and (ii) the information about the actual microstate which is still missing, in spite of the availability of [the already available relevant data about the system] ‘d', as measured by the Shannon conditional entropy H(d).»[5]
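Shannon's measure I = log2(1/P), cited above, is easy to verify numerically (a minimal sketch):

```python
import math

def info_bits(p):
    """Shannon information content of an event of probability p:
    I = log2(1/p), measured in bits."""
    return math.log2(1.0 / p)

# The rarer the event, the more information its occurrence carries:
print(info_bits(0.5))      # 1.0 bit  (a fair coin flip)
print(info_bits(1 / 256))  # 8.0 bits (one specific byte value)
```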

 

Some interesting consequences derive from this relation between information entropy and physical entropy. 

 
If we want to diminish entropy in a system we must input information.  Every time the system loses information the process is irreversible. 
«Entropy can be interpreted as lack of information in the considered system [...] Entropy is considered in a generalized manner as the expression of the disorder of a physical system. More precisely, we can say that entropy measures the lack of information on the effective structure of the system. [...] Lack of information corresponds to a real disorder [...] entropy is a measure of the lack of information [...] the more information, the less the entropy will be [...] it is then possible to decrease entropy in a system by increasing information»[6].

The concept of information and its entropy is closely related to the degree of organization in a system. As N. Wiener said: «The amount of information in a system is a measure of its degree of organization»[7]. The increase of entropy in a system always causes disorganization and degradation of its operations. Moreover, since biological systems are non-linear adaptive complex dynamic systems, the relation tying a change of entropy to its effects is not linear. In other words, the same thing happens as in mathematics and informatics, where a small error in the initial information may cause enormous damage in the effects.

According to SLOT, in a system energy degrades in an irreversible way, and, from what we said above, by the "law of information entropy", information and organization share the same destiny and degrade. Only by inputting information into a system from outside is it possible to counteract entropy and disorder for a while. For this reason, in data transmission theory engineers study methods for counteracting the degradation of digital and analog signals on the transmission media. Because of such degradation (of which entropy is a measure), caused by several factors, it is not guaranteed that information arrives correctly at the receiver. Ad hoc algorithms and coding techniques have been developed to increase the redundancy of information, i.e. to counteract entropy, and thus to make information arrive intact at its destination. Without these algorithms, which continuously regenerate signals, telecommunications would be impossible.
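The redundancy techniques mentioned here can be illustrated with the simplest possible error-correcting scheme, a triple repetition code with majority-vote decoding (a toy sketch; the 5% flip probability is arbitrary):

```python
import random

def encode(bits):
    """Repetition code: transmit every bit three times (added redundancy)."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three recovers the original bit."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

def noisy_channel(bits, flip_prob, rng):
    """Model signal degradation: each transmitted bit flips with flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

rng = random.Random(7)
message = [rng.randint(0, 1) for _ in range(1000)]
received = noisy_channel(encode(message), flip_prob=0.05, rng=rng)
decoded = decode(received)
errors = sum(m != d for m, d in zip(message, decoded))
print(errors)  # far fewer than the ~50 errors expected without coding
```

The redundancy does not create information; it spends extra transmitted bits to protect the information already present against the channel's degradation.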

Entropy in biology

We have seen above how entropy functions in science and technology. Now, what about entropy in the biological realm? Here we face a very strange situation. Concerning the origin-of-life problem, evolutionism, asserting that life arose spontaneously (abiogenesis), practically says that biological information, organization and order arose from nothing. That contradicts entropy. Moreover, Neo-Darwinism claims that species evolve continually from less complex forms to more complex forms. Again, that contradicts entropy. In other words, evolutionism states that information and organization arose from zero and kept growing, even unguided! This wrong worldview, in which information and complexity are supposed to come entirely for free, goes inexorably against the principles of thermodynamics in both their physical and informational aspects. It is like saying that heat passes spontaneously from a cold body to a warm one, or that removing information from a system increases its organization, or that putting a sound apple in a basket of rotten apples causes all the rotten apples to become sound. The laws of entropy establish beyond the shadow of a doubt that everything in nature, since the beginning, goes from order to disorder if intelligence is not involved. Evolutionism instead affirms that order arose from disorder and will continue to grow, indeed order at its maximum degree, the colossal organization of life. Its pretensions are in absolute contradiction with these laws. The defenses evolutionists attempt against the principles of entropy, lethal for their theory, are inevitably groundless. In what follows we will examine three of their "defensive arguments" (read: "errors") for escaping the inexorable entropic tendencies rebutting their hypotheses, and we will give suitable counter-arguments.

First error: energy can produce information. It cannot.

As a first attempt of defense (against the anti-evolution critique based on the principles of entropy), Darwinians note (for example Gell-Mann[8]) that organisms continuously receive energy from the sun, which would produce order and information.

Here there is confusion between the carrier, or support, of information and information itself, which are two very different things. For example: a telegraph line can work only thanks to electric power, but the telegrapher must introduce the message to be transmitted over the line. The telegrapher introduces the message, and the electric power carries out the function of transporting it.

In other words, we must distinguish between the concept of energetic order and the concept of information order. We have seen that, by thermodynamic entropy, energy degrades irreversibly in quality, converting from more ordered forms to less ordered forms. Heat is the most degraded energy, since molecular agitation is the most chaotic thing one can imagine. An electric current flowing in a conductor is a more valuable form of energy: it consists of electrons all going in the same direction. But this physical order, which we have called "energetic order", is different from information order. Information order is always, in similar cases, if we analyze it, reducible to a rigorous order of bits. Think of the above flow of telegraphic information: the message written by the telegrapher is expressible as a sequence of bits traveling like a train along the telegraph line. Here is an example at the macroscopic level: illuminating and heating a room is not sufficient to put it in order or organize it; the room remains disordered if we do not tidy it up.

 

The problem may also be considered another way. The endurance of life in all living organisms demands an unimaginable amount of feedback-controlled, metabolic, constructive processes. Day and night these processes of elimination, transformation, repair, duplication, control, maintenance and so on keep organisms alive. All these vital homeostatic processes, thanks to which order and equilibrium are maintained at all levels of organization in organisms, consume energy. Only from this point of view can it be said that order needs energy. Without energy all these processes stop. But energy alone is not at all sufficient to create and maintain biological order. Energy only feeds the numerous biological mechanisms that guarantee organization. Without the incessant work of these biological mechanisms, which counteract degeneration, life could not subsist for a minute, even with energy. To think that solar energy alone can justify "in toto" the huge organization of life is absurd:

«An engine can work spontaneously; but it is not self-constructed spontaneously.  Let's take the pieces that it is composed of (and still worse, let's take the simple materials it is formed of); if we leave them abandoned, even if in presence of any form of energy like that of the beams of sun, it is not possible that they spontaneously order themselves to form the engine»[9]

In order to refute this example of the engine, evolutionists cannot say that a motor is not made of biological matter, because they affirm that organic matter is exactly equal to inorganic matter. To sum up, the illusion that energy is sufficient to create organization and information is due to a misunderstanding of the differences between energy and matter on the one hand and information on the other.

 

Against our objections Darwinians might cite the Urey-Miller experiments. In 1953 S. L. Miller and H. C. Urey, to prove experimentally that life originated from inorganic chemical substances (abiogenesis), tried to simulate the so-called "primordial soup". They placed in a flask some methane, ammonia, water vapor and hydrogen, and supplied energy to the system by means of electrical discharges. After a week they carried out a chemical analysis of the compound and found that some amino acids had arisen spontaneously. Many other experiments were made to obtain proteins spontaneously. But proteins are much more complex than amino acids, and those experiments did not succeed:

«Scientists asked: "How could life evolve from inorganic matter? And what is the probability of such an event?" To their great displeasure, these questions have no precise answers, since biophysicists have never succeeded in reproducing nature's experiments of creating life from non-living matter»[10]

Matter and energy alone cannot provide the information and order necessary for even the simplest unicellular organism. The amino acids produced in Miller's experiment are the normal amino acids we can expect from the chemical reactions possible with the chemical compounds in the flask. But the highly complex macromolecules of life (proteins, RNA, DNA, etc.) contain a great deal of information and cannot arise without it.

Moreover, there is a further difficulty to account for: amino acid molecules can exist in nature in two mirror-image configurations: levorotatory (rotating polarized light to the left) or dextrorotatory (rotating it to the right). In amino acid synthesis experiments such as the Urey-Miller one, statistically fifty per cent of each form arises. So far, so good. Yet, for inexplicable or unknown reasons, in organisms all amino acids, and therefore all proteins, are levorotatory (and the nucleic acids are likewise built from a single-handed form).
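The improbability involved can be quantified under the stated 50/50 assumption: if each of n molecules independently comes out levorotatory or dextrorotatory with probability 1/2, the chance that all n are levorotatory is (1/2)^n (a minimal sketch):

```python
def prob_all_same_handed(n):
    """Probability that n molecules, each independently levorotatory or
    dextrorotatory with probability 1/2, all come out levorotatory."""
    return 0.5 ** n

print(prob_all_same_handed(10))   # ~ 0.001
print(prob_all_same_handed(100))  # ~ 7.9e-31
```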

Someone has observed:

«But how could millions of DNA nucleotides all be, without exception, asymmetric on the same side, if their choice was made by chance among materials half levorotatory and half dextrorotatory? Would we believe in randomness if a roulette wheel drew red numbers for centuries and never black ones?»[11]
This observation alone should be sufficient to refute any attempt to validate the hypothesis of abiogenesis.
Second error: evolution can increase information. It cannot.

Secondly, evolutionists reply that such principles apply in closed systems only, and that organisms live in an environment that supplies information to them by means of adaptation and natural selection.

But surely in adaptive and selective phenomena there is not even an infinitesimal part of the injection of information necessary for obtaining the miraculous transformations evolutionists have in mind. The environment merely helps limited improvements and the strengthening of possibilities already potentially present in organisms and pre-designed in their projects. There is therefore an abysmal difference between a species adapting to a particular environment and an organism that, as they affirm, evolves, by means of unthinkable metamorphoses, into a completely different organism, even one belonging to another order or class.

Evolutionists instead, trying to explain the evolution from reptile to bird, state something like this:

«Even when some organs arise to perform new functions, as for example wings in order to fly, they develop according to the same principle, by "transformation", so that wings arose from the front legs of reptiles»[12]

The evolutionistic explanations of the conversion of fishes into amphibians and then into reptiles are analogous. The fishes grew bored of humidity and wanted finally to stay on dry land, so they got limbs:

«The rise of tetrapod vertebrates and their wonderful expansion into amphibians, reptiles, birds and mammals drew its origin from the fact that a primitive fish chose to explore the land, on which, being incapable of moving, it hopped in an inexpert manner, thereby creating, as a consequence of a behavioral modification, the selective pressure thanks to which the sturdy limbs of tetrapods would develop»[13]

Evolutionism supposes the inverse direction too, from land to sea: cetaceans would be quadruped terrestrial mammals tired of staying on the mainland. This strange evolutionary bi-directionality suggests that it is highly probable that all animals always quietly remained in the natural habitat where they were designed to stay.

The derivation of man from the anthropomorphic apes has been less spectacular and rather disappointing in comparison with those evolutions. It was sufficient that apes learned to walk in an upright posture:

«If some primates had not acquired the upright posture, our genealogy probably would have ended in a group of anthropoid apes»[14]
 
Information for macroevolution does not exist in nature, or better, it cannot be given by Darwinian evolution. The environment selects individuals and rewards the fittest, but it does not give sufficient information for the huge transformations involved in the biological novelties of species. Evolutionists have asked whether in their evolutionary process (mutations + selection) information increases. In which of the two factors might the information generator be found? Remember that information is what reduces uncertainty. R. Dawkins, an authority on evolution, answers this question:
«Mutation is not an increment in the true informative content; it is rather the contrary, since mutation, in Shannon's analogy, contributes to increasing the previous uncertainty. But now let us consider natural selection, which reduces the previous uncertainty and thus supplies an informative contribution to the genetic pool [...] If natural selection supplies information to the gene pool, what sort of information is it? It is information about surviving»[15]

Hence mutations do not provide information, and natural selection supplies only information for survival. But is supplying a reptile with the modest information needed to survive the same thing as providing the immense CSI (Complex Specified Information) necessary "to transform it" into... a bird? Surely it is not. Analogously, giving alms to a beggar will perhaps allow him to survive today, but it does not miraculously transform him into a... king. In the same way that alms are not enough to prevent many people from dying of hunger, information "about surviving" has not been enough to prevent 99% of animal species from dying out in the course of the geological eras instead of evolving.

Darwinians emphasize that, of the two parts of their mechanism, mutations and selection, mutations are random but selection is an algorithm, i.e. a law. If there were only random mutations there would be nothing to claim: chance is impotent to design. To say, as Dawkins does above, that mutations do not create information is practically the same thing. It is natural selection, then, that would make evolution work. Mutations are only the "engine" producing the variations given in input to the selection algorithm, which acts as a post-processor. First we must note that this engine is inefficient (as a driver of macroevolution), since the variations it generates can at most help microevolution. Selection is only an algorithm whose purpose is simply to optimize a fitness function. As said above, the selective law has no other purpose than to aid survival. A mechanism designed for this minimal purpose cannot produce, by the causality principle, a result of a completely different order, a result qualitatively incomparable with the first: macroevolution. It is not enough to say that large-scale animal evolution would be obtained by means of gradualism, i.e. many small successive increments. Natural selection can operate only limited transformations in organisms (microevolution of species).

Third error: negentropy events cause evolution. They don't.

The third attempt that Darwinians make to escape the laws of entropy is that of, for example, J. Monod[16], who underlined that entropy is a statistical concept. That is to say, while entropy increases globally, it may decrease locally. As a consequence it might happen in space-time that in rare cases entropy diminishes instead of increasing. It would be in these rare instants that evolution operates by natural selection. Physicists give the example of a brick: if at a particular moment entropy diminished to the point that all its molecules were directed upwards, the brick would rise into the air. The take-off of a brick is not mathematically impossible: the probability of the event is extremely low, but not exactly zero. In other words, evolutionists put their hope in these ultra-rare negative spikes of entropy. But to base the whole of evolution on the rarity of these phenomena is really too optimistic: far more optimistic than a man unable to swim who plunges into the sea from a ship hoping that a sufficiently powerful anomalous wave will lift him back up to the deck.

An example of an effective calculation of a similar phenomenon of "negentropy", or negative entropy, in biology is given by G. Nicolis and I. Prigogine:
«Consider the process of formation of a biopolymer, say a protein with 100 amino acid molecules in length (ρ = 100). It is well known that in nature there exist N = 20 kinds of amino acids. If biopolymerization were to take place in an isolated system in which all sequences are a priori equally probable we would have N = exp(100 ln 20) ~ exp(300) equally probable sequences. Hence, any particular arrangement - necessary to fulfill, for instance, a desired biological function - would occur with the exceedingly low probability of exp(-300)! If, on the other hand, biopolymerization is performed under nonequilibrium conditions corresponding to an entropy of, say, I = 0.1 * Imax, then only n ~ exp(30) sequences of length 100 would be realized with an appreciable probability. This number is much closer to our scale. It is thus conceivable that evolution acting on such a relatively restricted population in a nonequilibrium environment can produce, given enough time, entities endowed with special properties such as self replication, efficient energy transduction, and so on»[17].
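The arithmetic in the quotation can be checked directly: 20 amino acid kinds over a chain of length 100 give 20^100 = exp(100 · ln 20) ≈ exp(300) equally probable sequences, i.e. roughly 10^130 (a two-line sketch):

```python
import math

n_kinds, length = 20, 100            # 20 amino acid kinds, chain of 100
sequences = n_kinds ** length        # 20**100 equally probable sequences

print(length * math.log(n_kinds))    # ~ 299.6, the exp(300) of the quote
print(math.log10(float(sequences)))  # ~ 130.1, i.e. about 10**130
```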

We even grant that a probability of exp(−300), i.e. about one chance in 10^130, can be considered feasible. Moreover we accept the assumption that entropy becomes one tenth of its maximum (I = 0.1 * Imax), even if it seems biased. But, even admitting this, it would only be the random formation of one protein of the cell. From here to arriving at "entities endowed with special properties such as self replication, efficient energy transduction, and so on", i.e. at complete cells, there is a long way to go! Remember, for instance, that an Escherichia coli cell, the common intestinal bacterium, not counting water, contains on average 332 million molecules, of which 1 million are protein molecules (of 3,000 different types)[18]. Must the bacterium, just to obtain its protein equipment (leaving aside the rest), wait for 1 million miraculous negentropy events? It must not, the cytology specialist answers, because inside the cell there is a protein factory. But then the problem is enormously compounded: from the spontaneous formation of one protein we pass to the spontaneous formation of a protein factory (able to produce a million proteins of 3,000 different types!). By how many orders of magnitude has the complexity of the problem increased? One sees that we are climbing a scale of absurdity more and more arduous. That always happens when evolutionism tries to explain the accidental rise of the first cell.

The arrow of time
 

I. Prigogine wrote: "The most original contribution of thermodynamics is the famous 2nd principle, which introduces into physics the arrow of time. [...] Probability explains entropy and entropy, in turn, introduces the arrow of time"[19]. Thermodynamics shows that in a system total entropy increases, i.e. in every process the trend of entropy is upward. In every process time is irreversible. So the asymmetry of time is related to the asymmetry of entropy. For this reason no process can go backwards in time. According to thermodynamics an unguided process is a sort of passage from order to disorder. In a system entropy grows because the disordered states are far more numerous than the ordered ones. The greater number of disordered states increases their probability of occurring; the smaller number of ordered states decreases theirs. Since systems always go towards their more probable states (i.e. the disordered ones), entropy grows. More disorder means more entropy. Considering the universe as a system, SLOT states that the universe goes towards its "thermal death", its most probable and most disordered state (that of maximum entropy). In this sense the universe is a sort of "descent" from order to disorder. The usual objection to this is: but the rise of civilizations, science and technology seems to prove the contrary. This objection forgets that any manifestation of order, knowledge and organization needs intelligence. Thermodynamics deals with processes without intelligent agency.
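The counting argument in this paragraph can be illustrated with a toy model; a hypothetical sketch, assuming 100 particles each of which may sit in the left or right half of a box:

```python
from math import comb

N = 100                       # 100 particles, each in the left or right half
# exactly one "ordered" microstate: all particles on the left
ordered = 1
# "disordered" microstates: a perfectly even 50/50 split
disordered = comb(N, N // 2)  # about 1.0e29 microstates
print(disordered / ordered)   # the even split outnumbers the ordered state enormously
print(disordered / 2 ** N)    # and alone carries roughly 8% of all probability
```

With 100 particles the single all-left arrangement is outnumbered by the 50/50 arrangements by a factor of about 10^29, which is why an unguided system is overwhelmingly likely to be found in (and to move towards) the disordered states.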

 

In the cosmos empty space doesn't exist, for space always contains bodies; empty time doesn't exist, for time always contains events. Cosmic events mean physical processes. Physical processes entail growing entropy. Growing entropy involves irreversibility. Since time doesn't exist without events (and hence without entropy and irreversibility), time is strictly related to entropy and time itself is irreversible. Example: a body is moved from point A (at time 1) to point B (time 2). Then it returns from B (time 3) to A (time 4). What has happened? Now the body is again at A, but the situation at time 4 is not at all identical to the situation at time 1; it is only apparently equal to it. In general, in the cosmos situations can seem similar but never repeat identically. In fact, in the process A->B->A entropy has grown and time has passed. All that has a name: irreversibility. The asymmetry of time is related to entropy and to all the other asymmetries of the cosmos.

 

Thermodynamics states that in an unguided process order decreases and disorder increases. We might ask: how does thermodynamics fit with the two cosmological principles called "quality" and "quantity"? The link to the quality/quantity binomial is based on the correspondence quality => order and quantity => disorder. To justify that simply, consider, for example, a building. An architect designs a building and carpenters put many materials together in order to obtain a construction according to his project. At this point we have maximum order (quality is at a maximum and quantity at a minimum - the building can be considered a unique structured and organized large system). We know that - without continuous maintenance - the destructive action of many environmental factors and forces over a long period of time will disassemble the building into a set of many scattered, unstructured and disorganized pieces. Now we have maximum disorder (quality is at a minimum and quantity at a maximum - there are only many little pieces all around). Someone might object that the building too is made of many pieces. But that objection comes from a wrong, reductionist-positivist point of view. From a deeper viewpoint the building must be considered as a whole. The universe is analogous to that building, and thermodynamics tells us that entropy works the same way in both cases. We could say that in the cosmos the role of quality is to order quantity (information orders matter/energy). That, of course, is also the role of a designer with respect to his projects.

 

No one has ever seen spontaneous origin of life or macroevolution, for the simple reason that spontaneous origin of life and macroevolution never happened. They never happened for the simple reason that they need information and energy. But information and energy are subordinate to the implacable laws of thermodynamics. Entropy gives no discount to anybody. To sum up, it is quite a paradox that evolutionism, which claims to explain the origin and development of life through time, contradicts thermodynamics, the science of the arrow of time.

Addendum - Objections
 

As a complementary reading, we add here some interesting objections and counter-objections usually debated during discussions about the thermodynamics vs. evolution issue.

Objection #1

There is good philosophical evidence (outside the purview of positivist physics) that the asymmetry of time does not depend upon entropy.

Answer #1

According to traditional cosmology manifestation is a deployment of possibilities running from an Essential (qualitative) pole to a Substantial (quantitative) pole. In other words the universe is a process going downward (a "descent"). Thermodynamics fully agrees, saying that in a system total entropy increases, namely that in every process entropy is irreversibly growing. For this reason a process cannot go backwards. As a consequence time is irreversible. So the asymmetry of time is based on the asymmetry of entropy. The asymmetry of entropy is based on the asymmetry of the two fundamental cosmological principles (Essence and Substance). The superior rank of Essence with respect to Substance agrees with an Intelligent Design explanation of the origin of the cosmos. The descent of manifestation from Quality to Quantity is a sort of passage from order to disorder. According to what we said above, considering the universe as a system, SLOT states that the universe goes towards its "thermal death", its most probable and most disordered state (that of maximum entropy).

Objection #2

Nothing shows that there is a descent from "Quality" to "Quantity". Quality doesn't become quantity.

Answer #2

To say that "manifestation is a descent from Quality to Quantity" doesn't mean at all that quality becomes quantity. That's impossible. In fact, if quality and quantity were convertible, the unique Designer of the universe would not need two principles to form the cosmos: we could have the cosmos produced directly from quantity alone. Instead the Designer used two principles (as two tools). In the terms of modern science and technology: if matter/energy (quantity) and information (quality) were convertible, then engineers would not need both to construct systems. In the cosmic descent quality decreases and quantity increases, but quality doesn't become quantity. Think of a glass half full of water. If we drink the water the glass becomes full of air, but obviously the water has not become air. The concepts of quality/quantity and "descent" of traditional cosmology agree with the laws of thermodynamics (entropy, irreversibility...). Thermodynamics states that in a process order decreases and disorder increases. The link to the quality/quantity binomial is based on the correspondence quality => order and quantity => disorder.

Objection #3

But perhaps thermodynamic principles are not all-embracing. Perhaps there are other principles, principles of living things, which account for a movement towards order.

Answer #3

It is an inveterate error of Darwinists to say that there are exceptions to thermodynamics in biology permitting evolution. First, at the very point at which they ask for thermodynamic exceptions, they ipso facto implicitly admit that thermodynamics is against them. Second, conceptually, exceptions to natural laws are impossible, because they would mean exceptions to causality (specifically to secondary causes). Third, in practice, everyone can see that organisms consume, transform and exchange energy like all other systems. Moreover organisms degrade (more slowly than artificial systems do, but they degrade) and finally die, like all other systems. Incidentally, that slowness depends on the wonderful property of self-repair, which is based on the countless intelligently designed homeostatic servo-mechanisms organisms possess. When Darwinists speak of "principles of living things, which account for an evolution towards order" they are speaking of phantoms. Those are exactly the thermodynamic exceptions above, which cannot exist. "Evolution towards order" can be driven only by intelligence. Indeed, every person continually uses his intelligence against thermodynamic entropy whenever he intelligently thinks, writes, calculates, organizes, manages, orders, structures, programs and designs things. Without intelligence thermodynamics takes the wheel and things go towards disorder only.

Objection #4

It seems intelligence is an exception to thermodynamics. But if you admit one exception, why block others?

Answer #4

To say that "intelligence is an exception to thermodynamics" is misleading. Intelligence is a source of information. An increase of information entails a decrease of entropy. In this sense intelligence opposes and counteracts entropy. But we cannot properly say that intelligence interrupts the laws of thermodynamics. Intelligence is simply a component of the scenario: the laws of thermodynamics remain perfectly functioning even when intelligence is acting. In the same way, in electronics a current generator doesn't interrupt the laws of electronics; it simply provides current to an electronic circuit lacking electric current. In the same way intelligence provides information to a system lacking organization.

Objection #5

Thermodynamics is nowadays a statistical theory. As such, its predictions are merely probable, not absolutely certain.

Answer #5

The fact that thermodynamics is a statistical theory doesn't imply that its theorems aren't absolutely certain. In mathematics probability theory is full of absolutely certain theorems dealing with predictions.

Objection #6

It is possible for entropy to decrease, for instance, but it is very unlikely.

Answer #6

What matters is the global trend of entropy, and the global trend of entropy is to increase. That refutes neo-Darwinism. Thermodynamics affirms that in any system entropy (read: disorder) spontaneously increases; since the physical transformation of a simpler species into another, more complex species would imply a decrease of entropy, such a transformation cannot happen spontaneously.

Objection #7

Thermodynamics isn't proved mathematically, the evidence for it is fallible empirical evidence, like that for every other scientific theory.

Answer #7

The sub-field of thermodynamics dealing with the statistical mechanics of systems endowed with a huge number of elements is proved mathematically and isn't based on fallible empirical evidence. That field proves that order spontaneously decreases. Mathematics is an example of a scientific theory not based on fallible empirical evidence. So it's false that every scientific theory is based on fallible empirical evidence.

Objection #8

It is only a fallibly known empirical fact that there is a huge number of elements and that they obey laws at all like the ones we think they obey. Given these empirical facts, the rest may follow.

Answer #8

To deny a "huge number of elements" is to deny multiplicity. To deny multiplicity is to deny manifestation, i.e. the cosmos itself. You can't deny the cosmos itself. If you deny the cosmos itself you at the same time deny neo-Darwinism (which belongs to the cosmos). The laws considered here are direct consequences of multiplicity. To refute them is to refute multiplicity, and that's impossible (as just said).

Objection #9

Ilya Prigogine and others have shown that order can be increased in open systems through which matter and energy flow. The thermodynamic argument against Darwinism is thus seriously weakened.

Answer #9

Many mathematicians and physicists disagree (for example the mathematician Granville Sewell; see his article at http://www.spectator.org/dsp_article.asp?art_id=9128 where he deals with closed and open systems). Their conclusion is that the works of I. Prigogine do not refute the thermodynamic argument against evolutionism at all. If energy is able to create new biological order, and Earth receives substantial energy input from the sun (as Darwinists say), then why don't we see the rise of wonderful new species today? It is true that Prigogine has shown that something he calls "order" can arise in open systems, but that claim is essentially trivial. Prigogine's approach (open-system, far-from-equilibrium thermodynamics) is elegant, but it does not show very much that is relevant to the whole origin of life problem, or to evolution generally (despite some hype to the contrary). The "order" he refers to is not the kind of thing that leads to life. Nobody has shown how that can happen. It's the basic information/CSI problem at work in a different context (as in so many issues surrounding origins).

Objection #10

The laws of thermodynamics would be inapplicable if each particle's path in matter or molecule path in a gas were carefully planned (by an angel, for instance). 

Answer #10

But neo-Darwinism denies any "guided" evolutionary processes (whoever is the "driver"). So thermodynamics applies and refutes neo-Darwinism.

Objection #11

We can easily come up with half a dozen sets of laws of nature under which entropy does not increase.

Option 1. Particles never interact, but all move together in the same direction. No change in entropy at all!

Option 2. Basic particles are Newtonian, but perfectly sticky. Energy is not conserved. In a finite box, sooner or later all particles will stick together. I doubt entropy increases - in fact, it may decrease.

Answer #11

Options 1 and 2 aren't "laws of nature" at all, but simply biased examples or absurd cases. In option 1, WHO or WHAT makes all the particles move together in the same direction (not an easy task)? In option 2, WHO or WHAT ... smears glue on the particles? We know that in nature things don't work that way.

Objection #12

Energy input can indeed compensate for entropy reduction.

Answer #12

Energy input cannot increase/compensate/create/generate either order or information (from the beginning we have been considering the thermodynamic argument mainly under the order/information viewpoint). Energy input can compensate for energy output only. So the usual "open system" trick doesn't work. Besides, if Earth is an open system with respect to the sun, then (Earth + sun) is a closed system, and the thermodynamics of closed systems applies.

Objection #13

If the simple thermodynamic argument applied to Darwinism, it would also apply to our own inner workings, proving that we could not exist either.

Answer #13

Not true. Our own inner workings agree perfectly with thermodynamics. Energy input compensates for energy output. Entropy/disorder is controlled (within certain limits) by the countless mechanisms designed for that specific goal. When these limits are unfortunately exceeded and entropy/disorder gets unavoidably out of control, organic systems crash and in the end die.

Objection #14

The discussions about energy, entropy and evolution (or the OOL problem) can be framed in terms of either the Helmholtz or the Gibbs free energy: A = E - TS or G = H - TS, where E is the system energy, H is the system enthalpy, S is the system entropy and T is the absolute temperature. The criterion of spontaneity is then given in terms of the changes in A and G (dA and dG): dA < 0 or dG < 0 (for a spontaneous change). Thus, using either function, we can predict whether a proposed physico-chemical transformation will be spontaneous or non-spontaneous. Importantly, the two functions clearly imply that a process can be both spontaneous and entropy-decreasing, provided the enthalpy change (or energy change) is sufficiently favorable.

Answer #14

The equations put on the table are some of the basic laws of chemistry. The Gibbs free energy relation dG = dH - TdS applies to all chemical reactions. In the equation, "dS" represents the thermodynamic (or physical) entropy. SLOT, the Law of Disorder, states that spontaneous systems always go in the direction of increasing entropy. Total entropy (dS_sys + dS_surroundings) always increases, and the global condition is always dS_total > 0. SLOT is a basic and very general law. The term "dH" represents the enthalpy, i.e. the delta of energy involved.
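The spontaneity criterion under discussion can be made concrete in a few lines; the numbers below are purely illustrative, not taken from any real reaction:

```python
def gibbs_dG(dH, T, dS):
    """Gibbs free-energy change, dG = dH - T*dS.

    dH in J/mol, T in K, dS in J/(mol*K)."""
    return dH - T * dS

def is_spontaneous(dH, T, dS):
    """The criterion of spontaneity: dG < 0."""
    return gibbs_dG(dH, T, dS) < 0

# an entropy-decreasing process (dS < 0) can still be spontaneous
# when the enthalpy term is sufficiently favorable...
print(is_spontaneous(dH=-100_000.0, T=298.0, dS=-50.0))  # True  (dG = -85,100 J/mol)
# ...but not when it is too small:
print(is_spontaneous(dH=-10_000.0, T=298.0, dS=-50.0))   # False (dG = +4,900 J/mol)
```

This is exactly the point of Objection #14: with dS = -50 J/(mol*K) at 298 K, the unfavorable -T*dS term is +14,900 J/mol, so only an enthalpy change more negative than that makes the process spontaneous.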

 

What distance is there from this basic chemical-physical layer to the layer of life? To give an idea, we believe the best thing to do is to consider an informatics analogy. What distance is there from Boolean algebra (or other low-level byte operations) to Windows (or another modern operating system - by the way, DNA is near 750 MB, like an OS)? Both run on computers. Informaticians would say that between the two there are many "software layers". In other words, in the middle there are many orders/levels/degrees/layers of complexity. Analogously, the problem of life is not simply a matter of complexity but a matter of many complexity layers. We might say (by convention) that the basic chemical laws (such as the Gibbs or the Hess laws) and in general all chemical and physical laws stay at complexity-layer #0. They are simple constant rules or instructions (software) that the cosmos (hardware) has to run. We could say that the compounds obtained from chemical reactions stay at complexity-layer #1. For example, the amino acids that S. Miller obtained in his lab in the early 1950s by means of his famous experiment stay at complexity-layer #1. Life, instead, stays at complexity-layer #N, where N >> 1. So Miller was very far from having "created" life in the lab. Life involves a "stack", or hierarchy, of a huge number of complexity orders. We can say that if N is the top complexity order (that of man's body, just to be clear), and M is the complexity layer of a single cell or unicellular organism, then even M >> 1.

 

Chemical/physical laws (layer #0) are the causes of the rise of layer #1 (all simple chemical compounds) but are unable to cause the next layers of the stack (from #2 to #N). Roughly speaking, we might say that amino acids stay at complexity-layer #1 while proteins already stay at complexity-layer #2. Already proteins cannot arise spontaneously, because, from layer #2 to the top, information (CSI) is needed. SLOT is obviously valid at every level of the stack. Said another way, SLOT works for disorder at every level of the stack. There is no layer on which defeating disorder is easier. The second law entails that spontaneous order is impossible at every level of the complexity hierarchy. The complexity stack is an information stack. The second law's job is to destroy information at every level. While intelligent agency creates and generates information, SLOT does exactly the opposite: it destroys information. SLOT is like an engine for destroying information and order.

 

What about energy? Energy appears in Gibbs' equation in the form of the enthalpy dH. As such it can help render a chemical reaction spontaneous or not. In other words, energy can help chemical reactions to occur. Energy can help in obtaining chemical compounds. We can say that energy "works" only up to complexity-layer #1. In this sense we have said that "energy cannot create order/information". Since order/information span the entire complexity stack, while chemical compounds stay at complexity-layer #1 only, energy alone cannot account for life. When evolutionists say "the sun gives energy for life" they indeed mean that the sun started life on Earth and helped to create new species! But all that is dead wrong. As G. Sewell says: "I don't see DNA, libraries or computers coming from the sun!" The sun, and energy in general, accounts for the rise and development of living species for exactly 1 out of N orders of magnitude (where N >> 1). And it must be clear that more sun doesn't mean adding +1 many times until we arrive at N. Sun/energy stays and remains always at complexity-layer #1. So the sun-based hope of evolutionists is fully illusory. Evolutionists confuse energy/matter with information. Energy/matter only carry information. But carrying information is a completely different thing from generating information. A sheet of paper can carry a message but cannot itself generate the message. Energy/matter have a purely passive role with respect to information. As such they work only at layer #1 of the entire information stack.

 

We can also see the issue from another point of view. In Gibbs' equation sun/energy can help the enthalpy dH but cannot exclude or modify the entropy dS. Sun/energy cannot order or organize. The entropy dS is embedded in complexity-layer #0 and cannot be modified by sun/energy. The entropy dS, i.e. disorder and randomness, is exactly what prevents us from having the complexity-layer stack for free. The complexity-layer stack from layer #1 to layer #N has to be designed and constructed by the intervention of intelligence.

 

What about information? Information doesn't appear in Gibbs' equation. But information stays in the picture anyway, because it is present in the entire complexity-layer stack from layer #0 to layer #N. The thermodynamic (or physical) entropy dS has its counterpart in the information entropy. Both are the two faces of the same coin, the coin of disorder and randomness. We can simply look at disorder under its physical aspect, or under its information aspect. Thermodynamics states that the coin of disorder (with its two faces, physical entropy and information entropy) always grows.
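The information-theoretic face of the coin is Shannon's entropy H; a minimal sketch of its standard definition, with two toy distributions of our choosing:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# a fair coin (two equally likely symbols): one full bit of uncertainty
print(shannon_entropy([0.5, 0.5]))  # 1.0
# a maximally "disordered" source, uniform over 4 symbols: maximum entropy
print(shannon_entropy([0.25] * 4))  # 2.0
```

A perfectly "ordered" source (one symbol certain, probability 1) has H = 0; the more uniformly spread the probabilities, the higher H, which is the information-side analogue of the physical disorder discussed above.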

 

Now let's come to neo-Darwinism. RM/NS neo-Darwinism hypothesizes that the whole complexity-layer stack from layer #1 to layer #N is due to random mutations. As such it is a meta-proposition about the rise of the entire stack. It says that the stack arose thanks to randomness only. It says that the stack arose spontaneously, without any need of intelligence. But, as we saw above, in the stack SLOT always works for disorder. Neo-Darwinism claims to get the complexity-layer stack for free and, as we saw above, the entropy engine is exactly what prevents us from getting the complexity-layer stack for free. Neo-Darwinism claims a "free lunch" and SLOT is exactly the toll man who collects the lunch's fee. As Dembski says: "no free lunch". Evolutionism and SLOT are mathematically incompatible.

 

Let's examine the Darwinian engine, "random mutations", and the claim "random mutations generate order". SLOT says: "randomness generates disorder". Neo-Darwinism and SLOT say exactly the opposite. Let's return to the informatics metaphor. All informaticians know that random mutations in software always generate disorder. Any random mutation in software generates malfunctioning and very often even halts the computer. This too is due to entropy, which of course applies to software as to everything else. The biological information stack is much more complex than any computer ever constructed. Biological random mutations are - at best - useless and - at worst - dangerous. The usual objection of Darwinists: but there is also natural selection. Unfortunately, as informaticians say: garbage in, garbage out. If the random-mutations pre-processor inputs garbage, the natural-selection post-processor unavoidably outputs garbage.
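The software analogy is easy to try for oneself; a hypothetical sketch (the sample function and mutation scheme are ours) that randomly corrupts single characters of a tiny Python program and checks whether the result still even parses:

```python
import ast
import random

SOURCE = "def area(r):\n    return 3.14159 * r * r\n"

random.seed(0)  # fixed seed so the run is repeatable
trials, broken = 200, 0
for _ in range(trials):
    chars = list(SOURCE)
    i = random.randrange(len(chars))
    chars[i] = chr(random.randrange(32, 127))  # overwrite one char at random
    try:
        ast.parse("".join(chars))              # does the mutant still parse?
    except (SyntaxError, ValueError):
        broken += 1
print(f"{broken}/{trials} single-character mutations no longer parse")
```

Note that parsing is a very weak test: even most mutants that still parse compute the wrong thing, so the fraction of genuinely harmless mutations is smaller still.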

Objection #15

It is possible (in principle, so far as anyone knows at present) that under some special but still not-too-improbable set of circumstances, a small hunk of seawater might have decreased in entropy enough to generate proto-life, while a compensating increase of entropy took place in the surroundings.

Answer #15

What one calls "proto-life" is already quite near complexity-layer #M >> 1. A simple local decrease in entropy cannot achieve that high level. In fact it is very important to understand that, while an increase of entropy means destroying information, a decrease in entropy (negentropy) doesn't mean creating CSI. The generator of CSI is always intelligence. Sand castles are usually destroyed by entropy, but nobody ever saw a sand castle constructed by negentropy!

Objection #16

To assert that putting hash marks in a "continuum" magically makes them real and impenetrable boundaries is no proof at all. It is precisely the point of Darwinism that such boundaries can be surmounted in small steps. I don't believe that this can be proved to be so thermodynamically improbable as never to have occurred within the age of the earth. The question is whether or not there is a Darwinian continuum.

Answer #16

It isn't necessary to investigate beneath the level of molecules and atoms, down to quantum particles and events, in order to disprove neo-Darwinism. Is it necessary to investigate the CMOS gates of a computer's hardware in order to disprove an assertion like "all software derived from unguided spontaneous bit-wise processes"? (An assertion that, for the record, no programmer would ever make.) Our answer is: it isn't necessary at all, because randomly or pseudo-randomly generated software is simply unthinkable. Programmers are convinced of this because they know that randomness simply doesn't work when the business is to get an application running on a stupid and blind computer. They don't need to know the chemistry and the physics of the CMOS gates. They need only a simple and unpretentious acknowledgment: randomness always generates dirty bits. So programmers have to do the programming themselves. Thermodynamics provides exactly that acknowledgment at the macro-level of reality in all the fields beyond informatics. That's all. In other words, SLOT is a simple and unpretentious rule that is just sufficient to disprove some gross, however sophisticated-looking, errors, such as Darwinism.

Darwinian "small steps" are exactly those "random mutations" that are either neutral or harmful. Just as a sum composed of zeros (neutral mutations) plus negative numbers (harmful mutations) cannot give a positive number, so neutral/harmful mutations cannot generate favorable events. A "Darwinian continuum" would therefore simply be a "continuous" loss.

Objection #17

It's thermodynamically possible to develop complex living forms.

Answer #17

To counter this objection it is worth reading the book by Charles B. Thaxton, Walter L. Bradley and Roger L. Olsen, "The Mystery of Life's Origin" - especially chapters 7, 8 and 9.[20]

 

At the beginning of chapter 8 "Thermodynamics and the Origin of Life" there is the following paragraph: "(1) Peter Molton has defined life as "regions of order which use energy TO MAINTAIN their organization against the disruptive force of entropy."1 (2) In Chapter 7 it has been shown that energy and/or mass flow through a system can constrain it far from equilibrium, resulting in an increase in ORDER. (3) Thus, it is thermodynamically possible to develop complex living forms, assuming the energy flow through the system can somehow be effective in ORGANIZING the simple chemicals into the complex arrangements associated with life."

 

Commenting on this paragraph will allow us to understand why there is the misunderstanding, and the evolutionist illusion, that an open system which receives energy from the surroundings can develop life from inorganic matter (abiogenesis) or can improve the organization of pre-existing organisms. We have numbered the propositions and uppercased the three key words of the issue: "TO MAINTAIN", "ORDER" and "ORGANIZING".

 

(1) The first proposition, by Peter Molton, is correct but too terse: "energy is able to MAINTAIN a pre-existing organization". In the case of life it would be more correct to say: "energy provides the power supply to the (intelligently designed) mechanisms that maintain organization in organisms!"

 

(2) The second proposition is correct, but we have to question carefully what the term "ORDER" means there. Certainly it means a decrease of entropy. It is usual in the chemistry and physics literature to identify "increase of entropy = disorder" and "decrease of entropy = order". But there are many types of "ORDER". Order is a qualitative attribute that cannot simply be reduced to quantity. The order of molecules in a crystal is fully different from the order of DNA (which contains coded information). DNA order is more qualitative than the order of a simple crystal because the order of DNA contains much information, while the simple crystal's doesn't. DNA has an order that is hierarchically higher than the order of the crystal. So the decrease of entropy that the flow of energy causes is unable to generate the kind of qualitative, higher order DNA contains. The qualitative order of DNA is exactly what IDT calls CSI.
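The difference between crystal-like order and sequence order can be glimpsed with a compressibility test (only a rough proxy: zlib is an ordinary compressor, not a measure of CSI, and the two strings below are our own toy examples):

```python
import random
import zlib

def compressed_ratio(s):
    """Compressed size / original size; lower means more compressible."""
    return len(zlib.compress(s.encode())) / len(s)

crystal = "CTAG" * 250                   # periodic, crystal-like repetition
random.seed(1)                           # aperiodic sequence over the same alphabet
aperiodic = "".join(random.choice("ACGT") for _ in range(1000))

print(f"crystal-like: {compressed_ratio(crystal):.2f}")   # shrinks to a tiny fraction
print(f"aperiodic:    {compressed_ratio(aperiodic):.2f}") # resists compression
```

The repetitive string collapses to a short rule ("repeat CTAG"), while the aperiodic one has no such rule and stays large, which is the distinction between mere crystalline order and the aperiodic, information-bearing order discussed here.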

 

(3) The above fundamental observation makes us understand that the third proposition is wrong. To say "energy flow through the system can somehow be effective in ORGANIZING the simple chemicals into the complex arrangements associated with life" is unsound because, as we saw above, organizing life entails CSI, and CSI is a kind of order that energy cannot achieve. Energy cannot "organize" when organization means CSI (necessary for life). Besides, it is wrong to say that "it is thermodynamically possible to develop complex living forms", because "to develop complex living forms" is "to organize" life, and we have just said energy is unable to do that. Instead it would have been correct to say: "it is thermodynamically possible that - given the energy flow through the system - entropy decreases. Period."

 

So the inference "decrease of entropy -> CSI" is wrong. This misunderstanding, which the quoted paragraph represents so well, is the very reason for the evolutionist illusion that "life arose because Earth received energy from the sun". Evolutionists, excluding an intelligent agency, need a substitute for it and have no other candidate than energy (which cannot achieve the goal). This misunderstanding is due to the fact that evolutionists don't make the necessary qualitative distinctions among the many kinds of order. (That also depends on a conception of science that is excessively quantitative and reductionist.) They have used the pretext of the usual definition "decrease of entropy = order" to deduce that tout court ANY kind of order - even the highest - might arise from matter thanks to the contribution of energy. But quality/essence cannot be reduced to quantity/substance, nor - vice versa - can quantity/substance be converted into quality/essence. Matter/energy stay on the side of quantity/substance and information/CSI stay on the side of quality/essence.

 

The rise of life implies CSI, which energy alone cannot provide. Energy can decrease entropy, but a decrease of entropy doesn't mean CSI. OOL (Origin Of Life) is the development of life from nothing (in a sense we could say OOL is an "absolute" problem). Origin of species (OOS) is the development of different kinds of organisms (in a sense we could say OOS is an "incremental" problem). But origin of species, too, entails CSI. While OOL implies the rise of CSI from nothing, OOS implies an increment of CSI from a pre-existing, smaller CSI. OOL and OOS are de facto the same problem: how did CSI arise, who provides CSI, what can create CSI? Hence a disproof of a spontaneous OOL is at the same time a disproof of a spontaneous OOS (i.e. of neo-Darwinism).

Objection #18

Energy can generate order.

Answer #18

We have said that the "energy-can-generate-life" error is due to an oversimplified interpretation of thermodynamics, and specifically an oversimplified interpretation of the term "order". "Order" is a term that entered the science of physics with the discovery of thermodynamics. "Order" has as many meanings as the term "information" has (this is not odd at all, for the two are strictly related). In a sense we could say that information entered science not with Shannon but just before, with Clausius and Boltzmann. It entered through the back door of thermodynamics before entering through the front door of information theory. In a sense evolutionists have used this as a Trojan horse for supporting abiogenesis and neo-Darwinism.

 

Unfortunately the thermodynamic statement "a decrease of entropy increases order" has not been examined in depth. While physics has investigated "energy" very well and has found all (or almost all) of its many types, it has hardly investigated the many forms of "order" at all, partly because order-information is the core business of other sciences. In any case this has enormously helped evolutionists to support their "energy-can-generate-life" error. The simple evolutionist reasoning is: look at that nice energy that, by decreasing entropy, can increase order until arriving at the order of life. But our response must be: which "order" are you speaking of? Exactly as we would ask if they were speaking of energy: what kind of energy? For evolutionists order is a continuous concept, from its simplest forms to its highest forms. We must explain instead that order is a hierarchical, stepped concept, not a continuous one at all.

Let's examine some kinds of order at the microscopic level. The following list is rough and incomplete, but it will help.

(a) The order of crystals. For example: when water is transformed into ice, the local system passes from a high-entropy, disordered state (water) to a low-entropy, more ordered state (ice crystals).

(b) The order of some far-from-equilibrium, low-entropy systems studied by I. Prigogine and others.

(c) The order of macromolecules like proteins, RNA, DNA.

(d) The order of processors of the macromolecules above, like ribosomes and other cellular machines.

(e) The order of the cell considered as a self-sufficient system able to self-reproduce and self-sustain (this order level overtops and manages the underlying order levels (c-d)).
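A reader who wants to see the entropy bookkeeping behind item (a) in numbers can run this minimal sketch (standard approximate textbook values for water): when a gram of water freezes, its own entropy drops, but the heat it releases raises the entropy of the colder surroundings by a larger amount, so total entropy still grows, as the SLOT requires.

```python
# Entropy bookkeeping for item (a): freezing 1 g of water at 0 degrees C,
# rejecting heat to surroundings at -10 degrees C (approximate textbook values).
L_f = 334.0        # latent heat of fusion of water, J/g (approximate)
T_water = 273.15   # freezing point of water, K
T_surr = 263.15    # temperature of the surroundings, K

dS_water = -L_f / T_water   # local entropy DECREASES as the crystal forms
dS_surr = +L_f / T_surr     # surroundings gain entropy from the rejected heat
dS_total = dS_water + dS_surr

# The local decrease is more than paid for globally: dS_total > 0.
print(round(dS_water, 3), round(dS_surr, 3), round(dS_total, 3))
```

The point of the sketch is only that a local entropy decrease (the crystal) is always purchased by a larger global increase.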


For evolutionists the set from (a) to (e) - briefly (a-e) - is a continuum obtainable simply by an energy-driven entropy decrease. Why is this wrong? Order (a-b) can arise spontaneously thanks to natural laws and randomness: natural laws contain in themselves the potential to generate (a-b) processes. Order (c-e), instead, cannot arise spontaneously from natural laws and randomness alone: natural laws do not contain in themselves the potential to generate (c-e) systems. According to Dembskian IDT, order (c-e) entails complex specified information, CSI. According to algorithmic information theory (AIT; G. Chaitin, A. Kolmogorov), CSI is "incompressible" information, i.e. information that cannot be compressed into a simpler form or rule. Incompressible information like DNA cannot be output by natural laws: to output DNA, natural laws would have to contain it within themselves, and we know they don't. For example, consider a strand of DNA such as this: CTAGGCATCATGAAATAGGAACAAATCATTTAG ...

No law of chemistry or physics contains that string, nor the description of any other organic macromolecule. Since the string is incompressible, laws, which are very simple algorithms, cannot output it: being incompressible, no algorithm simpler than the string itself can generate it.
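The notion of (in)compressibility invoked here can be illustrated with a small sketch. True Kolmogorov complexity is uncomputable, so this uses a general-purpose compressor (Python's zlib) as a rough stand-in; both strings below are hypothetical examples, not real genomic data.

```python
import random
import zlib

def compressed_size(s: str) -> int:
    """Length in bytes of the zlib-compressed form of s."""
    return len(zlib.compress(s.encode("ascii"), 9))

# A highly regular string: the short rule "repeat CTAG 25 times"
# fully describes it, so a compressor captures it in a few bytes.
regular = "CTAG" * 25                       # 100 characters

# A patternless string over the same alphabet: a stand-in for a
# sequence with no simple generating rule.
random.seed(0)
irregular = "".join(random.choice("ACGT") for _ in range(100))

# The regular string compresses to a fraction of the size
# the patternless one needs.
print(compressed_size(regular), compressed_size(irregular))
```

The compressed size of the regular string approximates the length of its generating rule; the patternless string has no such short rule, which is the sense of "incompressible" used in the text.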


Evolutionists might object: if laws are unable, randomness might generate proteins and DNA. This is impossible: a processor processing information is a hierarchical system that cannot be created by chance and necessity. Order levels (d-e) are processors; as such they cannot be generated by chance and necessity.

Some hypothesize that some proteins (order level c) can arise spontaneously. As far as we know, nobody has shown that in the lab. But the key point is this: proteins have CSI precisely because they contain "codes" or "messages" that order levels (d-e) must process. Since a processor can process something only if the two share the code (we write and you read because we "share" the code of the English language), a functional protein must "agree" with its processor and cannot be generated randomly. So it is very likely that a spontaneously generated lab protein would not be functional at all - as perfectly useless as a randomly generated string of characters.
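The "shared code" point can be sketched with a toy model (the codebook and numbers are entirely hypothetical, chosen only for illustration): a decoder accepts a strand only if every triplet belongs to its codebook, and almost no randomly generated strand passes.

```python
import random

# Hypothetical toy codebook: only 4 of the 64 possible triplets are
# "meaningful" to this processor. The real genetic code is far richer;
# this is only a sketch of the shared-code idea.
CODEBOOK = {"ATG": "start", "TGG": "W", "TTT": "F", "TAA": "stop"}

def decode(strand):
    """Read a strand triplet by triplet; fail unless every triplet is shared code."""
    symbols = []
    for i in range(0, len(strand), 3):
        triplet = strand[i:i + 3]
        if triplet not in CODEBOOK:
            return None          # message and processor do not share the code
        symbols.append(CODEBOOK[triplet])
    return symbols

random.seed(1)
trials = 10_000
readable = sum(
    decode("".join(random.choice("ACGT") for _ in range(12))) is not None
    for _ in range(trials)
)
print(readable, "of", trials, "random strands were readable")
```

On this toy model the overwhelming majority of random strands are rejected, which is the intended analogue of a random polymer failing to "agree" with its processor.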


To sum up, the order hierarchy (a-e) has levels unreachable by chance and necessity, specifically (c-e). Energy, decrease of entropy, natural laws and randomness cannot reach that realm of CSI. We hope to have explained why order is a tricky matter, and why oversimplifying its many meanings when speaking about thermodynamics opens the door to the errors evolutionists make.

In closing, as Roger Caillois rightly said: "Clausius and Darwin cannot both be right". This is - in nuce - the very reason why the argument from thermodynamics against evolutionism is sound.


[1] I. Prigogine, Laws of Chaos, ch. 2.

[2] W. A. Dembski, Intelligent Design, 6.1.

[3] C. Shannon, "A Mathematical Theory of Communication", The Bell System Technical Journal, vol. 27.

[4] O. Costa de Beauregard, Le second principe de la science du temps - Irréversibilité, Entropie, Information, ch. III, 16.

[5] W. H. Zurek, "Algorithmic Randomness and Physical Entropy", Physical Review A, vol. 40, no. 8, October 15, 1989.

[6] L. Brillouin, Science and Information Theory.

[7] N. Wiener, Cybernetics, Introduction.

[8] M. Gell-Mann, The Quark and the Jaguar, ch. 16.

[9] V. Silvestrini, Che cos'è l'entropia, ch. III.

[10] R. Jastrow, The Enchanted Loom, ch. 1.

[11] D. Raffard de Brienne, Pour en finir avec l'évolution, ou la faillite des théories évolutionnistes, III, 4.

[12] F. Cramer, Chaos und Ordnung, ch. 1.

[13] J. Monod, Chance and Necessity, ch. VII.

[14] S. J. Gould, "L'evoluzione della vita sulla terra", Le Scienze quaderni, no. 98.

[15] R. Dawkins, A Devil's Chaplain, ch. 2.

[16] J. Monod, Chance and Necessity, ch. VII.

[17] G. Nicolis, I. Prigogine, Exploring Complexity, 4.7.

[18] G. Mangiarotti, Dai geni agli organismi, ch. 2.

[19] I. Prigogine, La Nouvelle Alliance - Métamorphose de la science.

[20] http://www.ldolphin.org/mystery/index.html