Discussion:
2nd law clarifications
MarkE
2025-01-02 06:53:02 UTC
Are these statements correct? Could they be better expressed?


Local entropy can decrease in an open system with an input of free energy.

Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.

Extant life *maintains* low local entropy through its organisation and
processes.

Evolving life *decreases* low local entropy through the ratcheting
mechanism of natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.

There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
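The thermodynamic bookkeeping behind the first statement is the standard
total-entropy inequality, sketched here (a textbook formulation, not a
claim from the thread):

```latex
% Second law for system + surroundings: the total never decreases,
% so a local decrease is permitted only if it is over-compensated
% by entropy exported to the surroundings (heat, waste products).
\frac{dS_{\mathrm{univ}}}{dt}
  = \frac{dS_{\mathrm{sys}}}{dt} + \frac{dS_{\mathrm{surr}}}{dt} \ge 0,
\qquad\text{so}\qquad
\frac{dS_{\mathrm{sys}}}{dt} < 0
\;\text{ is allowed whenever }\;
\frac{dS_{\mathrm{surr}}}{dt} \ge \left|\frac{dS_{\mathrm{sys}}}{dt}\right|.
```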
MarkE
2025-01-02 07:11:57 UTC
Post by MarkE
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
Out of interest, AI's assessment:


Prompt: Comment on/critique the following statements:

ChatGPT 4o: Here’s a critique and commentary on each of the provided
statements:
________________________________________
1. "Local entropy can decrease in an open system with an input of free
energy."
• Commentary:
o This statement is consistent with the second law of thermodynamics,
which applies universally to closed systems but allows for local
decreases in entropy in open systems. Living systems (and prebiotic
systems) are open systems that exchange energy and matter with their
surroundings, allowing for local entropy reduction.
o Examples include crystallization (where heat is released to the
environment) and the maintenance of biological order in living cells.
• Critique:
o The statement is accurate but could benefit from clarity. It may be
helpful to specify that while entropy can locally decrease, the overall
entropy of the system and its surroundings increases. This avoids
misinterpretation by those unfamiliar with thermodynamic principles.
________________________________________
2. "Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed."
• Commentary:
o This is an excellent point. Free energy must be harnessed and
converted into work to decrease entropy. For instance:
  ▪ Photosynthesis captures solar energy and transforms it into chemical
energy.
  ▪ Proton gradients in mitochondria are used to synthesize ATP, the
energy currency of cells.
o The emphasis on an energy capture and transformation mechanism
highlights the importance of structure and function, such as enzymes and
membranes, in enabling this process.
• Critique:
o This is well-stated and foundational to understanding how living
systems operate. One addition could be an example from abiogenesis or
early prebiotic systems, such as mineral surfaces catalyzing energy
transformations, to provide more context for the origin of such mechanisms.
________________________________________
3. "Extant life maintains low local entropy through its organisation and
processes."
• Commentary:
o This statement is correct and captures the essence of how living
organisms sustain order. Metabolic processes, such as catabolism and
anabolism, continuously balance energy input and waste output to
maintain low entropy.
o For example, cells use ATP to drive endergonic reactions, repair
structures, and create macromolecules, maintaining their complex
organization.
• Critique:
o While the statement is accurate, the term "low local entropy" might
benefit from elaboration. For example, it could mention that this refers
to the high degree of molecular organization and functional complexity
seen in living systems.
________________________________________
4. "Evolving life decreases low local entropy through the ratcheting
mechanism of natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation."
• Commentary:
o This is a strong statement linking natural selection to entropy
reduction in an evolutionary context. Natural selection indeed acts as a
"ratchet," favoring mutations that increase functional complexity and
organization, leading to cumulative improvements in adaptation and order.
o Over evolutionary time, this process has resulted in increasingly
complex systems, from single-celled organisms to multicellular life.
• Critique:
o The statement assumes that functional complexity always corresponds to
lower entropy, which may need clarification. Complexity can increase
without always leading to a direct reduction in entropy, depending on
how "entropy" is defined in the context of information theory versus
thermodynamics.
o The phrase "evolving life decreases low local entropy" could imply
that entropy always decreases during evolution, which might not be true
in all contexts. Evolution sometimes involves simplification (e.g.,
parasites losing unnecessary functions), which might not reduce entropy
in the same way.
________________________________________
5. "There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy."
• Commentary:
o Natural selection is indeed the primary mechanism known to
consistently decrease entropy in the sense of increasing functional
complexity and organization. Neutral drift, by contrast, is a stochastic
process that can result in changes that are functionally neutral or even
degrade complexity over time (increasing "entropy").
o For instance, in the absence of selection, drift can lead to the
accumulation of deleterious mutations, increasing disorder in the genome.
• Critique:
o The claim that "neutral drift alone increases entropy" is
oversimplified. While drift can increase genetic entropy (in the sense
of loss of function), it can also lead to new forms of complexity under
specific conditions. For example, neutral mutations can set the stage
for exaptation or co-option by selection later on.
o It might be worth acknowledging that other mechanisms, like epigenetic
changes, gene duplication, or horizontal gene transfer, can contribute
to functional complexity. While these processes often work alongside
natural selection, they sometimes operate semi-independently.
________________________________________
Overall Critique and Suggestions:
The statements are well-thought-out and provide a solid conceptual
framework for discussing entropy, evolution, and natural selection. However:
1. Clarity and Scope: Some terms (e.g., "low local entropy") could be
defined more explicitly to avoid ambiguity, particularly for readers
without a strong background in thermodynamics or biology.
2. Acknowledging Edge Cases: While natural selection is central to
reducing entropy in evolving life, processes like neutral drift or
simplification could be better nuanced to address exceptions.
3. Inclusion of Early Life Examples: Tying some points to prebiotic
systems or the origin of life (e.g., energy capture in early protocells)
would enhance the discussion's relevance to abiogenesis.
The overall framework is solid, and with minor adjustments, it would be
even more precise and comprehensive.
MarkE
2025-01-02 07:32:08 UTC
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
And for anyone interested, a response to the same prompt from version
o1, which "Uses advanced reasoning", compared with the response from
version 4o, which is "Great for most tasks". Pretty similar at a glance?
_____

Below is a brief commentary and critique of each statement, with an
emphasis on clarifying terminology (especially entropy) and the nuances
of biological and thermodynamic processes.
________________________________________
1. “Local entropy can decrease in an open system with an input of free
energy.”
Commentary:
• This is a standard thermodynamic principle: while overall entropy in
the universe increases, an open system can experience a local decrease
in entropy if it receives sufficient free energy from the environment.
Examples include crystallization processes, growing organisms, and the
organization within cells, all of which export heat or byproducts that
increase the entropy of their surroundings.
Critique:
• The statement is accurate but can be misread if one is not familiar
with the distinction between local and global entropy. Emphasizing that
“the total entropy (system + surroundings) still goes up” can prevent
misunderstandings.
• “Local entropy” can refer to thermodynamic entropy, informational
entropy, or the more general concept of “order.” Being explicit about
which is intended (usually thermodynamic) helps clarity.
________________________________________
2. “Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.”
Commentary:
• Merely having access to free energy (e.g., sunlight, geothermal heat)
does not guarantee that a system will organize itself or maintain lower
entropy. You need a process—such as photosynthesis or chemiosmosis—that
harnesses and transforms that energy into work.
• In living systems, enzymes, membranes, and metabolic pathways are
crucial for channeling energy into productive, entropy-lowering processes.
Critique:
• This is a well-stated point because it highlights the importance of
structure and function in living (or pre-living) systems. For example,
an RNA-based protocell requires not just nucleotides and energy sources
but also a catalytic or templating mechanism to assemble and replicate
its molecules.
________________________________________
3. “Extant life maintains low local entropy through its organisation and
processes.”
Commentary:
• Living organisms are prime examples of open systems. They maintain
internal order (low entropy) by constantly taking in free energy (food,
sunlight, etc.) and expelling waste heat and byproducts.
• Metabolic pathways, cellular repair, and reproduction all rely on
organizational frameworks that keep internal processes efficient and
structured.
Critique:
• The statement is accurate as a broad description of how life combats
entropy. However, a reader should remember that the net process
(organism + environment) still increases entropy overall.
• Additionally, “low local entropy” might be reworded to “high degree of
order,” since “low entropy” and “high order” are sometimes more
intuitive phrasing in discussions about life’s internal maintenance.
________________________________________
4. “Evolving life decreases low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.”
Commentary:
• Evolutionary theory does suggest that in many lineages, complexity and
functional organization can increase over time. Natural selection, in a
“ratchet-like” manner, preserves advantageous variations and discards
harmful ones, cumulatively leading to higher organizational states in
certain contexts (e.g., the evolution of multicellularity, complex organ
systems, etc.).
• This ties an informational notion of entropy (where more complex,
well-ordered genetic or phenotypic structures might be considered “lower
entropy”) to evolutionary mechanisms.
Critique:
• Saying “decreases low local entropy” can be confusing, because in
strict thermodynamic terms, even highly complex organisms still produce
substantial entropy in their surroundings. The statement seems to lean
more on informational or organizational entropy (sometimes discussed in
evolutionary biology).
• Evolution does not always lead to increased complexity. Some lineages
undergo simplification (e.g., parasites lose functions). Thus, the
directionality toward greater complexity is not universal, but where it
happens, selection can indeed ratchet complexity upward.
________________________________________
5. “There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.”
Commentary:
• Natural selection is indeed the main evolutionary mechanism that
systematically drives adaptive increases in complexity or organized
function. Neutral drift, by contrast, is random with respect to fitness
and can lead to loss of function or accumulation of “junk” over time,
which some might call an increase in entropy or disorder.
• From a purely adaptive standpoint, selection is the only consistent
force that pushes complexity upward (when complexity correlates with
higher fitness).
Critique:
• The statement might be too strong if taken literally. Gene
duplication, horizontal gene transfer, and other genetic processes can
contribute to increased complexity, but they typically rely on selection
to retain those beneficial changes over time. Hence, these additional
mechanisms plus selection shape complexity.
• Neutral drift does not always just increase entropy—sometimes neutral
or nearly neutral changes become exapted for new functions, later
co-opted by selection. So, while drift by itself does not drive
functional complexity, in combination with selection it can still
produce complexity.
________________________________________
Final Thoughts
Overall, the statements highlight key ideas about how living systems can
locally decrease entropy (in the sense of increasing organization) and
how evolution—powered by natural selection—can lead to increased
functional complexity. The main points to clarify or refine involve:
1. Distinguishing local from global entropy (to avoid thermodynamic
misunderstandings).
2. Differentiating between thermodynamic entropy and
informational/organizational concepts of entropy (commonly used in
evolutionary and systems biology).
3. Emphasizing that natural selection works best when combined with
other genetic mechanisms (drift, mutation, recombination, etc.) to
create new variations, even though selection is the primary driver for
preserving complex, lower-entropy (i.e., more organized) states in
living systems.
These statements, with minor refinements, provide a good conceptual
framework for understanding how open systems (like organisms) maintain
and even extend internal order over time, despite the universal increase
of entropy in closed systems.
MarkE
2025-01-02 11:31:20 UTC
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
PS The efficacy of natural selection acting on random mutations to
ratchet complexity and reduce entropy is of course a separate topic of
debate, but assume it for this discussion.
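The assumed ratchet can be sketched with a toy simulation (a minimal
Wright-Fisher-style model; the bit-string genomes, the "function = number
of 1-bits" fitness measure, and all parameters are invented for
illustration only): the same random mutation process is run twice, once
with fitness-proportional reproduction and once with uniform (neutral)
parent choice.

```python
import random

def evolve(genome_len=60, pop=100, gens=150, mu=0.01, select=True, seed=1):
    """Toy Wright-Fisher sketch. Genomes are bit-strings; 'function' is the
    number of 1-bits. With select=True, parents are drawn in proportion to
    function (selection as a ratchet); with select=False, parents are drawn
    uniformly (pure neutral drift). Returns the final mean function."""
    rng = random.Random(seed)
    population = [[0] * genome_len for _ in range(pop)]
    for _ in range(gens):
        if select:
            # Fitness-proportional parent choice: more 1-bits, more offspring.
            weights = [1 + sum(g) for g in population]
            parents = rng.choices(population, weights=weights, k=pop)
        else:
            # Neutral drift: every genome is equally likely to reproduce.
            parents = rng.choices(population, k=pop)
        next_gen = []
        for p in parents:
            child = p[:]
            for i in range(genome_len):
                if rng.random() < mu:
                    child[i] ^= 1  # random mutation, no bias toward function
            next_gen.append(child)
        population = next_gen
    return sum(sum(g) for g in population) / pop

mean_selected = evolve(select=True)
mean_drift = evolve(select=False)
```

Because both runs share the identical mutation process, any gap between
the two means comes from selection retaining favourable variants, not
from mutation supplying direction.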
RonO
2025-01-02 14:17:08 UTC
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
All of this doesn't matter. The second law of thermodynamics does not
prohibit the origin of life, nor does it prohibit the evolution of life
over billions of years that it has been evolving on this planet. Being
wrong about your concepts like "neutral drift" doesn't matter because
you can't get to where you want to go with this argument. Just think,
drift obviously does not have to be neutral to selection. Drift can
obviously decrease your concept of entropy.

Entropy is always increasing whether there is an energy capture method
or not. As the entropy increases it just produces something like
molecules that can exist for a while before contributing to the
continued entropy increase. Entropy is increasing a lot as photons are
captured by plants and in their efforts to make glucose.
MarkE
2025-01-03 12:59:37 UTC
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
All of this doesn't matter.  The second law of thermodynamics does not
prohibit the origin of life, nor does it prohibit the evolution of life
over billions of years that it has been evolving on this planet.  Being
wrong about your concepts like "neutral drift" doesn't matter because
you can't get to where you want to go with this argument.  Just think,
drift obviously does not have to be neutral to selection.  Drift can
obviously decrease your concept of entropy.
Entropy is always increasing whether there is an energy capture method
or not.  As the entropy increases it just produces something like
molecules that can exist for a while before contributing to the
continued entropy increase.  Entropy is increasing a lot as photons are
captured by plants and in their efforts to make glucose.
Further reading is warranted, e.g. this passage:

Relationship to prebiotic chemistry

In 1924 Alexander Oparin suggested that sufficient energy for generating
early life forms from non-living molecules was provided in a "primordial
soup".[31] The laws of thermodynamics impose some constraints on the
earliest life-sustaining reactions that would have emerged and evolved
from such a mixture. Essentially, to remain consistent with the second
law of thermodynamics, self organizing systems that are characterized by
lower entropy values than equilibrium must dissipate energy so as to
increase entropy in the external environment.[32] One consequence of
this is that low entropy or high chemical potential chemical
intermediates cannot build up to very high levels if the reaction
leading to their formation is not coupled to another chemical reaction
that releases energy. These reactions often take the form of redox
couples, which must have been provided by the environment at the time of
the origin of life.[33] In today's biology, many of these reactions
require catalysts (or enzymes) to proceed, which frequently contain
transition metals. This means identifying both redox couples and metals
that are readily available in a given candidate environment for
abiogenesis is an important aspect of prebiotic chemistry.

https://en.wikipedia.org/wiki/Entropy_and_life
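The coupling constraint in that passage is just additivity of reaction
free energies; as a textbook sketch (the ATP figure below is the commonly
quoted standard value, not from the quoted article):

```latex
% An entropy-lowering (endergonic) step proceeds only when coupled to an
% exergonic one so that the sum is negative:
\Delta G_{\mathrm{total}}
  = \Delta G_{\mathrm{uphill}} + \Delta G_{\mathrm{downhill}} < 0.
% In modern biology the downhill partner is often ATP hydrolysis
% (\Delta G^{\circ\prime} \approx -30.5\ \mathrm{kJ\,mol^{-1}});
% prebiotically, environmental redox couples are proposed to have
% played the donor role.
```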
RonO
2025-01-03 15:14:47 UTC
Post by MarkE
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation
and processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
All of this doesn't matter.  The second law of thermodynamics does not
prohibit the origin of life, nor does it prohibit the evolution of
life over billions of years that it has been evolving on this planet.
Being wrong about your concepts like "neutral drift" doesn't matter
because you can't get to where you want to go with this argument.
Just think, drift obviously does not have to be neutral to selection.
Drift can obviously decrease your concept of entropy.
Entropy is always increasing whether there is an energy capture method
or not.  As the entropy increases it just produces something like
molecules that can exist for a while before contributing to the
continued entropy increase.  Entropy is increasing a lot as photons
are captured by plants and in their efforts to make glucose.
Relationship to prebiotic chemistry
In 1924 Alexander Oparin suggested that sufficient energy for generating
early life forms from non-living molecules was provided in a "primordial
soup".[31] The laws of thermodynamics impose some constraints on the
earliest life-sustaining reactions that would have emerged and evolved
from such a mixture. Essentially, to remain consistent with the second
law of thermodynamics, self organizing systems that are characterized by
lower entropy values than equilibrium must dissipate energy so as to
increase entropy in the external environment.[32] One consequence of
this is that low entropy or high chemical potential chemical
intermediates cannot build up to very high levels if the reaction
leading to their formation is not coupled to another chemical reaction
that releases energy. These reactions often take the form of redox
couples, which must have been provided by the environment at the time of
the origin of life.[33] In today's biology, many of these reactions
require catalysts (or enzymes) to proceed, which frequently contain
transition metals. This means identifying both redox couples and metals
that are readily available in a given candidate environment for
abiogenesis is an important aspect of prebiotic chemistry.
https://en.wikipedia.org/wiki/Entropy_and_life
"Oparin suggested that sufficient energy for generating early life forms
from non-living molecules was provided in the "primordial soup"".

How do you think that animals generate body heat? All of this is
accounted for.

Probably more of an issue is as the high potential chemical
intermediates build up you have to maintain a low enough concentration
of them (use them for something) so that the reverse reaction that made
them does not occur and undo the effort to make them. Catalysts can
usually run the reaction both ways, and the direction of the reaction is
dependent on the concentration of starting material and product. Using
the products for other chemical reactions takes care of the issue stated
above. Spontaneous reactions release free energy and result in a net
increase in entropy. A lot of energy is wasted when ATP is made. The
chemical energy in ATP is less than what it took to make it, and when
ATP is used the efficiency of the reaction is not 100%, so even more is
lost ("released"). Life still works.

As your quote indicates, just the geothermal energy available would have
been enough. It looks like that, after the genetic code evolved and
proteins could be the catalytic molecules, life first relied on the
salts (that spew out of geothermal vents) as an energy source (they were
chemotrophs). Life was likely chemotrophic before the genetic code
evolved. The last paper on the common ancestor of all extant lifeforms
admitted that its ancestors were likely chemotrophs, but that that
common ancestor was either photosynthetic or on the way to evolving to
be photosynthetic, because some of the components needed for
photosynthesis already existed in that common ancestor. Reversion to
chemotrophs would have had to occur in both the eubacterial and archaeal
lineages. Anaerobic photosynthesis evolved first. One eubacterium
evolved oxygen-generating photosynthesis, but archaea only evolved
anaerobic photosynthesis. Extant life relies mostly on the energy of
the sun, but chemotrophic ecologies still exist around geothermal vents.

What does any of this matter? Would any god involved in dissipating the
energy be the god described in the Bible? If you give up on the Bible
having anything to say about nature why would any god be needed for
abiogenesis? Does the Biblical god open up the firmament to let the
rain fall through? Denton is a Biblical creationist and he will tell
you that his god only needed to get the ball rolling with the Big Bang
and the rest unfolded into what we have. There really isn't any valid
Biblical reason why life did not originate on this planet by natural
means. Both Behe and Denton are telling you that biological evolution
is just a fact of nature. Behe just claims that his god is a tweaker
and messes with life every once in a while, but Denton thinks that that
is unnecessary.

Ron Okimoto
erik simpson
2025-01-02 15:26:08 UTC
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
In an open system, you can do anything, from extracting to inputting energy.
Ernest Major
2025-01-02 18:13:51 UTC
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaptation, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
--
alias Ernest Major
RonO
2025-01-02 21:20:20 UTC
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaptation, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
Behe admitted in his responses to his critics at the turn of the century
(over two decades ago) that some irreducibly complex systems could
evolve by natural means. He started to claim that his type of
irreducible systems had something special that made them impossible to
evolve, or, at least, very difficult to evolve. I recall that his own
example of an IC system that could evolve by chance was the lever and
fulcrum. It just had 3 parts, but if you took away any of the parts it
would lose its function. A tree branch falling between two rocks could
make such an IC system. For biological systems it would be two proteins
that were doing their jobs in the cell, but a third protein might evolve
that brought the two initial proteins together and produce a new
function. Initially he claimed that his concept of "well matched" parts
was the important distinction for IC systems that could or could not
evolve by natural means, but he gave up on that because he never could
define "well matched" so that it could be quantified in order to
determine if any IC system had enough to be Behe's type of IC system.
Eventually Behe gave up on multiple part IC systems, and started to
claim that 3 neutral mutations that had to occur within a limited period
of time in order to create a new function would be his type of IC
system. Mutations in a single protein could fit the bill, but Behe
never found any examples. As stupid as it may be, he started denigrating
the systems that had been identified that had evolved with two neutral
mutations by claiming that those systems were "on the edge of
evolution". He admitted that natural mechanisms did account for these
examples, but that evolution would never do better than that, and
produce systems that required 3 neutral mutations. It was a stupid
argument because it just meant that Behe has never found his type of IC
system existing in nature.

So early on Behe had admitted that the mechanisms his detractors had put
forward for evolving IC systems could do that, but he would not give up
on the concept, and just tried to add things to it, but was never able
to verify that any of the additions had ever been involved with the
evolution of any of Behe's IC systems. He was never able to verify that
his blood clotting, adaptive immune system, nor flagellum ever needed
enough "well matching" to matter, and he never identified his 3 neutral
mutations involved with any of those systems.

IC has been a total failure for any of Behe's attempts to keep it alive.

Ron Okimoto
MarkE
2025-01-03 12:24:44 UTC
Reply
Permalink
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."

More precisely, if a population fixes neutral and near-neutral mutations
through drift, with no selection acting, the net effect over time will
be devolution, i.e. a loss of information and functional complexity. The
end state will be extinction.

Does this necessarily mean entropy will increase? It would seem so.
Kerr-Mudd, John
2025-01-03 12:52:40 UTC
Reply
Permalink
On Fri, 3 Jan 2025 23:24:44 +1100
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
Over time selection always operates, there's rarely a free lunch.
Post by MarkE
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
I don't think you "get" evolution; if it's a neutral change then it
might survive, if it's detrimental then it won't [unless there's a
compensating benefit], if it's beneficial (given lack of meteorites,
global warming, ice ages, changes to feedstock, changes to predators,
etc.) then it'll survive.
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
Is this God anti-entropy? How come there's a lot of it about?
--
Bah, and indeed, Humbug
MarkE
2025-01-03 13:08:34 UTC
Reply
Permalink
Post by Kerr-Mudd, John
On Fri, 3 Jan 2025 23:24:44 +1100
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
Over time selection always operates, there's rarely a free lunch.
Post by MarkE
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
I don't think you "get" evolution; if it's a neutral change then it
might survive, if it's detrimental then it won't [unless there's a
compensating benefit], if it's beneficial then (given lack of
meteorites, global warming, ice-ages changes to feedstock, changes to
predators, etc etc) then it'll survive.
Wrong. Near-neutral (i.e. mildly detrimental) changes by definition have
a very low selection coefficient and therefore typically will not be
removed by selection.
Post by Kerr-Mudd, John
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
Is this God anti-entropy? How come there's a lot of it about?
The entropy of the universe as a closed system can only decrease.
Therefore, where did the initial low entropy state of the universe come
from?
MarkE
2025-01-03 13:11:21 UTC
Reply
Permalink
Post by MarkE
Post by Kerr-Mudd, John
On Fri, 3 Jan 2025 23:24:44 +1100
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
Over time selection always operates, there's rarely a free lunch.
Post by MarkE
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
I don't think you "get" evolution; if it's a neutral change then it
might survive, if it's detrimental then it won't [unless there's a
compensating benefit], if it's beneficial then (given lack of
meteorites, global warming, ice-ages changes to feedstock, changes to
predators, etc etc) then it'll survive.
Wrong. Near-neutral (i.e. mildly detrimental changes) by definition have
a very low selection co-efficient and therefore typically will not be
removed by selection.
Post by Kerr-Mudd, John
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
Is this God anti-entropy? How come there's a lot of it about?
The entropy of the universe as a closed system can only decrease.
Therefore, where did the initial low entropy state of the universe come
from?
Correction: "can only increase"
Ernest Major
2025-01-03 16:05:49 UTC
Reply
Permalink
Post by MarkE
Wrong. Near-neutral (i.e. mildly detrimental changes) by definition have
a very low selection co-efficient and therefore typically will not be
removed by selection.
As an allele approaches a selection coefficient of zero the chance of
fixation (and therefore also elimination) approaches 50%. The chance of
a mildly detrimental change being removed (by the combination of drift
and selection) remains greater than 0.5.
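The fixation-probability claim above can be checked numerically with
Kimura's diffusion approximation (a standard population-genetics result,
not taken from this thread). A minimal sketch in Python, assuming a
diploid population of effective size N:

```python
import math

def fixation_probability(p, N, s):
    """Kimura's diffusion approximation: probability that an allele at
    frequency p with selection coefficient s eventually fixes in a
    diploid population of effective size N."""
    if abs(4 * N * s) < 1e-9:  # neutral limit: P_fix equals the frequency p
        return p
    return (1 - math.exp(-4 * N * s * p)) / (1 - math.exp(-4 * N * s))

# An allele at 50% frequency: as s -> 0 its fixation chance approaches
# 50%, and even a mildly detrimental allele retains a real chance of
# fixing by drift.
for s in (0.001, 0.0, -0.001):
    print(f"s={s:+.3f}  P_fix={fixation_probability(0.5, 1000, s):.3f}")
```

For s = -0.001 and N = 1000 the fixation probability is about 0.12, i.e.
a mildly detrimental allele is removed more often than not, but far from
always — consistent with the point about the combined action of drift
and selection.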

If you want to consider genetic drift acting alone, you have to not only
eliminate natural selection, but also mutation and other sources of
variation. In this case a population will evolve to eliminate genetic
variation and then enter stasis.
--
alias Ernest Major
MarkE
2025-01-04 13:00:31 UTC
Reply
Permalink
Post by Ernest Major
Post by MarkE
Wrong. Near-neutral (i.e. mildly detrimental changes) by definition
have a very low selection co-efficient and therefore typically will
not be removed by selection.
As an allele approaches a selection coefficient of zero the chance of
fixation (and therefore also elimination) approaches 50%. The chance of
a mildly detrimental change being removed (by the combination of drift
and selection) remains greater than 0.5.
If you want to consider genetic drift acting alone, you have to not only
eliminate natural selection, but also mutation and other sources of
variation. In this case a population will evolve to eliminate genetic
variation and then enter stasis.
Population genetics is an interesting and difficult area (not
necessarily disputing your statement above).

In pondering this issue, I wrote a program to simulate a population
(~10,000) subject to sexual reproduction, with chromosomes,
recombination etc.

I realised that it was in part a kind of reverse-engineering
investigation of what distributions of selection coefficients would lead
to fixation, extinction etc. E.g., empirically estimated distributions
concentrate the majority of mutations just less than 0 (near-neutral
deleterious), some as -1 (lethal), and a very small proportion just
above 0 (near-neutral beneficial).

I'm still working on it. Tricky to be sure that assumptions and
simulation are valid.
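A stripped-down version of such a simulation (haploid, one biallelic
locus, no chromosomes or recombination — far simpler than the diploid
program described above, so purely illustrative; all parameter names are
mine) might look like:

```python
import random

def wright_fisher(N, p0, s, max_gen=10_000, seed=42):
    """One biallelic locus under Wright-Fisher drift plus selection.
    The mutant allele starts at frequency p0 with selection coefficient
    s; each generation N offspring are resampled from the gene pool.
    Returns 'fixed', 'lost', or 'segregating'."""
    rng = random.Random(seed)
    p = p0
    for _ in range(max_gen):
        # Selection shifts the expected frequency deterministically...
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
        # ...then drift: binomial resampling of N individuals.
        p = sum(rng.random() < p_sel for _ in range(N)) / N
        if p == 0.0:
            return "lost"
        if p == 1.0:
            return "fixed"
    return "segregating"

# A single new near-neutral mutant (p0 = 1/N) is usually lost to drift.
print(wright_fisher(N=1000, p0=1 / 1000, s=-0.0001))
```

Running many replicates with different seeds and values of s is one way
to recover, empirically, the fixation rates that the diffusion
approximation predicts.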
Kerr-Mudd, John
2025-02-07 14:49:35 UTC
Reply
Permalink
On Sat, 4 Jan 2025 00:08:34 +1100
Post by MarkE
Post by Kerr-Mudd, John
On Fri, 3 Jan 2025 23:24:44 +1100
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
Over time selection always operates, there's rarely a free lunch.
Post by MarkE
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
I don't think you "get" evolution; if it's a neutral change then it
might survive, if it's detrimental then it won't [unless there's a
compensating benefit], if it's beneficial then (given lack of
meteorites, global warming, ice-ages changes to feedstock, changes to
predators, etc etc) then it'll survive.
Wrong. Near-neutral (i.e. mildly detrimental changes) by definition have
a very low selection co-efficient and therefore typically will not be
removed by selection.
Post by Kerr-Mudd, John
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
Is this God anti-entropy? How come there's a lot of it about?
The entropy of the universe as a closed system can only decrease.
Therefore, where did the initial low entropy state of the universe come
from?
Only proper He God could have done it. But does He also have to keep
poking away at evolution to get to send a message to some wayward tribe
in the Middle East? Wondrous ways or wot?

Obviously his Chosen People are not quite in control back home, but
they're getting there. Not sure there's a heap of neighbourly love
going on though. I blame Moses. Some JWs claim World Peace is possible,
dunno how they square that with current affairs.
--
Bah, and indeed, Humbug
Rufus Ruffian
2025-01-03 13:38:05 UTC
Reply
Permalink
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Capture and transform into what, something useful?
Who defines what's useful? It's subjective, isn't it?

The sun beats down on surface rocks in the daytime. They get hot. At
night, the heat spreads downward and evens out the temperature, and
entropy is reclaimed. The entropic tide rises and falls, in a relative
way. Who needs to capture and transform anything?
Post by MarkE
Post by Ernest Major
Post by MarkE
Extant life *maintains* low local entropy through its organisation and
processes.
Life blows through energy like a hungry kid in a mcdonalds, leaving a
trail of entropy in its wake. Even green plants do so.

You are taking the old "entropy = disorder" meme too seriously. It's
just a simplistic conceptualization. The change in entropy is the heat
reversibly transferred divided by the system temperature. Details at
<https://en.wikipedia.org/wiki/Entropy>
Post by MarkE
Post by Ernest Major
Post by MarkE
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
How many joules per kelvin are there in "functional complexity and
organization"?
Post by MarkE
Post by Ernest Major
Post by MarkE
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
[citation needed]

Life generates the entropy it's going to generate, without regard for
anthropomorphic concepts like drift and neutrality, let alone
complexity, let alone "intelligence". As far as thermodynamics is
concerned, those are not even things. Think joules per kelvin, old
buddy.
Post by MarkE
Post by Ernest Major
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
It's good to see a man who can be wrong with confidence.
Post by MarkE
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
If devolution happens, it's not exactly neutral, is it?

If a population extincts itself, then mucho selection has occurred, has
it not?

Again, how many joules per kelvin are consumed by the loss of
"information"?
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
No. Entropy increases because that's what entropy does. It doesn't care
that remarkable life forms are constructed along the way.
MarkE
2025-01-04 12:44:26 UTC
Reply
Permalink
Post by Rufus Ruffian
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Capture and transform into what, something useful?
Who defines what's useful? It's subjective, isn't it?
The sun beats down on surface rocks in the daytime. They get hot. At
night, the heat spreads downward and evens out the temperature, and
entropy is reclaimed. The entropic tide rises and falls, in a relative
way. Who needs to capture and transform anything?
Post by MarkE
Post by Ernest Major
Post by MarkE
Extant life *maintains* low local entropy through its organisation and
processes.
Life blows through energy like a hungry kid in a mcdonalds, leaving a
trail of entropy in its wake. Even green plants do so.
You are taking the old "entropy = disorder" meme too seriously. It's
just a simplistic conceptualization. Entropy is the change in system
energy divided by the system temperature. Details at
<https://en.wikipedia.org/wiki/Entropy>
Post by MarkE
Post by Ernest Major
Post by MarkE
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
How many joules per kelvin are there in "functional complexity and
organization"?
Post by MarkE
Post by Ernest Major
Post by MarkE
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
[citation needed]
Life generates the entropy it's going to generate, without regard for
anthropomorphic concepts like drift and neutrality, let alone
complexity, let alone "intelligence". As far as thermodynamics is
concerned, those are not even things. Think joules per kelvin, old
buddy.
Post by MarkE
Post by Ernest Major
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
It's good to see a man who can be wrong with confidence.
Post by MarkE
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
If devolution happens, it's not exactly neutral, is it?
You're confusing the relative neutrality of a near-neutral mutation with
a cumulative population effect over time.
Post by Rufus Ruffian
If a population extincts itself, then mucho selection has occurred, has
it not?
Again, how many joules per kelvin are consumed by the loss of
"information"?
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
No. Entropy increases because that's what entropy does. It doesn't care
than remarkable life forms are constructed along the way.
Universally, of course. Locally, not necessarily. Would you agree that
evolution produces a local decrease in entropy?
Rufus Ruffian
2025-01-04 16:47:33 UTC
Reply
Permalink
Post by MarkE
Post by Rufus Ruffian
Again, how many joules per kelvin are consumed by the loss of
"information"?
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
No. Entropy increases because that's what entropy does. It doesn't care
than remarkable life forms are constructed along the way.
Universally, of course. Locally, not necessarily. Would you agree that
evolution produces a local decrease in entropy?
No, because it doesn't. My whole point was that you (like most
creationists) fundamentally and perhaps deliberately misunderstand the
whole 2nd law concept. Apparently the point whooshed you.

So again, how many joules per kelvin are consumed by the loss (or gain)
of "information"?
MarkE
2025-01-05 05:51:02 UTC
Reply
Permalink
Post by Rufus Ruffian
Post by MarkE
Post by Rufus Ruffian
Again, how many joules per kelvin are consumed by the loss of
"information"?
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
No. Entropy increases because that's what entropy does. It doesn't care
than remarkable life forms are constructed along the way.
Universally, of course. Locally, not necessarily. Would you agree that
evolution produces a local decrease in entropy?
No, because it doesn't. My whole point was that you (like most
creationists) fundamentally and perhaps deliberately misunderstand the
whole 2nd law concept. Apparently the point whooshed you.
So again, how many joules per kelvin are consumed by the loss (or gain)
of "information"?
At least 2.9×10^-21 joules per bit.

That is, if Landauer's principle is applicable*, "which states that the
minimum energy needed to erase one bit of information is proportional to
the temperature at which the system is operating. Specifically, the
energy needed for this computational task is given by

E ≥ kB.T.ln(2)
where
kB is the Boltzmann constant and
T is the temperature in Kelvin"

https://en.wikipedia.org/wiki/Landauer%27s_principle

* Not certain if/how; nevertheless I am intrigued by the relationship
between information and energy quantified by Landauer's principle.
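As a sanity check on the figure quoted above, the Landauer bound can be
evaluated directly (a Python sketch; 300 K room temperature is my
assumed operating point):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_bound(temp_kelvin):
    """Minimum energy in joules to erase one bit: E >= kB * T * ln 2."""
    return K_B * temp_kelvin * math.log(2)

print(landauer_bound(300.0))  # ~2.87e-21 J, i.e. the ~2.9e-21 J per bit quoted
```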


This topic is difficult and sometimes misunderstood by both sides IMO,
unintentionally or otherwise. My own understanding is incomplete - happy
for an open discussion.


RELATIONSHIP TO PREBIOTIC CHEMISTRY

"Essentially, to remain consistent with the second law of
thermodynamics, self organizing systems that are characterized by lower
entropy values than equilibrium must dissipate energy so as to increase
entropy in the external environment.[32] One consequence of this is that
low entropy or high chemical potential chemical intermediates cannot
build up to very high levels if the reaction leading to their formation
is not coupled to another chemical reaction that releases energy. These
reactions often take the form of redox couples, which must have been
provided by the environment at the time of the origin of life."
https://en.wikipedia.org/wiki/Entropy_and_life

That is, an ensemble of organised pre/proto-life molecules (e.g. a
protocell) constitutes localised low entropy. To be sure, the surrounding
environment pays the price for this with an even greater increase in
entropy.


CONFIGURATION ENTROPY

"Configuration entropy can be calculated using the Boltzmann entropy
equation:
S = kB.ln(W)

where:
S is entropy
kB is the Boltzmann constant
W is the number of possible microstates (configurations)
consistent with the macrostate

Microstates and Macrostates:

Microstate: A specific arrangement of the system's components.
Macrostate: The overall state of the system, described by observable
properties like temperature, pressure, or composition.
A macrostate with a larger number of possible microstates has higher
configuration entropy." (ChatGPT-4o; also
https://en.wikipedia.org/wiki/Configuration_entropy)

For an ensemble of molecules that could form a living entity, the
"nonliving macrostate" is composed of many more possible microstates
(Wn) than the "living macrostate" (Wl). E.g., if say Wn/Wl > 10^10, then
the local configuration entropy *decrease* to go from nonliving to
living is

Sn - Sl = kB(ln(Wn) - ln(Wl))
= kB.ln(Wn/Wl)
= kB.ln(10^10)
= 3.2E-22 joules per kelvin

Which incidentally could be used to show the improbability of the
spontaneous formation of life (not suggesting that is being claimed).
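The arithmetic in that configuration-entropy example can be reproduced
in a few lines (Python; the 10^10 microstate ratio is the post's own
illustrative assumption, not an empirical value):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def entropy_drop(w_nonliving_over_living):
    """S_n - S_l = kB * ln(W_n / W_l): the configuration-entropy
    decrease in going from the nonliving to the living macrostate."""
    return K_B * math.log(w_nonliving_over_living)

print(entropy_drop(1e10))  # ~3.2e-22 J/K, matching the worked figure above
```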
DB Cates
2025-01-04 16:51:01 UTC
Reply
Permalink
Post by MarkE
Post by Rufus Ruffian
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Capture and transform into what, something useful?
Who defines what's useful?  It's subjective, isn't it?
The sun beats down on surface rocks in the daytime. They get hot. At
night, the heat spreads downward and evens out the temperature, and
entropy is reclaimed. The entropic tide rises and falls, in a relative
way. Who needs to capture and transform anything?
Post by MarkE
Post by Ernest Major
Post by MarkE
Extant life *maintains* low local entropy through its organisation and
processes.
Life blows through energy like a hungry kid in a mcdonalds, leaving a
trail of entropy in its wake. Even green plants do so.
You are taking the old "entropy = disorder" meme too seriously. It's
just a simplistic conceptualization. Entropy is the change in system
energy divided by the system temperature. Details at
<https://en.wikipedia.org/wiki/Entropy>
Post by MarkE
Post by Ernest Major
Post by MarkE
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
How many joules per kelvin are there in "functional complexity and
organization"?
Post by MarkE
Post by Ernest Major
Post by MarkE
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
[citation needed]
Life generates the entropy it's going to generate, without regard for
anthropomorphic concepts like drift and neutrality, let alone
complexity, let alone "intelligence".  As far as thermodynamics is
concerned, those are not even things. Think joules per kelvin, old
buddy.
Post by MarkE
Post by Ernest Major
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
It's good to see a man who can be wrong with confidence.
Post by MarkE
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
If devolution happens, it's not exactly neutral, is it?
You're confusing the relative neutrality of a near-neutral mutation with
a cumulative population effect over time.
Post by Rufus Ruffian
If a population extincts itself, then mucho selection has occurred, has
it not?
Again, how many joules per kelvin are consumed by the loss of
"information"?
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
No.  Entropy increases because that's what entropy does. It doesn't care
than remarkable life forms are constructed along the way.
Universally, of course. Locally, not necessarily. Would you agree that
evolution produces a local decrease in entropy?
I would not agree with that. You need to discard the idea that 'disorder
means higher entropy'. My favourite example is rust. Iron rust has a
lower "local" entropy than the iron+oxygen that combine to produce it.
(see https://www2.oberlin.edu/physics/dstyer/P111/EntropyRust.pdf)
Evolution changes aspects of life but doesn't (locally) reduce entropy.
Life (locally) reduces entropy but at the expense of greatly increasing
entropy less locally. Think of life as God's way of speeding up the heat
death of the universe.
--
Don Cates ("he's a cunning rascal" PN)
MarkE
2025-01-05 06:13:12 UTC
Reply
Permalink
Post by DB Cates
Post by MarkE
Post by Rufus Ruffian
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Capture and transform into what, something useful?
Who defines what's useful?  It's subjective, isn't it?
The sun beats down on surface rocks in the daytime. They get hot. At
night, the heat spreads downward and evens out the temperature, and
entropy is reclaimed. The entropic tide rises and falls, in a relative
way. Who needs to capture and transform anything?
Post by MarkE
Post by Ernest Major
Post by MarkE
Extant life *maintains* low local entropy through its organisation and
processes.
Life blows through energy like a hungry kid in a mcdonalds, leaving a
trail of entropy in its wake. Even green plants do so.
You are taking the old "entropy = disorder" meme too seriously. It's
just a simplistic conceptualization. Entropy is the change in system
energy divided by the system temperature. Details at
<https://en.wikipedia.org/wiki/Entropy>
Post by MarkE
Post by Ernest Major
Post by MarkE
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and
organisation.
How many joules per kelvin are there in "functional complexity and
organization"?
Post by MarkE
Post by Ernest Major
Post by MarkE
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
[citation needed]
Life generates the entropy it's going to generate, without regard for
anthropomorphic concepts like drift and neutrality, let alone
complexity, let alone "intelligence".  As far as thermodynamics is
concerned, those are not even things. Think joules per kelvin, old
buddy.
Post by MarkE
Post by Ernest Major
It is difficult to operationalise the concept of irreducible complexity,
as that necessitates a principled definition of system, part and
function. But if you pass over that point, there are at least three
classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity and
organisation with a decrease in entropy, then this would negate a claim
that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
It's good to see a man who can be wrong with confidence.
Post by MarkE
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
If devolution happens, it's not exactly neutral, is it?
You're confusing the relative neutrality of a near-neutral mutation
with a cumulative population effect over time.
Post by Rufus Ruffian
If a population extincts itself, then mucho selection has occurred, has
it not?
Again, how many joules per kelvin are consumed by the loss of
"information"?
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
No.  Entropy increases because that's what entropy does. It doesn't care
that remarkable life forms are constructed along the way.
Universally, of course. Locally, not necessarily. Would you agree that
evolution produces a local decrease in entropy?
I would not agree with that. You need to discard the idea that 'disorder
means higher entropy'. My favourite example is rust. Iron rust has a
lower "local" entropy than the iron+oxygen that combine to produce it.
(see https://www2.oberlin.edu/physics/dstyer/P111/EntropyRust.pdf)
Evolution changes aspects of life but doesn't (locally) reduce entropy.
Life (locally) reduces entropy but at the expense of greatly increasing
entropy less locally. Think of life as God's way of speeding up the heat
death of the universe.
That's interesting: a cautionary example regarding assumptions about
entropy changes. Do you know what response the letter received?

Regardless, we seem to be in agreement that life increases entropy locally.

You're welcome to comment on my recent "exploratory" response to Rufus
Ruffian on this:
snews://news.eternal-september.org:563/vld6k5$t7a6$***@dont-email.me
Ernest Major
2025-01-03 15:57:35 UTC
Reply
Permalink
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation
and processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible
complexity, as that necessitates a principled definition of system,
part and function. But if you pass over that point, there are at least
three classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity
and organisation with a decrease in entropy, then this would negate a
claim that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
While chopping evolutionary processes into 2 categories (variation and
differential reproductive success), or 4 categories (mutation, gene
flow, selection and drift), or more, is useful for explaining the
overall process, it is necessary to consider the processes in concert
when evaluating the capabilities of evolution. While I consider the
claim that neutral drift alone increases disorder to be at best
unproven, it is a diversion from the question as to the contribution of
neutral drift to constructive disorder. (For example does neutral drift,
by opening up a greater amount of sequence space, make natural selection
more effective?)
Post by MarkE
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
Consider a neutral change that causes a protein to attach to another
protein, without affecting its functionality. What is to prevent a
further series of neutral changes resulting in it being unable to
perform its function unless attached to the second protein. Is that not
an increase in complexity?
Post by MarkE
Does this necessarily mean entropy will increase? It would seem so.
There are proposals that in the presence of energy flows matter arranges
itself into structures (such as life) that increase the rate of entropy
production.
--
alias Ernest Major
RonO
2025-01-03 15:56:16 UTC
Reply
Permalink
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation
and processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible
complexity, as that necessitates a principled definition of system,
part and function. But if you pass over that point, there are at least
three classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity
and organisation with a decrease in entropy, then this would negate a
claim that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
Does this necessarily mean entropy will increase? It would seem so.
It would be non neutral drift that would likely be associated with
increased disorder. Even if something is selected against it can still
be fixed in a small population by factors not directly affecting the
trait under selection. 90% of a population might be wiped out by some
disease and the survivors may just, by chance, have a high frequency of
some deleterious allele that might get fixed by random chance.

Neutral drift is just neutral changes; it shouldn't result in any
increase in disorder, because changes that increase disorder are
selected against. Look at the coelacanth: it has been adapted to a
specific environment for
hundreds of millions of years, and has changed very little, and yet
genetic drift in their DNA sequence and physical features invisible to
the environment have changed, but the outward physical appearance has
not degenerated. Drift has occurred even as the morphology has been
maintained. The skull may make nearly identical fossil impressions, but
when you look at the skull you observe that all the bones that make up
the skull have very different sizes and shapes, but still make the same
overall skull shape. This is neutral drift.

Drift can result in loss of gene functions that are no longer needed in
certain environments. The eyes in cave fish are an example, and even
that isn't neutral drift because the loss of eye function has selective
advantage because of the energy consumption of tissue that is no longer
needed. Pigmentation loss in caves may be a better example, but
watching the nature shows and observing all the deep sea fish that
retain their eyes and bright colorations make me think twice about it.

Ron Okimoto
jillery
2025-01-04 11:44:30 UTC
Reply
Permalink
Post by RonO
Post by MarkE
Post by Ernest Major
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease
low local entropy: an energy capture and transformation mechanism is
also needed.
Extant life *maintains* low local entropy through its organisation
and processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that
does this. For example, neutral drift alone increases entropy.
It is difficult to operationalise the concept of irreducible
complexity, as that necessitates a principled definition of system,
part and function. But if you pass over that point, there are at least
three classes of paths (exaption, scaffolding, coevolution) whereby
irreducibly complex systems can evolve. I suspect that the last is the
most frequent, and that it can be driven by drift as well as by
selection. If you are equating an increase in functional complexity
and organisation with a decrease in entropy, then this would negate a
claim that neutral drift always increases entropy.
What I would say more confidently is, "For example, neutral drift alone
increases disorder."
More precisely, if a population fixes neutral and near-neutral mutations
over time through drift, with no selection acting, the net effect over
time will be devolution, i.e. a loss of information and functional
complexity. The end state will be extinction.
Does this necessarily mean entropy will increase? It would seem so.
It would be non neutral drift that would likely be associated with
increased disorder. Even if something is selected against it can still
be fixed in a small population by factors not directly affecting the
trait under selection. 90% of a population might be wiped out by some
disease and the survivors may just, by chance, have a high frequency of
some deleterious allele that might get fixed by random chance.
Neutral drift is just neutral changes, it shouldn't result in any
increase in disorder because those changes are selected against. Look
at the coelacanth it has been adapted to a specific environment for
hundreds of millions of years, and has changed very little, and yet
genetic drift in their DNA sequence and physical features invisible to
the environment have changed, but the outward physical appearance has
not degenerated. Drift has occurred even as the morphology has been
maintained. The skull may make nearly identical fossil impressions, but
when you look at the skull you observe that all the bones that make up
the skull have very different sizes and shapes, but still make the same
overall skull shape. This is neutral drift.
Drift can result in loss of gene functions that are no longer needed in
certain environments. The eyes in cave fish are an example, and even
that isn't neutral drift because the loss of eye function has selective
advantage because of the energy consumption of tissue that is no longer
needed. Pigmentation loss in caves may be a better example, but
watching the nature shows and observing all the deep sea fish that
retain their eyes and bright colorations make me think twice about it.
Ron Okimoto
AIUI eyes in lightless environments are disadvantageous, in part
becasue they're energy expensive as you say, but also in part because
they're easily damaged, and also in part because the brainpower needed
to make them work can be applied to other working senses. OTOH eyes
in the deep ocean are still useful, where many species create their
own light in order to find members of the opposite sex.
--
To know less than we don't know is the nature of most knowledge
a***@littlepinkcloud.invalid
2025-01-03 21:08:10 UTC
Reply
Permalink
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Not necessarily. A rotating planet around a star is bathed in
low-entropy photons during the day and radiates high-entropy photons
during the night.
Post by MarkE
Extant life *maintains* low local entropy through its organisation and
processes.
That's not it. Extant life *uses* a source of low entropy. The sun's
energy is high-frequency low-entropy (visible light), allowing life to
consume that and give off waste energy as heat, which is radiated away
as infra-red.

There's a nice quote from Discover magazine:

https://www.discovermagazine.com/the-sciences/evolution-and-the-second-law

"The energy we get from the Sun is of a low-entropy, useful form,
while the energy we radiate back out into space has a much higher
entropy. The temperature of the Sun is about twenty times the average
temperature of the Earth. The temperature of radiation is just the
average energy of the photons of which it is made, so the Earth needs
to radiate twenty low-energy (long-wavelength, infrared) photons for
every one high-energy (short-wavelength, visible) photon it receives.
It turns out, after a bit of math, that twenty times as many photons
directly translates into twenty times the entropy. The Earth emits the
same amount of energy as it receives, but with twenty times higher
entropy."
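The 20:1 figure in the quote can be checked with round numbers. A rough
sketch, assuming blackbody temperatures of about 5800 K for the Sun and
290 K for the Earth (assumed values, not from the quote):

```python
# Mean photon energy of thermal radiation scales with temperature,
# so to re-radiate the same total energy at Earth's temperature,
# the Earth must emit roughly T_sun / T_earth photons for every
# solar photon it absorbs.
T_sun = 5800.0    # K, solar photosphere (assumed)
T_earth = 290.0   # K, Earth's effective temperature (assumed)

photon_ratio = T_sun / T_earth
print(photon_ratio)  # 20.0 -> ~20x the photons, hence ~20x the entropy
```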

Andrew.
MarkE
2025-01-04 11:32:14 UTC
Reply
Permalink
Post by a***@littlepinkcloud.invalid
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Not necessarily. A rotating planet around a star is bathed in
low-entropy photons during the day and radiates high-entropy photons
during the night.
Post by MarkE
Extant life *maintains* low local entropy through its organisation and
processes.
That's not it. Extant life *uses* a source of low entropy. The sun's
energy is high-frequency low-entropy (visible light), allowing life to
consume that and give off waste energy as heat, which is radiated away
as infra-red.
https://www.discovermagazine.com/the-sciences/evolution-and-the-second-law
"The energy we get from the Sun is of a low-entropy, useful form,
while the energy we radiate back out into space has a much higher
entropy. The temperature of the Sun is about twenty times the average
temperature of the Earth. The temperature of radiation is just the
average energy of the photons of which it is made, so the Earth needs
to radiate twenty low-energy (long-wavelength, infrared) photons for
every one high-energy (short-wavelength, visible) photon it receives.
It turns out, after a bit of math, that twenty times as many photons
directly translates into twenty times the entropy. The Earth emits the
same amount of energy as it receives, but with twenty times higher
entropy."
Andrew.
No, isn't the energy we get from the Sun Gibbs free energy, G, where

G = H − TS

where H is the enthalpy, T is the absolute temperature, and S is the
entropy?

https://en.wikipedia.org/wiki/Thermodynamic_free_energy
a***@littlepinkcloud.invalid
2025-01-05 08:12:55 UTC
Reply
Permalink
Post by MarkE
Post by a***@littlepinkcloud.invalid
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Not necessarily. A rotating planet around a star is bathed in
low-entropy photons during the day and radiates high-entropy photons
during the night.
Post by MarkE
Extant life *maintains* low local entropy through its organisation and
processes.
That's not it. Extant life *uses* a source of low entropy. The sun's
energy is high-frequency low-entropy (visible light), allowing life to
consume that and give off waste energy as heat, which is radiated away
as infra-red.
https://www.discovermagazine.com/the-sciences/evolution-and-the-second-law
"The energy we get from the Sun is of a low-entropy, useful form,
while the energy we radiate back out into space has a much higher
entropy. The temperature of the Sun is about twenty times the average
temperature of the Earth. The temperature of radiation is just the
average energy of the photons of which it is made, so the Earth needs
to radiate twenty low-energy (long-wavelength, infrared) photons for
every one high-energy (short-wavelength, visible) photon it receives.
It turns out, after a bit of math, that twenty times as many photons
directly translates into twenty times the entropy. The Earth emits the
same amount of energy as it receives, but with twenty times higher
entropy."
No, isn't the energy we get from the Sun Gibbs free energy, G, where
G = H − TS
where H is the enthalpy, T is the absolute temperature, and S is the
entropy?
The useful energy is, yes. I was talking about how it got to be like
that.

But as long as you've got a plentiful supply of useful low-entropy
energy, the planet's heat distribution can be highly nonuniform. For
example, a valley might be consistently hotter than the average
temperature, as might the temperature at the poles be lower. Weather,
too, will produce non-uniform states. It doesn't require any kind of
energy capture and transformation mechanism.

Andrew.
LDagget
2025-01-05 09:08:08 UTC
Reply
Permalink
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
There's so much fuzzy language in there that replying is bound to
explode into an extended attempt to clean up the language enough for
one to understand the claims in scientific terms.

So let's bypass that, besides, it's been done before.

Instead, consider this. The broad objection is that creationists or
their
bedfellows try to claim that evolution is somehow a decrease in entropy.
Such a claim is superficially nonsensical. Here's why.

Evolution is a result. In a simple example of adaption, an initial
population of bacteria begin with an enzyme that is effective against
one antibiotic but has very low efficacy against a related antibiotic.
These things work such that at very low concentrations of the antibiotic
the low efficacy is sufficient to allow the bacteria to keep growing
but at higher concentrations bacterial cell wall growth is inhibited
and the bacteria stop growing.

So the usual processes take place because they can't be stopped.
Imperfect DNA replication occurs, mutations of the antibiotic
resistance enzyme
occur, most have little effect but occasionally there are mutations that
result in increased activity against the related antibiotic (a new
drug).
So now bacteria with the mutation can grow faster than those without
the mutation. The process repeats, the gene pool evolves to have more
and more of the antibiotic resistance gene with higher efficacy and
that's evolution.

Now where and how do people claim that is a violation of the 2nd law?

Let's help with that. The process involves millions of bacteria growing,
reproducing, and sometimes dying. The process is the sum of all of these
events. Each and every cell consumed food, and metabolized it. The sum
of their life processes can be cartooned like the metabolism of glucose
C6H12O6 + 6 O2 ==> 6 CO2 + 6 H2O This reaction increases entropy.
It still increases net entropy when coupled to charging ADP to ATP.
The sum of all of the chemical reactions that have to take place for a
cell to grow and replicate represents a relentless increase in entropy
comparing the reactants to the products.
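The glucose argument above can be made quantitative with a
back-of-envelope calculation. A sketch using approximate textbook
values for glucose oxidation (the dH and dG figures are assumptions,
not from this post):

```python
# C6H12O6 + 6 O2 -> 6 CO2 + 6 H2O at ~298 K
dH = -2808.0   # kJ/mol, enthalpy change (approximate textbook value)
dG = -2870.0   # kJ/mol, Gibbs free energy change (approximate)
T = 298.15     # K

# From G = H - T*S: the entropy change of the reacting system itself
dS_system = (dH - dG) * 1000 / T        # J/(mol*K), positive

# Heat released to the surroundings raises their entropy too
dS_surroundings = -dH * 1000 / T        # J/(mol*K), large and positive

dS_total = dS_system + dS_surroundings
print(f"system {dS_system:.0f}, total {dS_total:.0f} J/(mol*K)")
```

Both terms come out positive, which is the point being made: each round
of metabolism is a net entropy increase, mutation or no mutation.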

Now how do the creationists claim that summing up all of these millions
of positive increases in entropy represents a decrease in entropy?

That's the essence of their nonsensical claim. At each and every step
along the way, all the chemical reactions are increasing entropy.
Entropy increases when DNA is replicated. There's negligible difference
between a perfect copy of a gene and a copy with a mutation. Either
way, it's an increase in entropy.

The only way that evolution can be considered a decrease in entropy is
thus revealed to be by a failure to look at the actual processes
involved
and to instead resort to hand waving about loosey-goosey attempts to
invoke related concepts like __disorder__.
MarkE
2025-01-07 11:58:24 UTC
Reply
Permalink
Post by LDagget
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
There's so much fuzzy language in there that replying is bound to
explode
things into an extended attempt to us clean up the language enough  for
one to understand the claims in scientific language.
So let's bypass that, besides, it's been done before.
Instead, consider this. The broad objection is that creationists or
their
bedfellows try to claim that evolution is somehow a decrease in entropy.
Such a claim is superficially nonsensical. Here's why.
Evolution is a result. In a simple example of adaption, an initial
population of bacteria begin with an enzyme that is effective against
one antibiotic but has very low efficacy against a related antibiotic.
These things work such that at very low concentrations of the antibiotic
the low efficacy is sufficient to allow the bacteria to keep growing
but at higher concentrations bacterial cell wall growth is inhibited
and the bacteria stop growing.
so the usual processes take place because they can't be stopped.
Imperfect
DNA replication occurs, mutations of the antibiotic resistance enzyme
occur, most have little effect but occasionally there are mutations that
result in increased activity against the related antibiotic (a new
drug).
So now bacteria with the mutation can grow faster than those without
the mutation. The process repeats, the gene pool evolves to have more
and more of the antibiotic resistance gene with higher efficacy and
that's evolution.
Now where and how do people claim that is a violation of the 2nd law?
Let's help with that. The process involves millions of bacteria growing,
reproducing, and sometimes dying. The process is the sum of all of these
events. Each and every cell consumed food, and metabolized it. The sum
of their life processes can be cartooned like the metabolism of glucose
C6H12O6 + 6 O2 ==> 6 CO2 + 6 H2O    This reaction increases entropy.
It still increases net entropy when coupled to charging ADP to ATP.
The sum of all of the chemical reactions that have to take place for a
cell to grow and replicate represents a relentless increase in entropy
comparing the reactants to the products.
Now how do the creationists claim that summing up all of these millions
of positive increases in entropy represents a decrease in entropy?
That's the essence of their nonsensical claim. At each and every step
along the way, all the chemical reactions are increasing entropy.
Entropy increases when DNA is replicated. There's negligible difference
between a perfect copy of a gene and a copy with a mutation. Either
way, it's an increase in entropy.
The only way that evolution can be considered a decrease in entropy is
thus revealed to be by a failure to look at the actual processes
involved
and to instead resort to hand waving about loosey-goosey attempts to
invoke related concepts like __disorder__.
Entropy is a difficult concept, especially its application to OoL and
evolution. I get the frustration. My own understanding is incomplete.
The examples you give point to the question of where exactly the
"local" boundary of the claimed entropy decrease lies.

What about this approach as starting point. Take configuration entropy,
which relates to the arrangements or positions of components within a
system, but ignores energy distributions, i.e. it's a subset of
thermodynamic entropy, and can be calculated for systems where the
spatial distribution dominates the behavior. Configuration entropy can
be calculated using the Boltzmann entropy equation:

S = kB.ln(W)

where:
S is the entropy
kB is the Boltzmann constant
W is the number of possible microstates (configurations)
consistent with the macrostate

Microstate: A specific arrangement of the system's components.
Macrostate: The overall state of the system, described by observable
properties like temperature, pressure, or composition.

A macrostate with a larger number of possible microstates has higher
configuration entropy. (ChatGPT-4o; also
https://en.wikipedia.org/wiki/Configuration_entropy)

For an ensemble of molecules that could form a living entity, the
"nonliving macrostate" is composed of many more possible microstates
(Wn) than the "living macrostate" (Wl). E.g., if say Wn/Wl = 10^10, then
the local configuration entropy *decrease* to go from nonliving to living is

Sn - Sl = kB(ln(Wn) - ln(Wl))
= kB.ln(Wn/Wl)
= kB.ln(10^10)
= 3.2E-22 joules per kelvin
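As a quick check of that arithmetic, a minimal sketch using the
CODATA value of the Boltzmann constant and the assumed ratio
Wn/Wl = 10^10 from above:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
W_ratio = 1e10       # assumed ratio Wn/Wl of nonliving to living microstates

# Sn - Sl = kB * ln(Wn/Wl)
delta_S = k_B * math.log(W_ratio)
print(f"{delta_S:.2e} J/K")  # ~3.2e-22 J/K, matching the figure above
```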

Now, assuming that a similar result would apply for the other component
of thermodynamic entropy (i.e. energy distributions), then a protocell
or living cell represents a local region of greatly reduced entropy. The
process of abiogenesis has given rise to a region of reduced entropy
(albeit at the expense of an overall entropy increase in the environment).

And...so what, that's surely a given. You wouldn't dispute that, would you?

Therefore, the issue is: are the mechanisms available to, say, OoL
capable of driving this required entropy reduction? There does not
appear to be a simple yes/no answer. Are redox couples, proton
gradients, etc. sufficient, and in what form?

"In 1924 Alexander Oparin suggested that sufficient energy for
generating early life forms from non-living molecules was provided in a
"primordial soup".[31] The laws of thermodynamics impose some
constraints on the earliest life-sustaining reactions that would have
emerged and evolved from such a mixture. Essentially, to remain
consistent with the second law of thermodynamics, self organizing
systems that are characterized by lower entropy values than equilibrium
must dissipate energy so as to increase entropy in the external
environment.[32] One consequence of this is that low entropy or high
chemical potential chemical intermediates cannot build up to very high
levels if the reaction leading to their formation is not coupled to
another chemical reaction that releases energy. These reactions often
take the form of redox couples, which must have been provided by the
environment at the time of the origin of life.[33] In today's biology,
many of these reactions require catalysts (or enzymes) to proceed, which
frequently contain transition metals. This means identifying both redox
couples and metals that are readily available in a given candidate
environment for abiogenesis is an important aspect of prebiotic chemistry.

"The idea that processes that can occur naturally in the environment and
act to locally decrease entropy must be identified has been applied in
examinations of phosphate's role in the origin of life, where the
relevant setting for abiogenesis is an early Earth lake environment. One
such process is the ability of phosphate to concentrate reactants
selectively due to its localized negative charge.[34]

"In the context of the alkaline hydrothermal vent (AHV) hypothesis for
the origin of life, a framing of lifeforms as "entropy generators" has
been suggested in an attempt to develop a framework for abiogenesis
under alkaline deep sea conditions. Assuming life develops rapidly under
certain conditions, experiments may be able to recreate the first
metabolic pathway, as it would be the most energetically favorable and
therefore likely to occur. In this case, iron sulfide compounds may have
acted as the first catalysts.[35] Therefore, within the larger framing
of life as free energy converters, it would eventually be beneficial to
characterize quantities such as entropy production and proton gradient
dissipation rates quantitatively for origin of life relevant systems
(particularly AHVs)."

https://en.wikipedia.org/wiki/Entropy_and_life

As far as I understand then, it does seem to be an oversimplification to
say the development of life (origin and evolution) violates the second
law. On the other hand, a supply of free energy is not in and of itself
sufficient to explain life.

However, I suspect the ratio of the number of possible microstates of
the nonliving macrostate (Wn) to that of the living macrostate (Wl) is
grossly underestimated, along with the corresponding
difficulty/improbability of matter self-organising into life. A research
avenue would be to quantify and analyse this further.
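As a side note on that research avenue: using the Boltzmann relation S = kB.ln(W) quoted elsewhere in this thread, the entropy decrease depends only logarithmically on the microstate ratio, so even a grossly underestimated Wn/Wl changes the figure only modestly. A minimal Python sketch (the exponents are illustrative, not estimates):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_drop(exponent):
    """Sn - Sl = k_B * ln(Wn/Wl) for a microstate ratio Wn/Wl = 10**exponent.
    Computed as exponent * ln(10) so huge exponents don't overflow."""
    return k_B * exponent * math.log(10)

# Raising the ratio by 990 orders of magnitude only scales the answer by 100x:
for e in (10, 100, 1000):
    print(f"Wn/Wl = 10^{e}: entropy decrease ~ {entropy_drop(e):.2e} J/K")
```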
LDagget
2025-01-07 17:07:53 UTC
Reply
Permalink
Post by MarkE
Post by LDagget
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
There's so much fuzzy language in there that replying is bound to
explode things into an extended attempt to clean up the language enough
for one to understand the claims in scientific language.
So let's bypass that, besides, it's been done before.
Instead, consider this. The broad objection is that creationists or
their
bedfellows try to claim that evolution is somehow a decrease in entropy.
Such a claim is superficially nonsensical. Here's why.
Evolution is a result. In a simple example of adaption, an initial
population of bacteria begin with an enzyme that is effective against
one antibiotic but has very low efficacy against a related antibiotic.
These things work such that at very low concentrations of the antibiotic
the low efficacy is sufficient to allow the bacteria to keep growing
but at higher concentrations bacterial cell wall growth is inhibited
and the bacteria stop growing.
so the usual processes take place because they can't be stopped.
Imperfect
DNA replication occurs, mutations of the antibiotic resistance enzyme
occur, most have little effect but occasionally there are mutations that
result in increased activity against the related antibiotic (a new
drug).
So now bacteria with the mutation can grow faster than those without
the mutation. The process repeats, the gene pool evolves to have more
and more of the antibiotic resistance gene with higher efficacy and
that's evolution.
Now where and how do people claim that is a violation of the 2nd law?
Let's help with that. The process involves millions of bacteria growing,
reproducing, and sometimes dying. The process is the sum of all of these
events. Each and every cell consumed food, and metabolized it. The sum
of their life processes can be cartooned like the metabolism of glucose
C6H12O6 + 6 O2 ==> 6 CO2 + 6 H2O    This reaction increases entropy.
It still increases net entropy when coupled to charging ADP to ATP.
The sum of all of the chemical reactions that have to take place for a
cell to grow and replicate represents a relentless increase in entropy
comparing the reactants to the products.
Now how do the creationists claim that summing up all of these millions
of positive increases in entropy represents a decrease in entropy?
That's the essence of their nonsensical claim. At each and every step
along the way, all the chemical reactions are increasing entropy.
Entropy increases when DNA is replicated. There's negligible difference
between a perfect copy of a gene and a copy with a mutation. Either
way, it's an increase in entropy.
The only way that evolution can be considered a decrease in entropy is
thus revealed to be by a failure to look at the actual processes
involved
and to instead resort to hand waving about loosey-goosey attempts to
invoke related concepts like __disorder__.
Entropy is a difficult concept, especially its application to OoL and
evolution. I get the frustration. My own understanding is incomplete.
The examples you give point to a question of where exactly is the
"local" boundary of the claimed entropy decrease?
It doesn't need to be difficult. It's possible to make it seem
difficult, and you work hard at that below.
Post by MarkE
What about this approach as starting point. Take configuration entropy,
which relates to the arrangements or positions of components within a
system, but ignores energy distributions, i.e. it's a subset of
thermodynamic entropy, and can be calculated for systems where the
spatial distribution dominates the behavior. Configuration entropy can
It has already been established in my example that the process of
evolution produces a large increase in entropy. So you attempt a
"let's ignore what we know and try to wave our hands and pretend
otherwise." Seriously, that's what you're doing.
Post by MarkE
𝑆 = 𝑘𝐵.ln(𝑊)
𝑆 is Entropy
𝑘𝐵 is the Boltzmann constant
𝑊 is the Number of possible microstates (configurations)
consistent with the macrostate
Microstate: A specific arrangement of the system's components.
Macrostate: The overall state of the system, described by observable
properties like temperature, pressure, or composition.
A macrostate with a larger number of possible microstates has higher
configuration entropy." (ChatGPT-4o; also
https://en.wikipedia.org/wiki/Configuration_entropy)
For an ensemble of molecules that could form a living entity, the
"nonliving macrostate" is composed of many more possible microstates
(Wn) than the "living macrostate" (Wl). E.g., if say Wn/Wl = 10^10, then
the local configuration entropy *decrease* to go from nonliving to living is
Sn - Sl = kB(ln(Wn) - ln(Wl))
= kB.ln(Wn/Wl)
= kB.ln(10^10)
= 3.2E-22 joules per kelvin
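For what it's worth, the quoted arithmetic reproduces as stated; a short check in Python (kB is the standard Boltzmann constant, and Wn/Wl = 10^10 is the ratio assumed in the post above):

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
W_ratio = 1e10            # assumed ratio Wn/Wl from the post above

# Sn - Sl = k_B * (ln(Wn) - ln(Wl)) = k_B * ln(Wn/Wl)
delta_S = k_B * math.log(W_ratio)
print(f"{delta_S:.1e} J/K")  # 3.2e-22 J/K, matching the quoted figure
```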
Thus we get an AI regurgitating things it doesn't understand. I rather
doubt you understand all this either or you wouldn't be wasting your own
and other people's time like this.
Post by MarkE
Now, assuming that a similar result would apply for the other component
of thermodynamic entropy (i.e. energy distributions), then a protocell
or living cell represents a local region of greatly reduced entropy. The
process of abiogenesis has given rise to a region of reduced entropy
(albeit at the expense of an overall entropy increase in the
environment).
And...so what, that's surely a given. You wouldn't dispute that, would you?
I dispute that it's remotely relevant.
Here's a simple illustration. You know about the fact that complementary
DNA strands fold up into the double helix. It's well known that paired
strands of DNA can be melted apart at higher temps but that they will
re-anneal when the temp is dropped, or if complementary pairs are
re-introduced to each other below the melting temperature.

It should be obvious that two independent strands have vastly higher
conformational entropy than the two of them paired up in a matched
double helix. And yet --- the folding is spontaneous and represents a
decrease in entropy. The conformational entropy may seem hugely
decreased to you in concept. And if you run a statistical-mechanics-style
calculation and estimate a W (as above) for the number of states
available for different conformations, you can model the effect on just
the DNA strands.
The commensurate change in the solvent, in the other direction, happens
to overwhelm that of the DNA strands. Broadly speaking, this is the
hydrophobic effect. Understanding it is essential to understanding
biochem.

The point? Babble away all you want about conformational entropy in DNA
strands, the process of forming the mated double helix is a spontaneous
process that increases entropy. There's no violation of the 2nd law.
Introducing some discussion about the conformational entropy of just the
DNA strands that ignores the entropy of the solvent interacting with the
DNA is a way to hide the truth of what happens in solution.
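The hybridization argument can be sketched numerically. The dH and dS values below are illustrative order-of-magnitude figures for a short DNA duplex (assumptions, not measurements); the point is only the sign structure: the strands' entropy falls, yet dG = dH - T*dS is negative, and the total entropy change -dG/T (system plus surroundings) is positive.

```python
T = 298.0        # K, room temperature
dH = -80_000.0   # J/mol: favorable enthalpy of pairing/stacking (illustrative)
dS = -220.0      # J/(mol*K): unfavorable loss of strand entropy (illustrative)

dG = dH - T * dS          # Gibbs free energy of duplex formation
dS_total = -dG / T        # system + surroundings entropy change at constant T, P

print(f"dG = {dG:.0f} J/mol")                  # negative => spontaneous
print(f"dS_total = {dS_total:.1f} J/(mol*K)")  # positive => 2nd law satisfied
```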

Don't do that. It's not honest. You were already shown that evolution is
the result of the addition of many increases in entropy. Retorting with
"but but but, whatabout ...."
is nonsensical deflection. Don't do that. Whatabout-isms are no better
in science than they are in politics.


But you won't let it go, will you? Very well then.
We can compute, using W as in the above equations, the difference in
entropy between DNA synthesis that produces a specific product strand
and synthesis that produces a completely random strand. It's a large
change in W when, instead of just one base being added at each position,
any of 4 can be added. In fact W becomes 4^N, where N is the length of
the polymer synthesized. So we can compare 1^N (also known as 1) to 4^N.

And what happens? It's minuscule compared to the entropy change from the
hydrolysis of the reactant nucleotide triphosphates.
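Rough per-nucleotide numbers make that comparison concrete. Here R*ln(4) is the molar sequencing entropy per position, and -30.5 kJ/mol is a common textbook figure for nucleoside-triphosphate hydrolysis, used only as an assumption for scale:

```python
import math

R = 8.314              # gas constant, J/(mol*K)
T = 298.0              # K

# Molar configurational entropy of specifying one base out of four:
dS_sequence = R * math.log(4)       # ~11.5 J/(mol*K) per position

# Entropy exported to the surroundings per nucleotide incorporated,
# approximating dS_surr ~ -dG/T with dG ~ -30.5 kJ/mol for hydrolysis:
dS_surroundings = 30_500.0 / T      # ~102 J/(mol*K) per nucleotide

# Both terms scale with strand length N, so the ratio is length-independent:
print(f"hydrolysis term / sequencing term ~ {dS_surroundings / dS_sequence:.1f}")
```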

An AI is dumb enough to be able to tell you how to calculate the
entropy difference between synthesizing a random DNA strand of length N
and a specific DNA strand of length N. But it doesn't know enough to
tell you that it was, in fact, a stupid question if you're worried that
the process is problematic with respect to the 2nd law of
thermodynamics. It doesn't understand.

Try to be better and understand.
Post by MarkE
Therefore, the issue is, are the mechanisms available to say OoL capable
of driving this required entropy reduction? There does not appear to be
a simple yes/no answer. Are redox couples, proton gradients, etc.
sufficient, and in what form?
"In 1924 Alexander Oparin suggested that sufficient energy for
generating early life forms from non-living molecules was provided in a
"primordial soup".[31] The laws of thermodynamics impose some
constraints on the earliest life-sustaining reactions that would have
emerged and evolved from such a mixture. Essentially, to remain
consistent with the second law of thermodynamics, self organizing
systems that are characterized by lower entropy values than equilibrium
must dissipate energy so as to increase entropy in the external
environment.[32] One consequence of this is that low entropy or high
chemical potential chemical intermediates cannot build up to very high
levels if the reaction leading to their formation is not coupled to
another chemical reaction that releases energy. These reactions often
take the form of redox couples, which must have been provided by the
environment at the time of the origin of life.[33] In today's biology,
many of these reactions require catalysts (or enzymes) to proceed, which
frequently contain transition metals. This means identifying both redox
couples and metals that are readily available in a given candidate
environment for abiogenesis is an important aspect of prebiotic chemistry.
"The idea that processes that can occur naturally in the environment and
act to locally decrease entropy must be identified has been applied in
examinations of phosphate's role in the origin of life, where the
relevant setting for abiogenesis is an early Earth lake environment. One
such process is the ability of phosphate to concentrate reactants
selectively due to its localized negative charge.[34]
"In the context of the alkaline hydrothermal vent (AHV) hypothesis for
the origin of life, a framing of lifeforms as "entropy generators" has
been suggested in an attempt to develop a framework for abiogenesis
under alkaline deep sea conditions. Assuming life develops rapidly under
certain conditions, experiments may be able to recreate the first
metabolic pathway, as it would be the most energetically favorable and
therefore likely to occur. In this case, iron sulfide compounds may have
acted as the first catalysts.[35] Therefore, within the larger framing
of life as free energy converters, it would eventually be beneficial to
characterize quantities such as entropy production and proton gradient
dissipation rates quantitatively for origin of life relevant systems
(particularly AHVs)."
https://en.wikipedia.org/wiki/Entropy_and_life
As far as I understand then, it does seem to be an oversimplification to
say the development of life (origin and evolution) violates the second
law. On the other hand, a supply of free energy is not in and of itself
sufficient to explain life.
However, I suspect the ratio of the number of possible microstates of
the nonliving macrostate (Wn) to that of the living macrostate (Wl) is
grossly underestimated, along with the corresponding
difficulty/improbability of matter self-organising into life. A research
avenue would be to quantify and analyse this further.
MarkE
2025-01-10 12:18:44 UTC
Reply
Permalink
Post by LDagget
Post by MarkE
Post by LDagget
Post by MarkE
Are these statements correct? Could they be better expressed?
Local entropy can decrease in an open system with an input of free energy.
Free energy alone is not sufficient to maintain or further decrease low
local entropy: an energy capture and transformation mechanism is also
needed.
Extant life *maintains* low local entropy through its organisation and
processes.
Evolving life *decreases* low local entropy through the ratcheting
mechanism natural selection acting on random mutations in instances
where that evolution increases functional complexity and organisation.
There is no other known mechanism apart from natural selection that does
this. For example, neutral drift alone increases entropy.
There's so much fuzzy language in there that replying is bound to
explode things into an extended attempt to clean up the language enough
for one to understand the claims in scientific language.
So let's bypass that, besides, it's been done before.
Instead, consider this. The broad objection is that creationists or
their
bedfellows try to claim that evolution is somehow a decrease in entropy.
Such a claim is superficially nonsensical. Here's why.
Evolution is a result. In a simple example of adaption, an initial
population of bacteria begin with an enzyme that is effective against
one antibiotic but has very low efficacy against a related antibiotic.
These things work such that at very low concentrations of the antibiotic
the low efficacy is sufficient to allow the bacteria to keep growing
but at higher concentrations bacterial cell wall growth is inhibited
and the bacteria stop growing.
so the usual processes take place because they can't be stopped.
Imperfect
DNA replication occurs, mutations of the antibiotic resistance enzyme
occur, most have little effect but occasionally there are mutations that
result in increased activity against the related antibiotic (a new
drug).
So now bacteria with the mutation can grow faster than those without
the mutation. The process repeats, the gene pool evolves to have more
and more of the antibiotic resistance gene with higher efficacy and
that's evolution.
Now where and how do people claim that is a violation of the 2nd law?
Let's help with that. The process involves millions of bacteria growing,
reproducing, and sometimes dying. The process is the sum of all of these
events. Each and every cell consumed food, and metabolized it. The sum
of their life processes can be cartooned like the metabolism of glucose
C6H12O6 + 6 O2 ==> 6 CO2 + 6 H2O    This reaction increases entropy.
It still increases net entropy when coupled to charging ADP to ATP.
The sum of all of the chemical reactions that have to take place for a
cell to grow and replicate represents a relentless increase in entropy
comparing the reactants to the products.
Now how do the creationists claim that summing up all of these millions
of positive increases in entropy represents a decrease in entropy?
That's the essence of their nonsensical claim. At each and every step
along the way, all the chemical reactions are increasing entropy.
Entropy increases when DNA is replicated. There's negligible difference
between a perfect copy of a gene and a copy with a mutation. Either
way, it's an increase in entropy.
The only way that evolution can be considered a decrease in entropy is
thus revealed to be by a failure to look at the actual processes
involved
and to instead resort to hand waving about loosey-goosey attempts to
invoke related concepts like __disorder__.
Entropy is a difficult concept, especially its application to OoL and
evolution. I get the frustration. My own understanding is incomplete.
The examples you give point to a question of where exactly is the
"local" boundary of the claimed entropy decrease?
It doesn't need to be difficult. It's possible to make it seem
difficult, and you work hard at that below.
Post by MarkE
What about this approach as starting point. Take configuration entropy,
which relates to the arrangements or positions of components within a
system, but ignores energy distributions, i.e. it's a subset of
thermodynamic entropy, and can be calculated for systems where the
spatial distribution dominates the behavior. Configuration entropy can
It has already been established in my example that the process of
evolution produces a large increase in entropy. So you attempt a
"let's ignore what we know and try to wave our hands and pretend
otherwise."  Seriously, that's what you're doing.
Post by MarkE
     𝑆 = 𝑘𝐵.ln(𝑊)
     𝑆 is Entropy
     𝑘𝐵 is the Boltzmann constant
     𝑊 is the Number of possible microstates (configurations)
        consistent with the macrostate
Microstate: A specific arrangement of the system's components.
Macrostate: The overall state of the system, described by observable
properties like temperature, pressure, or composition.
A macrostate with a larger number of possible microstates has higher
configuration entropy." (ChatGPT-4o; also
https://en.wikipedia.org/wiki/Configuration_entropy)
For an ensemble of molecules that could form a living entity, the
"nonliving macrostate" is composed of many more possible microstates
(Wn) than the "living macrostate" (Wl). E.g., if say Wn/Wl = 10^10, then
the local configuration entropy *decrease* to go from nonliving to living is
     Sn - Sl = kB(ln(Wn) - ln(Wl))
             = kB.ln(Wn/Wl)
             = kB.ln(10^10)
             = 3.2E-22 joules per kelvin
Thus we get an AI regurgitating things it doesn't understand. I rather
doubt you understand all this either or you wouldn't be wasting your own
and other people's time like this.
Post by MarkE
Now, assuming that a similar result would apply for the other component
of thermodynamic entropy (i.e. energy distributions), then a protocell
or living cell represents a local region of greatly reduced entropy. The
process of abiogenesis has given rise to a region of reduced entropy
(albeit at the expense of an overall entropy increase in the
environment).
And...so what, that's surely a given. You wouldn't dispute that, would you?
I dispute that it's remotely relevant.
Here's a simple illustration. You know about the fact that complementary
DNA strands fold up into the double helix. It's well known that paired
strands of DNA can be melted apart at higher temps but that they will
re-anneal when the temp is dropped, or if complementary pairs are
re-introduced to each other below the melting temperature.
It should be obvious that two independent strands have vastly higher
conformational entropy than the two of them paired up in a matched
double helix. And yet --- the folding is spontaneous and represents a
decrease in entropy. The conformational entropy may seem hugely
decreased to you in concept. And if you run a statistical-mechanics-style
calculation and estimate a W (as above) for the number of states
available for different conformations, you can model the effect on just
the DNA strands.
The commensurate change in the solvent, in the other direction, happens
to overwhelm that of  the DNA strands. Broadly speaking, this is the
hydrophobic effect. Understanding it is essential to understanding
biochem.
The point? Babble away all you want about conformational entropy in DNA
strands, the process of forming the mated double helix is a spontaneous
process that increases entropy. There's no violation of the 2nd law.
Introducing some discussion about the conformational entropy of just the
DNA strands that ignores the entropy of the solvent interacting with the
DNA is a way to hide the truth of what happens in solution.
Don't do that. It's not honest. You were already shown that evolution is
the result of the addition of many increases in entropy. Retorting with
    "but but but, whatabout ...."
is nonsensical deflection. Don't do that. Whatabout-isms are no better
in science than they are in politics.
But you won't let it go, will you? Very well then.
We can compute, using W as in the above equations, the difference in
entropy between DNA synthesis that produces a specific product strand
and synthesis that produces a completely random strand. It's a large
change in W when, instead of just one base being added at each position,
any of 4 can be added. In fact W becomes 4^N, where N is the length of
the polymer synthesized. So we can compare 1^N (also known as 1) to 4^N.
And what happens? It's minuscule compared to the entropy change from the
hydrolysis of the reactant nucleotide triphosphates.
An AI is dumb enough to be able to tell you how to calculate the
entropy difference between synthesizing a random DNA strand of length N
and a specific DNA strand of length N. But it doesn't know enough to
tell you that it was, in fact, a stupid question if you're worried that
the process is problematic with respect to the 2nd law of
thermodynamics. It doesn't understand.
Try to be better and understand.
As I noted previously, where one draws the system boundary is critical
in this analysis. I agree that the contribution of energy released
during polymerization needs to be considered as well.

However, let's define our initial local system not as a set of
activated nucleotide monomers but as inactivated versions. In this case,
energy must first flow into the system from the surroundings for
activation, and then flow out again during polymerization. I'm not sure
what the net effect of this is precisely, but it is clearly a much
smaller net change to the system, if not close to zero. This then allows
configuration entropy to be significant.
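One caveat worth noting here: entropy and free energy are state functions, so the net change from inactivated monomers to polymer is fixed by the endpoints, however the activation detour is routed. A toy bookkeeping sketch (both step values are hypothetical, chosen only to illustrate the partial cancellation):

```python
# Hypothetical per-bond free-energy steps (J/mol), for illustration only:
dG_activation     = +45_000.0   # energy flows in: monomer -> activated monomer
dG_polymerization = -30_500.0   # energy flows out: activated monomer -> chain

# State-function bookkeeping: the inactivated-monomer -> polymer path
# nets to the sum of the steps, smaller than either step but not zero.
dG_net = dG_activation + dG_polymerization
print(f"net change ~ {dG_net:+.0f} J/mol")
```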
Post by LDagget
Post by MarkE
Therefore, the issue is, are the mechanisms available to say OoL capable
of driving this required entropy reduction? There does not appear to be
a simple yes/no answer. Are redox couples, proton gradients, etc.
sufficient, and in what form?
"In 1924 Alexander Oparin suggested that sufficient energy for
generating early life forms from non-living molecules was provided in a
"primordial soup".[31] The laws of thermodynamics impose some
constraints on the earliest life-sustaining reactions that would have
emerged and evolved from such a mixture. Essentially, to remain
consistent with the second law of thermodynamics, self organizing
systems that are characterized by lower entropy values than equilibrium
must dissipate energy so as to increase entropy in the external
environment.[32] One consequence of this is that low entropy or high
chemical potential chemical intermediates cannot build up to very high
levels if the reaction leading to their formation is not coupled to
another chemical reaction that releases energy. These reactions often
take the form of redox couples, which must have been provided by the
environment at the time of the origin of life.[33] In today's biology,
many of these reactions require catalysts (or enzymes) to proceed, which
frequently contain transition metals. This means identifying both redox
couples and metals that are readily available in a given candidate
environment for abiogenesis is an important aspect of prebiotic chemistry.
"The idea that processes that can occur naturally in the environment and
act to locally decrease entropy must be identified has been applied in
examinations of phosphate's role in the origin of life, where the
relevant setting for abiogenesis is an early Earth lake environment. One
such process is the ability of phosphate to concentrate reactants
selectively due to its localized negative charge.[34]
"In the context of the alkaline hydrothermal vent (AHV) hypothesis for
the origin of life, a framing of lifeforms as "entropy generators" has
been suggested in an attempt to develop a framework for abiogenesis
under alkaline deep sea conditions. Assuming life develops rapidly under
certain conditions, experiments may be able to recreate the first
metabolic pathway, as it would be the most energetically favorable and
therefore likely to occur. In this case, iron sulfide compounds may have
acted as the first catalysts.[35] Therefore, within the larger framing
of life as free energy converters, it would eventually be beneficial to
characterize quantities such as entropy production and proton gradient
dissipation rates quantitatively for origin of life relevant systems
(particularly AHVs)."
https://en.wikipedia.org/wiki/Entropy_and_life
As far as I understand then, it does seem to be an oversimplification to
say the development of life (origin and evolution) violates the second
law. On the other hand, a supply of free energy is not in and of itself
sufficient to explain life.
However, I suspect the ratio of the number of possible microstates of
the nonliving macrostate (Wn) to that of the living macrostate (Wl) is
grossly underestimated, along with the corresponding
difficulty/improbability of matter self-organising into life. A research
avenue would be to quantify and analyse this further.
Ernest Major
2025-01-10 17:23:53 UTC
Reply
Permalink
Post by MarkE
As I noted previously, where one draws the system boundary is critical
in this analysis. I agree that the contribution of energy released
during polymerization needs to be considered as well.
If the result depends on where you draw the boundary, would that not be
an indication that there's something wrong with the analysis?
Post by MarkE
However, let's define our initial local system not as a set of
activated nucleotide monomers but as inactivated versions. In this case,
energy must first flow into the system from the surroundings for
activation, and then flow out again during polymerization. I'm not sure
what the net effect of this is precisely, but it is clearly a much
smaller net change to the system, if not close to zero. This then allows
configuration entropy to be significant.
--
alias Ernest Major
LDagget
2025-01-10 18:42:14 UTC
Reply
Permalink
Post by Ernest Major
Post by MarkE
As I noted previously, where one draws the system boundary is critical
in this analysis. I agree that the contribution of energy released
during polymerization needs to be considered as well.
If the result depends on where you draw the boundary, would that not be
an indication that there's something wrong with the analysis?
Yes.
Bob Casanova
2025-01-11 05:48:16 UTC
Reply
Permalink
On Fri, 10 Jan 2025 18:42:14 +0000, the following appeared
Post by Ernest Major
Post by MarkE
As I noted previously, where one draws the system boundary is critical
in this analysis. I agree that the contribution of energy released
during polymerization needs to be considered as well.
If the result depends on where you draw the boundary, would that not be
an indication that there's something wrong with the analysis?
Yes.
Also, IIRC from my undergrad days, the 2nd Law applies to
all systems. It simply requires that *all* energy exchanges
be included.
--
Bob C.

"The most exciting phrase to hear in science,
the one that heralds new discoveries, is not
'Eureka!' but 'That's funny...'"

- Isaac Asimov