entropy

definitions.. too verbose for my mind.. but this got me:

What is Entropy? – via the science asylum – Nick Lucid

seems like any weird thing we’d measure (energy/momentum)..

1800s – holes in understanding of heat/temp… wasn’t called heat.. called it caloric.. magical fluid that flowed between objects.. no knowledge of atoms

1865 – clausius – inside transformation… and he made it sound like energy on purpose.. then wrote 2nd law of thermo – entropy always increases..

1900s – started to understand atoms.. and started modeling them.. started to understand entropy… nothing extra.. nothing missing…  molecules don’t possess entropy but container of water does

entropy is an emergent property – property when whole is larger than sum of parts…..

entropy emerges when enough molecules in place for statistics to be important

3 min – (old school defn) entropy measures the disorder of the energy of a collection of particles

word disorder can be very subjective…

messy room analogy – items represent the energy.. not the particles … organized room – low entropy… messy room… high entropy.. are you sure.. depends on who lives in room… organized chaos.. if you know where everything is… not disordered because it’s all over (the place).. it’s disordered because the universe itself has lost track of the energy’s origins…

4 min – it’s like the universe has a witness protection program for its energy and we call that entropy…

whoa. huge..

 

perhaps like.. idiosyncratic jargon as protection – and gershenfeld something else as protection..  ps in the open et al..

and how Greg writes about dna junk here.. who’s to say what’s junk..

@Digitaltonto
3/14/16 5:58 AM
To Adapt, You Need To Evolve – digitaltonto.com/2016/to-adapt-…

our junk DNA is just a few random mutations away from becoming something important.

_______________

2 min – aha with the pool game..

and that’s what we want.. what we’ve always wanted.. just haven’t had the means (ie: mech) to facilitate the chaos of unlimited choices… ie: to live out the rev of everyday life..

3 min – 2nd law of thermo.. entropy stays same or increases.. good ie: life..

hot liquid has higher entropy.. hot water molecules have more movement than cold.. therefore more dis ordered.. ice.. less entropy.. constricted movement...

whoa.. hot more movement..

4 min – only way to lower entropy – releasing heat into room thereby increasing total entropy of surrounding environment..

5 min – why live in a universe that keeps tending toward dis order.. many more high entropy (dis order) states available than low entropy ones
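the ‘many more states’ claim is just counting.. a rough python sketch – a toy stand-in (100 particles split between two halves of a box.. my numbers, not the video’s):

```python
# rough sketch: count the ways n gas particles can sit in the two
# halves of a box.. "all on one side" is a single arrangement, an
# even spread is astronomically many (toy numbers, not from the video)
from math import comb

n = 100  # particles

all_on_left = comb(n, 0)       # exactly 1 way to be perfectly "ordered"
even_spread = comb(n, n // 2)  # ways to put half the particles on each side

print(all_on_left)   # 1
print(even_spread)   # 100891344545564193334812497256 (~1e29)
```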

as much as we try to sustain order.. bound to 2nd law of thermo.

_______________

via Hank:

Entropy: Embrace the Chaos! Crash Course Chemistry

infinite other ways…

2nd law: any spontaneous process increases the disorder of the universe

assumed disorder.. no?

processes that don’t increase the disorder.. require work to be done in opposition to the disorder.. in fact often impossible to achieve.. the act of putting one system into order requires others to be dis ordered..

? again.. who’s to say dis order – rather than.. just not my order

2 min – entropy: measure of molecular randomness.. dis order.. helps make chemical reactions possible and helps us predict how much work can be extracted from reactions..

3 min – spontaneous: doesn’t need outside energy – not how quick it happens –

entropy – doesn’t depend on path it took to get to its state

9 min – enthalpy (heat).. entropy (disorder)

12 min – change in entropy depends on how much room molecules have to move around in
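a minimal sketch of that ‘room to move’ idea.. assuming an ideal gas expanding at constant temperature, where ΔS = nR·ln(V2/V1).. amounts made up:

```python
# rough sketch of "more room to move = more entropy".. ideal gas
# expanding at constant temperature: delta_S = n * R * ln(V2 / V1)
# (amounts below are made up)
from math import log

R = 8.314          # J/(mol*K), gas constant
n = 1.0            # mol of gas
v1, v2 = 1.0, 2.0  # any units.. only the ratio matters

delta_s = n * R * log(v2 / v1)
print(round(delta_s, 2), "J/K")  # 5.76 J/K .. doubling the room raises entropy
```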

_______________

adding page because of a lot of things.. but actual day adding it.. because of these tweets:

@asennenov
3/14/16 5:52 AM
“Freeing his time for its more effective exploratory investment is to give man increased wealth.” —R.B.Fuller
“Wealth is anti-entropy at a most exquisite degree of concentration” —R.B.Fuller #degrowth

bucky ness

so.. is wealth anti entropy.. or entropy.. at a ginormous degree…?

then Smári‘s – 4 tweets prior to catching asennenov’s

Smári McCarthy (@smarimc) tweeted at 5:46 AM on Mon, Mar 14, 2016:
I’m fascinated by the way people work with text. Lots of people apply neither rhyme nor reason. Random blocks of text, zero structure, …
(https://twitter.com/smarimc/status/709344905328521217?s=03)

________

entropy

degree of disorder.. lack of order.. who’s deciding this..? ie: is it disorder.. or just that we don’t see/get/grok the structure/order

intro to entropy:

(via wikipedia)

The idea of “irreversibility” is central to the understanding of entropy. Everyone has an intuitive understanding of irreversibility (a dissipative process) – if one watches a movie of everyday life running forward and in reverse, it is easy to distinguish between the two. The movie running in reverse shows impossible things happening – water jumping out of a glass into a pitcher above it, smoke going down a chimney, water in a glass freezing to form ice cubes, crashed cars reassembling themselves, and so on. The intuitive meaning of expressions such as “you can’t unscramble an egg”, “don’t cry over spilled milk” or “you can’t take the cream out of the coffee” is that these are irreversible processes. There is a direction in time by which spilled milk does not go back into the glass.

In thermodynamics, one says that the “forward” processes – pouring water from a pitcher, smoke going up a chimney, etc. – are “irreversible”: they cannot happen in reverse, even though, on a microscopic level, no laws of physics would be violated if they did. All real physical processes involving systems in everyday life, with many atoms or molecules, are irreversible. For an irreversible process in an isolated system, the thermodynamic state variable known as entropy is always increasing. The reason that the movie in reverse is so easily recognized is because it shows processes for which entropy is decreasing, which is physically impossible. In everyday life, there may be processes in which the increase of entropy is practically unobservable, almost zero. In these cases, a movie of the process run in reverse will not seem unlikely. For example, in a 1-second video of the collision of two billiard balls, it will be hard to distinguish the forward and the backward case, because the increase of entropy during that time is relatively small. In thermodynamics, one says that this process is practically “reversible”, with an entropy increase that is practically zero. The statement of the fact that the entropy of the Universe never decreases is found in the second law of thermodynamics.

In a physical system, entropy provides a measure of the amount of thermal energy that cannot be used to do work. In some other definitions of entropy, it is a measure of how evenly energy (or some analogous property) is distributed in a system. Work and heat are determined by a process that a system undergoes, and only occur at the boundary of a system. Entropy is a function of the state of a system, and has a value determined by the state variables of the system.

The concept of entropy is central to the second law of thermodynamics. The second law determines which physical processes can occur. For example, it predicts that the flow of heat from a region of high temperature to a region of low temperature is a spontaneous process – it can proceed along by itself without needing any extra external energy. When this process occurs, the hot region becomes cooler and the cold region becomes warmer. Heat is distributed more evenly throughout the system and the system’s ability to do work has decreased because the temperature difference between the hot region and the cold region has decreased. Referring back to our definition of entropy, we can see that the entropy of this system has increased. Thus, the second law of thermodynamics can be stated to say that the entropy of an isolated system always increases, and such processes which increase entropy can occur spontaneously. The entropy of a system increases as its components have the range of their momentum and/or position increased.

The term entropy was coined in 1865 by the German physicist Rudolf Clausius, from the Greek words en-, “in”, and trope “a turning”, in analogy with energy.

[..]

Traditionally, 20th century textbooks have introduced entropy as order and disorder so that it provides “a measurement of the disorder or randomness of a system“. It has been argued that ambiguities in the terms used (such as “disorder” and “chaos”) contribute to widespread confusion and can hinder comprehension of entropy for most students. A more recent formulation associated with Frank L. Lambert describes entropy as energy dispersal.

[..]

Explanation

The concept of thermodynamic entropy arises from the second law of thermodynamics. By this law of entropy increase it quantifies the reduction in the capacity of a system for change, or determines whether a thermodynamic process may occur; for example, heat always flows from a region of higher temperature to one of lower temperature until temperature becomes uniform.

[..]

Example of increasing entropy

Main article: Disgregation

Ice melting provides an example in which entropy increases in a small system, a thermodynamic system consisting of the surroundings (the warm room) and the entity of glass container, ice, and water which has been allowed to reach thermodynamic equilibrium at the melting temperature of ice.

[..]

It is important to realize that the entropy of the surrounding room decreases less than the entropy of the ice and water increases: …… This is always true in spontaneous events in a thermodynamic system and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than was the initial entropy.

As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of the δQ/T over the continuous range, “at many increments”, in the initially cool to finally warm water can be found by calculus. The entire miniature ‘universe’, i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that ‘universe’ than when the glass of ice and water was introduced and became a ‘system’ within it.
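a rough numeric sketch of that balance.. assumed amounts (100 g of ice, a 20 °C room).. the point is only that the total comes out positive:

```python
# rough arithmetic behind the melting-ice example: the room gives up
# the same heat Q the ice absorbs, but at a higher temperature, so the
# room loses less entropy than the ice/water gains (assumed amounts)
Q = 334.0 * 100        # J .. latent heat of fusion ~334 J/g, times 100 g of ice
T_melt = 273.15        # K, melting ice
T_room = 293.15        # K, a 20 degree C room (assumed)

dS_ice = Q / T_melt    # entropy gained by the ice as it melts
dS_room = -Q / T_room  # entropy lost by the surroundings

print(round(dS_ice, 1), "J/K")            # 122.3
print(round(dS_room, 1), "J/K")           # -113.9
print(round(dS_ice + dS_room, 1), "J/K")  # 8.3 .. net entropy went up
```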

Origins and uses

Originally, entropy was named to describe the “waste heat,” or more accurately, energy loss, from heat engines and other mechanical devices which ..

could never run with 100% efficiency in converting energy into work.

waste heat

energy\ness

Later, the term came to acquire several additional descriptions, as more was understood about the behavior of molecules on the microscopic level. In the late 19th century, the word “disorder” was used by Ludwig Boltzmann in developing statistical views of entropy using probability theory to describe the increased molecular movement on the microscopic level. That was before quantum behavior came to be better understood by Werner Heisenberg and those who followed. Descriptions of thermodynamic (heat) entropy on the microscopic level are found in statistical thermodynamics and statistical mechanics.

For most of the 20th century, textbooks tended to describe entropy as “disorder”, following Boltzmann’s early conceptualisation of the “motional” (i.e. kinetic) energy of molecules. More recently, there has been a trend in chemistry and physics textbooks to describe entropy as energy dispersal. Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.

entropy:

(via wikipedia)

In thermodynamics, entropy (usual symbol S) is a measure of the number of specific realizations or microstates that may realize a thermodynamic system in a defined state specified by macroscopic variables. Most understand entropy as a measure of molecular disorder within a macroscopic system. The second law of thermodynamics states that an isolated system’s entropy never decreases. Such a system spontaneously evolves towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided they increase their environment’s entropy by at least that increment. Since entropy is a state function, the change in entropy of a system is the same for any process with given initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.
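the microstate count turns into an entropy via boltzmann’s S = kB·ln(W).. a rough sketch with toy values of W (mine, not the article’s):

```python
# rough sketch of entropy as a count of microstates.. boltzmann's
# S = k_B * ln(W), with toy values of W (not from the article)
from math import log, comb

k_B = 1.380649e-23           # J/K, boltzmann constant

W_ordered = 1                # a single microstate
W_spread = comb(100, 50)     # ~1e29 microstates for an even spread

print(k_B * log(W_ordered))  # 0.0 .. one microstate means zero entropy
print(k_B * log(W_spread))   # ~9.2e-22 J/K .. more microstates, more entropy
```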

The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

ΔS = ∫ δQrev / T,

where T is the absolute temperature of the system, dividing an incremental reversible transfer of heat into that system (δQrev). (If heat is transferred out the sign would be reversed giving a decrease in entropy of the system.) The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system. The concept of entropy has been found to be generally useful and has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics.
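a minimal numeric sketch of that macroscopic definition.. warming some water (assumed mass and temps) in many small reversible steps and summing δQ/T, checked against the calculus answer m·c·ln(T2/T1):

```python
# rough numeric sketch of delta_S = integral of dQ_rev / T .. warm
# some water from T1 to T2 in many small reversible steps and sum
# dQ / T at each step (mass and temperatures are made up)
from math import log

m = 0.25               # kg of water
c = 4186.0             # J/(kg*K), specific heat of water
T1, T2 = 280.0, 300.0  # K
steps = 100_000

dT = (T2 - T1) / steps
S = 0.0
T = T1
for _ in range(steps):
    dQ = m * c * dT         # small reversible heat transfer
    S += dQ / (T + dT / 2)  # entropy gained in this increment
    T += dT

print(round(S, 2))                     # ~72.2 J/K, by summation
print(round(m * c * log(T2 / T1), 2))  # ~72.2 J/K, by calculus
```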

[..]

In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. Understanding the role of thermodynamic entropy in various processes requires an understanding of how and why that information changes as the system evolves from its initial to its final condition.

It is often said that entropy is an expression of the disorder, or randomness of a system, or of our lack of information about it.
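the ‘lack of information’ reading can be made concrete shannon-style.. a rough sketch over toy distributions (my example, not the article’s):

```python
# rough sketch of entropy as missing information.. shannon's measure:
# the average number of bits needed to pin down which state occurred
# (toy distributions, nothing physical here)
from math import log2

def missing_bits(probs):
    """shannon entropy of a probability distribution, in bits"""
    return sum(-p * log2(p) for p in probs if p > 0)

known = [1.0]                      # state already certain
spread = [0.25, 0.25, 0.25, 0.25]  # four equally likely states

print(missing_bits(known))   # 0.0 bits .. no information missing
print(missing_bits(spread))  # 2.0 bits .. maximum missing information
```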

huge.. or our lack of info about it… isn’t that all of our thinking on dis order and structureless ness.. et al.. disorder/structurelessness.. to who..

[thinking David‘s bits on Jo Freeman.. perhaps not only that we’re missing the potential of being able to scale w/right mechanism.. but that missing it ness is the protection or whatever we need .. in order to break through.. to scale.. to exponentiate to global equity ness]

perhaps.. we let go.. of trying to control/understand it all.. and let it/us dance.

let’s do this first.. free art-ists.

for (blank)’s sake

a nother way

[..]

Entropy is defined for a reversible process and for a system that, at all times, can be treated as being at a uniform state and thus at a uniform temperature. Reversibility is an ideal that some real processes approximate and that is often presented in study exercises. For a reversible process, entropy behaves as a conserved quantity and no change occurs in total entropy. More specifically, total entropy is conserved in a reversible process and not conserved in an irreversible process. One has to be careful about system boundaries.

borders (every border implies the violence of its maintenance) ness

entropy of a system

Entropy is the above-mentioned unexpected and, to some, obscure integral that arises directly from the Carnot cycle. It is reversible heat divided by temperature. It is also, remarkably, a fundamental and very useful function of state.

[..]

The entropy of the thermodynamic system is a measure of how far the equalization has progressed.

[..]

Thermodynamic entropy is a non-conserved state function that is of great importance in the sciences of physics and chemistry. Historically, the concept of entropy evolved in order to explain why some processes (permitted by conservation laws) occur spontaneously while their time reversals (also permitted by conservation laws) do not; systems tend to progress in the direction of increasing entropy. For isolated systems, entropy never decreases. This fact has several important consequences in science: first, it prohibits “perpetual motion” machines; and second, it implies the arrow of entropy has the same direction as the arrow of time. Increases in entropy correspond to irreversible changes in a system, because some energy is expended as waste heat, limiting the amount of work a system can do.

Unlike many other functions of state, entropy cannot be directly observed but must be calculated.

[..]

One dictionary definition of entropy is that it is “a measure of thermal energy per unit temperature that is not available for useful work”. For instance, a substance at uniform temperature is at maximum entropy and cannot drive a heat engine. A substance at non-uniform temperature is at a lower entropy (than if the heat distribution is allowed to even out) and some of the thermal energy can drive a heat engine.
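a minimal sketch of the ‘cannot drive a heat engine’ point.. the carnot bound η = 1 − Tcold/Thot on the fraction of heat that can become work (temperatures made up):

```python
# rough sketch of "uniform temperature can't drive a heat engine"..
# the carnot bound on the fraction of heat convertible to work:
# eta = 1 - T_cold / T_hot (temperatures in kelvin, made up)
def carnot_efficiency(t_hot, t_cold):
    """best possible fraction of heat turned into work"""
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(500.0, 300.0))  # 0.4 .. a difference to exploit
print(carnot_efficiency(300.0, 300.0))  # 0.0 .. uniform temp, no work
```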

A special case of entropy increase, the entropy of mixing, occurs when two or more different substances are mixed. If the substances are at the same temperature and pressure, there will be no net exchange of heat or work – the entropy change will be entirely due to the mixing of the different substances. At a statistical mechanical level, this results from the change in available volume per particle with mixing.
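a rough sketch of that mixing case, assuming ideal mixing of two substances at the same T and P, where ΔSmix = −nR·Σ xᵢ ln xᵢ.. amounts made up:

```python
# rough sketch of ideal entropy of mixing at equal T and P:
# delta_S_mix = -n * R * sum(x_i * ln(x_i)) over mole fractions
# (amounts below are made up)
from math import log

R = 8.314              # J/(mol*K), gas constant
n_a, n_b = 1.0, 1.0    # mol of each substance
n = n_a + n_b
x = [n_a / n, n_b / n] # mole fractions

delta_s_mix = -n * R * sum(xi * log(xi) for xi in x)
print(round(delta_s_mix, 2), "J/K")  # 11.53 .. mixing alone raises entropy
```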

thinking cage – nature ness..

thinking serendip ity ness.. if we could just let go

______

mess\i\ness

antifragile

hello stigmergy

emergent

chaos

small world network

self-organizing

whimsy

chaordic

embracing uncertainty

redefine decision making – disengage from consensus

et al

________

oct 2016 – consciousness could be a side effect of entropy

http://www.sciencealert.com/consciousness-could-be-a-result-of-entropy-say-researchers

Just like the Universe, our brains might be programmed to maximise disorder – similar to the principle of entropy – and our consciousness could simply be a side effect.

[..]

what if consciousness arises naturally as a result of our brains maximising their information content? In other words, what if consciousness is a side effect of our brain moving towards a state of entropy?

Entropy is basically the term used to describe the progression of a system from order to disorder.

[..]

Specifically, they were looking at synchronisation of neurons – whether neurons were oscillating in phase with each other – to figure out whether brain cells were linked or not.

[..]

“We find a surprisingly simple result: normal wakeful states are characterised by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values,” the team writes.

This led the researchers to argue that consciousness could simply be an “emergent property” of a system that’s trying to maximise information exchange.

Before we get too carried away, there are some big limitations to this work – primarily the small sample size. It’s hard to spot any conclusive trends from only nine people, particularly as everyone’s brains responded slightly differently to the various states.

________

Peter Vander Auwera (@petervan) tweeted at 6:10 AM – 6 Jan 2017 :

Tendrils of Mess in our Brains https://t.co/RJpTIWKOkZ via @ribbonfarm (http://twitter.com/petervan/status/817357641974509570?s=17)

And here is the answer: in order for mess to appear, there must be in the component parts of the mess an implication of extreme order, the kind of highly regular order generally associated with human intention. Flat uniform surfaces and printed text imply, promise, or encode a particular kind of order. In mess, this promise is not kept. The implied order is subverted. Often, as in my mess of text and logos above, the implied order is subverted by other, competing orders.

[..]

Mess is only perceptible because it produces in our minds an imaginary order that is missing.

It is as if objects and artifacts send out invisible tendrils into space, saying, “the matter around me should be ordered in some particular way.” The stronger the claim, and the more the claims of component pieces conflict, the more there is mess. It is these invisible, tangled tendrils of incompatible orders that we are “seeing” when we see mess. They are cryptosalient: at once invisible and obvious.

[..]

Human hair, then, is a locus for the display of order. Its “natural” state is mess, implying that hair comes with an order deficit, requiring organizational effort to come up to the level of acceptable human. Our minds and personalities are similar.

[..]

A great deal of our reality is made from imaginary orders we carry around in our heads

[..]

As human beings, “projecting and sharing stylized model worlds in mental space” is both our ancestral job and our favorite hobby. The world that we interact in is mostly imaginary, constructed by all of us out of fantasies and guesses.

As we get more intelligent, we will get more imaginary.
