definitions.. too verbose for my mind.. but this got me (and carhart-harris entropy law):

What is Entropy? – via the science asylum – Nick Lucid

seems like any weird thing we’d measure (energy/momentum)..

1800s – holes in understanding of heat/temp… wasn’t called heat.. called it caloric.. magical fluid that flowed between objects.. no knowledge of atoms

1865 – clausius – inside transformation… and he made it sound like energy on purpose.. then wrote 2nd law of thermo – entropy always increases..

1900s – started to understand atoms.. and started modeling them.. started to understand entropy… nothing extra.. nothing missing…  molecules don’t possess entropy but container of water does

entropy is an emergent property – property when whole is larger than sum of parts…..

entropy emerges when enough molecules in place for statistics to be important

3 min – (old school defn) entropy measures the disorder of the energy of a collection of particles

word disorder can be very subjective…

messy room analogy – items represent the energy ..not the particles … org room – low entropy… messy room… high entropy.. are you sure.. depends on who lives in room… organized chaos.. if know where everything is… not disordered because it’s all over (the place) it’s disordered because the universe itself has lost track of the energy’s origins…

4 min – it’s like the universe has a witness protection program for its energy and we call that entropy…

whoa. huge..

perhaps like.. idiosyncratic jargon as protection – and gershenfeld something else as protection..  ps in the open et al..

and how Greg writes about dna junk here.. who’s to say what’s junk..

3/14/16 5:58 AM
To Adapt, You Need To Evolve – digitaltonto.com/2016/to-adapt-…

our junk DNA is just a few random mutations away from becoming something important.


2 min – aha with the pool game..

and that’s what we want.. what we’ve always wanted.. just haven’t had the means (ie: mech) to facilitate the chaos of unlimited choices… ie: to live out the rev of everyday life..

3 min – 2nd law of thermo.. entropy stays same or increases.. good ie: life..

hot liquid has higher entropy.. hot water molecules have more movement than cold.. therefore more dis ordered.. ice.. less entropy.. constricted movement...

whoa.. hot more movement..

4 min – only way to lower entropy – releasing heat into room thereby increasing total entropy of surrounding environment..

5 min – why live in universe that continues dis order.. many more high entropy (dis order) states available than low entropy

as much as we try to sustain order.. bound to 2nd law of thermo.
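a quick way to see why there are ‘many more’ high entropy states.. just count arrangements.. a toy python sketch (my own illustration, not from the video):

```python
from math import comb

# toy model: 100 energy quanta, each sitting in the left or right half of a box
N = 100

ordered = comb(N, 0)     # all quanta on one side: exactly 1 arrangement
mixed = comb(N, N // 2)  # spread evenly: ~1e29 arrangements

print(ordered, mixed)
```

the even split wins by 29 orders of magnitude.. which is why the universe keeps drifting toward it.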


via Hank:

Entropy: Embrace the Chaos! Crash Course Chemistry

infinite other ways…

2nd law: any spontaneous process increases the disorder of the universe

assumed disorder.. no?

processes that don’t increase the disorder.. require work to be done in opposition to the disorder.. in fact often impossible to achieve.. fact of putting one system into order requires others to be dis ordered..

? again.. who’s to say dis order – rather than.. just not my order

2 min – entropy: measure of molecular randomness.. dis order.. helps make chemical reactions possible and helps us predict how much work can be extracted from reactions..

3 min – spontaneous: doesn’t need outside energy – not how quick it happens –

entropy – doesn’t depend on path it took to get to its state

9 min – enthalpy (heat).. entropy (disorder)

12 min – change in entropy depends on how much room molecules have to move around in
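that ‘room to move around in’ has a standard closed form for an ideal gas expanding at constant temperature.. a minimal python sketch (the doubling-volume numbers are just an illustration):

```python
from math import log

R = 8.314  # molar gas constant, J/(mol·K)

def entropy_change_expansion(n_moles, v_initial, v_final):
    """isothermal entropy change of an ideal gas: dS = n·R·ln(V2/V1)"""
    return n_moles * R * log(v_final / v_initial)

# doubling the room the molecules have: entropy rises by R·ln 2 per mole
print(round(entropy_change_expansion(1.0, 1.0, 2.0), 2))  # 5.76
```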


adding page because of a lot of things.. but actual day adding it.. because of these tweets:

3/14/16 5:52 AM
“Freeing his time for its more effective exploratory investment is to give man increased wealth.” —R.B.Fuller
“Wealth is anti-entropy at a most exquisite degree of concentration” —R.B.Fuller #degrowth

bucky ness

so.. is wealth anti entropy.. or entropy.. at a ginormous degree…?

then Smári‘s – 4 tweets prior to catching awenneov’s

Smári McCarthy (@smarimc) tweeted at 5:46 AM on Mon, Mar 14, 2016:
I’m fascinated by the way people work with text. Lots of people apply neither rhyme nor reason. Random blocks of text, zero structure, …



degree of disorder.. lack of order.. who’s deciding this..? ie: is it disorder.. or just that we don’t see/get/grok the structure/order

intro to entropy:

via wikipedia:

The idea of “irreversibility” is central to the understanding of entropy. Everyone has an intuitive understanding of irreversibility (a dissipative process) – if one watches a movie of everyday life running forward and in reverse, it is easy to distinguish between the two. The movie running in reverse shows impossible things happening – water jumping out of a glass into a pitcher above it, smoke going down a chimney, water in a glass freezing to form ice cubes, crashed cars reassembling themselves, and so on. The intuitive meaning of expressions such as “you can’t unscramble an egg”, “don’t cry over spilled milk” or “you can’t take the cream out of the coffee” is that these are irreversible processes. There is a direction in time by which spilled milk does not go back into the glass.

In thermodynamics, one says that the “forward” processes – pouring water from a pitcher, smoke going up a chimney, etc. – are “irreversible”: they cannot happen in reverse, even though, on a microscopic level, no laws of physics would be violated if they did. All real physical processes involving systems in everyday life, with many atoms or molecules, are irreversible. For an irreversible process in an isolated system, the thermodynamic state variable known as entropy is always increasing. The reason that the movie in reverse is so easily recognized is because it shows processes for which entropy is decreasing, which is physically impossible. In everyday life, there may be processes in which the increase of entropy is practically unobservable, almost zero. In these cases, a movie of the process run in reverse will not seem unlikely. For example, in a 1-second video of the collision of two billiard balls, it will be hard to distinguish the forward and the backward case, because the increase of entropy during that time is relatively small. In thermodynamics, one says that this process is practically “reversible”, with an entropy increase that is practically zero. The statement of the fact that the entropy of the Universe never decreases is found in the second law of thermodynamics.

In a physical system, entropy provides a measure of the amount of thermal energy that cannot be used to do work. In some other definitions of entropy, it is a measure of how evenly energy (or some analogous property) is distributed in a system. Work and heat are determined by a process that a system undergoes, and only occur at the boundary of a system. Entropy is a function of the state of a system, and has a value determined by the state variables of the system.

The concept of entropy is central to the second law of thermodynamics. The second law determines which physical processes can occur. For example, it predicts that the flow of heat from a region of high temperature to a region of low temperature is a spontaneous process – it can proceed along by itself without needing any extra external energy. When this process occurs, the hot region becomes cooler and the cold region becomes warmer. Heat is distributed more evenly throughout the system and the system’s ability to do work has decreased because the temperature difference between the hot region and the cold region has decreased. Referring back to our definition of entropy, we can see that the entropy of this system has increased. Thus, the second law of thermodynamics can be stated to say that the entropy of an isolated system always increases, and such processes which increase entropy can occur spontaneously. The entropy of a system increases as its components have the range of their momentum and/or position increased.

The term entropy was coined in 1865 by the German physicist Rudolf Clausius, from the Greek words en-, “in”, and trope “a turning”, in analogy with energy.


Traditionally, 20th century textbooks have introduced entropy as order and disorder so that it provides “a measurement of the disorder or randomness of a system”. It has been argued that ambiguities in the terms used (such as “disorder” and “chaos”) contribute to widespread confusion and can hinder comprehension of entropy for most students. A more recent formulation associated with Frank L. Lambert describes entropy as energy dispersal.



The concept of thermodynamic entropy arises from the second law of thermodynamics. By this law of entropy increase, it quantifies the reduction in the capacity of a system for change (for example, heat always flows from a region of higher temperature to one with lower temperature until temperature becomes uniform) and determines whether a thermodynamic process may occur.


Example of increasing entropy


Ice melting provides an example in which entropy increases in a small system, a thermodynamic system consisting of the surroundings (the warm room) and the entity of glass container, ice, water which has been allowed to reach thermodynamic equilibrium at the melting temperature of ice.


It is important to realize that the entropy of the surrounding room decreases less than the entropy of the ice and water increases: …… This is always true in spontaneous events in a thermodynamic system and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than was the initial entropy.

As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of the δQ/T over the continuous range, “at many increments”, in the initially cool to finally warm water can be found by calculus. The entire miniature ‘universe’, i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that ‘universe’ than when the glass of ice and water was introduced and became a ‘system’ within it.
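the bookkeeping in the melting-ice example can be sketched with rough textbook numbers (latent heat of fusion ~334 J/g; the 20 °C room is an assumed value):

```python
Q = 334.0 * 100                  # heat to melt 100 g of ice (~334 J/g latent heat)
T_ice, T_room = 273.15, 293.15   # kelvin: 0 °C ice, 20 °C room (assumed)

dS_ice = Q / T_ice               # entropy gained by the ice as it melts
dS_room = -Q / T_room            # entropy lost by the room giving up that heat

# the room loses less entropy than the ice gains, so the net change is positive
print(round(dS_ice + dS_room, 1))  # 8.3 J/K
```

same heat, different temperatures.. so the cold side’s gain always outruns the warm side’s loss.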

Origins and uses

Originally, entropy was named to describe the “waste heat,” or more accurately, energy loss, from heat engines and other mechanical devices which could never run with 100% efficiency in converting energy into work.

waste heat


Later, the term came to acquire several additional descriptions, as more was understood about the behavior of molecules on the microscopic level. In the late 19th century, the word “disorder” was used by Ludwig Boltzmann in developing statistical views of entropy using probability theory to describe the increased molecular movement on the microscopic level. That was before quantum behavior came to be better understood by Werner Heisenberg and those who followed. Descriptions of thermodynamic (heat) entropy on the microscopic level are found in statistical thermodynamics and statistical mechanics.

For most of the 20th century, textbooks tended to describe entropy as “disorder”, following Boltzmann’s early conceptualisation of the “motional” (i.e. kinetic) energy of molecules. More recently, there has been a trend in chemistry and physics textbooks to describe entropy as energy dispersal. Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.


via wikipedia:

In thermodynamics, entropy (usual symbol S) is a measure of the number of specific realizations or microstates that may realize a thermodynamic system in a defined state specified by macroscopic variables. Most understand entropy as a measure of molecular disorder within a macroscopic system. The second law of thermodynamics states that an isolated system’s entropy never decreases. Such a system spontaneously evolves towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided they increase their environment’s entropy by that increment. Since entropy is a state function, the change in entropy of a system is constant for any process with known initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.

The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

\Delta S = \int \frac{\delta Q_\text{rev}}{T},

where T is the absolute temperature of the system, dividing an incremental reversible transfer of heat into that system (δQrev). (If heat is transferred out the sign would be reversed giving a decrease in entropy of the system.) The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system. The concept of entropy has been found to be generally useful and has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics.
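for one concrete case of that macroscopic definition – water heated reversibly at constant pressure – the integral works out to m·c·ln(T2/T1).. a sketch assuming water’s usual specific heat:

```python
from math import log

def delta_S_heating(mass_g, t1_k, t2_k, c=4.186):
    """sum of δQ/T for reversible heating: m·c·ln(T2/T1), c in J/(g·K)"""
    return mass_g * c * log(t2_k / t1_k)

# 1 kg of water warmed from 20 °C to 80 °C
print(round(delta_S_heating(1000, 293.15, 353.15)))  # 779
```

note the answer only needs the start and end temperatures – entropy is a state function, so the path doesn’t matter.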


In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. Understanding the role of thermodynamic entropy in various processes requires an understanding of how and why that information changes as the system evolves from its initial to its final condition.
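that ‘additional information needed’ reading is shannon’s entropy.. a minimal python sketch of the idea:

```python
from math import log2

def shannon_entropy(probs):
    """bits of information still needed to pin down which state the system is in"""
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(shannon_entropy([1.0]))      # 0.0 – state known, nothing missing
print(shannon_entropy([1/8] * 8))  # 3.0 – eight equally likely states
```

more equally likely microstates = more missing information = more entropy.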

It is often said that entropy is an expression of the disorder, or randomness of a system, or of our lack of information about it.

huge.. or our lack of info about it… isn’t that all of our thinking on dis order and structureless ness.. et al.. disorder/structurelessness.. to who..

[thinking David‘s bits on Jo Freeman.. perhaps not only that we’re missing the potential of being able to scale w/right mechanism.. but that missing it ness is the protection or whatever we need .. in order to break through.. to scale.. to exponentiate to global equity ness]

perhaps.. we let go.. of trying to control/understand it all.. and let it/us dance.

let’s do this firstfree art-ists.

for (blank)’s sake

a nother way


Entropy is defined for a reversible process and for a system that, at all times, can be treated as being at a uniform state and thus at a uniform temperature. Reversibility is an ideal that some real processes approximate and that is often presented in study exercises. For a reversible process, entropy behaves as a conserved quantity and no change occurs in total entropy. More specifically, total entropy is conserved in a reversible process and not conserved in an irreversible process. One has to be careful about system boundaries.

borders (every border implies the violence of its maintenance) ness

entropy of a system

Entropy is the above-mentioned unexpected and, to some, obscure integral that arises directly from the Carnot cycle. It is reversible heat divided by temperature. It is also, remarkably, a fundamental and very useful function of state.


The entropy of the thermodynamic system is a measure of how far the equalization has progressed.


Thermodynamic entropy is a non-conserved state function that is of great importance in the sciences of physics and chemistry. Historically, the concept of entropy evolved in order to explain why some processes (permitted by conservation laws) occur spontaneously while their time reversals (also permitted by conservation laws) do not; systems tend to progress in the direction of increasing entropy. For isolated systems, entropy never decreases. This fact has several important consequences in science: first, it prohibits “perpetual motion” machines; and second, it implies the arrow of entropy has the same direction as the arrow of time. Increases in entropy correspond to irreversible changes in a system, because some energy is expended as waste heat, limiting the amount of work a system can do.

Unlike many other functions of state, entropy cannot be directly observed but must be calculated.


One dictionary definition of entropy is that it is “a measure of thermal energy per unit temperature that is not available for useful work”. For instance, a substance at uniform temperature is at maximum entropy and cannot drive a heat engine. A substance at non-uniform temperature is at a lower entropy (than if the heat distribution is allowed to even out) and some of the thermal energy can drive a heat engine.

A special case of entropy increase, the entropy of mixing, occurs when two or more different substances are mixed. If the substances are at the same temperature and pressure, there will be no net exchange of heat or work – the entropy change will be entirely due to the mixing of the different substances. At a statistical mechanical level, this results due to the change in available volume per particle with mixing.
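for ideal gases that mixing contribution has a clean closed form, ΔS = −nR Σ xᵢ ln xᵢ.. a hedged python sketch:

```python
from math import log

R = 8.314  # molar gas constant, J/(mol·K)

def entropy_of_mixing(mole_fractions, n_total=1.0):
    """ideal entropy of mixing: dS = -n·R·sum(x_i · ln x_i)"""
    return -n_total * R * sum(x * log(x) for x in mole_fractions if x > 0)

# mixing equal amounts of two different ideal gases at the same T and P
print(round(entropy_of_mixing([0.5, 0.5]), 2))  # 5.76
```

same R·ln 2 per mole as doubling a gas’s volume – each particle effectively gains twice the room.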

thinking cage – nature ness..

thinking serendip ity ness.. if we could just let go




hello stigmergy



small world network




embracing uncertainty

redefine decision making – disengage from consensus

et al

et al


oct 2016 – consciousness could be a side effect of entropy


Just like the Universe, our brains might be programmed to maximise disorder – similar to the principle of entropy – and our consciousness could simply be a side effect.


what if consciousness arises naturally as a result of our brains maximising their information content? In other words, what if consciousness is a side effect of our brain moving towards a state of entropy?

Entropy is basically the term used to describe the progression of a system from order to disorder.

Specifically, they were looking at synchronisation of neurons – whether neurons were oscillating in phase with each other – to figure out whether brain cells were linked or not.

“We find a surprisingly simple result: normal wakeful states are characterised by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values,” the team writes.

This led the researchers to argue that consciousness could simply be an “emergent property” of a system that’s trying to maximise information exchange.

Before we get too carried away, there are some big limitations to this work – primarily the small sample size. It’s hard to spot any conclusive trends from only nine people, particularly as everyone’s brains responded slightly differently to the various states.
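the ‘greatest number of possible configurations’ idea can be illustrated with a hypothetical toy model (not the paper’s actual analysis): n channel pairs, each either phase-locked or not..

```python
from math import comb

# hypothetical toy model: n channel pairs, each pair either phase-locked
# or not; count the configurations for each number k of locked pairs
n = 10
configs = [comb(n, k) for k in range(n + 1)]

# counts (hence entropy) peak in the middle: neither fully synchronised
# (k = n) nor fully desynchronised (k = 0) allows many configurations
print(configs[0], configs[n], max(configs))  # 1 1 252
```

the wakeful brain, on this reading, sits near that middle peak.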


Peter Vander Auwera (@petervan) tweeted at 6:10 AM – 6 Jan 2017 :

Tendrils of Mess in our Brains https://t.co/RJpTIWKOkZ via @ribbonfarm (http://twitter.com/petervan/status/817357641974509570?s=17)

And here is the answer: in order for mess to appear, there must be in the component parts of the mess an implication of extreme order, the kind of highly regular order generally associated with human intention. Flat uniform surfaces and printed text imply, promise, or encode a particular kind of order. In mess, this promise is not kept. The implied order is subverted. Often, as in my mess of text and logos above, the implied order is subverted by other, competing orders.


Mess is only perceptible because it produces in our minds an imaginary order that is missing.

It is as if objects and artifacts send out invisible tendrils into space, saying, “the matter around me should be ordered in some particular way.” The stronger the claim, and the more the claims of component pieces conflict, the more there is mess. It is these invisible, tangled tendrils of incompatible orders that we are “seeing” when we see mess. They are cryptosalient: at once invisible and obvious.


Human hair, then, is a locus for the display of order. Its “natural” state is mess, implying that hair comes with an order deficit, requiring organizational effort to come up to the level of acceptable human. Our minds and personalities are similar.


A great deal of our reality is made from imaginary orders we carry around in our heads


As human beings, “projecting and sharing stylized model worlds in mental space” is both our ancestral job and our favorite hobby. The world that we interact in is mostly imaginary, constructed by all of us out of fantasies and guesses.

As we get more intelligent, we will get more imaginary.


from Krishnamurti‘s total freedom:


this is part of meditation: to see the outer actually as it is, not what you wish it to be, the wars/antagonisms, hatreds… sorrow, pain, anxiety, loneliness, lack of love.. to observe all that.. then what takes place? then you will see that energy is being gathered, because there is order and, therefore, there is no wastage of energy.. when there is mathematical order in our daily life, there is no wastage of energy

remember.. he redefined order.. to accept the disorder.. and so.. entropy.. ness



from Nora Bateson‘s small arcs of larger circles

nora bateson (@NoraBateson) tweeted at 11:07 PM on Mon, Jun 25, 2018:
Thank you. https://t.co/7OsWaiBS8v

“Mutual learning happens in the entropy; we need the confusion to release the new.”


mostly adding below because of Robin Carhart-Harris – paper and thinking on entropy

from Michael Pollan‘s how to change your mind (esp Robin and Alison bit):

5 – the neuroscience – your brain on psychedelics


‘if the only way we can access the unconscious is via dreams and free association.. we aren’t going to get anywhere.. surely there must be something else’.. he asked his prof if that something else might be a drug.. his prof sent him to read a book called realms of the human unconscious by stanislav grof..


carhart harris thinks that psychedelics render the brain’s usual handshake of perception less stable and more slippery.. he suspects that there are moments during the psychedelic experience when confidence in our usual top down concepts of reality collapses, opening the way for more bottom up info to get thru the filter.. but when all that sensory info threatens to overwhelm us, the mind furiously generates new concepts (crazy or brilliant, it hardly matters) to make sense of it all – ‘and so you might see faces coming out of the rain..


‘that’s the brain doing what the brain does’ – that is, working to reduce uncertainty by, in effect, telling itself stories..

by adulthood, the brain has gotten very good at observing and testing reality and developing reliable predictions about it that optimize our investment of energy (mental and otherwise) and therefore our chances of survival

really? i don’t buy that

uncertainty is a complex brain’s biggest challenge, and predictive coding evolved to help us reduce it.. in general the kind of precooked or conventionalized thinking this adaptation produces serves us well. but only up to a point

i don’t think it serves us well at all.. only serves whales in sea world well .. at least.. so far as it keeps them there

precisely where that point lies is a question robin carhart-harris and his colleagues have explored in an ambitious and provocative paper titled ‘the entropic brain: ..’ the question at its heart is, do we pay a price for the achievement of order and selfhood in the adult human mind.. the paper concludes that we do.. while suppressing entropy (in this context a synonym for uncertainty) ..the brain ‘serves to promote realism, foresight, careful reflection and an ability to recognize and overcome wishful and paranoid fantasies’ at the same time this achievement tends to ‘constrain cognition’ and exert ‘a limiting or narrowing influence on consciousness’

entropy et al.. antifragility.. et al

[https://www.youtube.com/watch?v=FEsc0j0kmqI – 30 min convo in 2017]

[https://www.youtube.com/watch?v=MZIaTaNR3gk – 16 min tedx 2016]

[https://www.youtube.com/watch?v=O9vNDRGveYs – 1 hr w russel brand 2017]

14 min – on what we have to bring to this indigenous practice.. ability to break it down to understand it

52 min – on hearing other’s stories – seeing thru their eyes

53 min – rosalind on connectedness..

1:02 – one common theme coming out of this.. ‘oh my.. my self was a construction.. built up.. not actually real’.. once that happens.. common thing is .. seeing us as all one


for all his ambition his affect is strikingly self-effacing and does little to prepare you for his willingness to venture out onto intellectual limbs that would scare off less intrepid scientists

the entropy paper asks us to conceive of the mind as an uncertainty reducing machine w a few serious bugs in it.. the sheer complexity of the human brain and the greater number of different mental states in its repertoire (as compared w other animals) make the maintenance of order a top priority, lest the system descend into chaos..

? maybe it should..?

magical thinking (a much more anarchic – no rules – form of  primary consciousness.. from long ago) is one way for human minds to reduce their uncertainty about the world.. but it is less than optimal for the success of the species

success defined in what way..?

a better way to suppress uncertainty and entropy in the human brain emerged w the evolution of the default mode network.. carhart harris contends.. a brain regulation system that is absent or undeveloped in lower animals and young children.. along w the default mode network, ‘a coherent sense of self or ‘ego’ emerges’ and w that the human capacity for self reflection and reason.. he calls this more highly evolved mode of cognition.. secondary consciousness.. ‘pays deference to reality and diligently seeks to represent the world as precisely as possible’ in order to minimize ‘surprise and uncertainty (ie entropy)’

whoa.. fromm surprise/spontaneity.. is what keeps us alive.. same too w uncertainty/disorder


the article offers an intriguing graphic depicting a ‘spectrum of cognitive states’ ranging from high entropy mental states to low ones.. at the high entropy end.. he lists psychedelic states; infant consciousness; early psychosis; magical thinking; and divergent or creative thinking. at the low entropy end.. he lists narrow or rigid thinking; addiction; obsessive compulsive disorder; depression; anesthesia; and finally, coma.

wow – that’s huge

carhart harris suggests that the psychological ‘disorders’ at the low entropy end of the spectrum are not the result of a lack of order in the brain but rather stem from an excess of order. when the groves of self reflective thinking deepen and harden, the ego becomes overbearing. this is perhaps most clearly evident in depression, when the ego turns on itself and uncontrollable introspection gradually shades our reality..


carhart harris believes that people suffering from a whole range of disorders characterized by excessively rigid patterns of thought – including addiction, obsessions, and eating disorders as well as depression – stand to benefit from ‘the ability of psychedelics to disrupt stereotyped patterns of thought and behavior by disintegrating the patterns of [neural] activity upon which they rest’

carhart-harris entropy law:

so it may be that some brains could stand to have a little more entropy, not less

indeed.. yes

this is where psychedelics come in.. by quieting the default mode network, these compounds can loosen the ego’s grip on the machinery of the mind, ‘lubricating’ cognition where before it had been rusted stuck..


‘psychedelics alter consciousness by disorganizing brain activity’ carhart harris writes.. they increase the amount of entropy in the brain, w the result that the system reverts to a less constrained mode of cognition..

‘it’s not just that one system drops away’ he says ‘ but that an older system reemerges’

yeah.. wow

that older system is primary consciousness, a mode of thinking in which the ego temporarily loses its dominion and the unconscious, now unregulated, ‘is brought into an observable space’.. this, for carhart harris, is the heuristic value of psychedelics to the study of the mind, though he sees therapeutic value as well.

worth noting that carhart harris does not romanticize psychedelics and has little patience for the sort of ‘magical thinking’ and ‘metaphysics’ that they nourish in their acolytes – such as the idea that consciousness is ‘transpersonal’ a property of the universe rather than the human brain.. in his view, the forms of consciousness that psychedelics unleash are regressions to a ‘more primitive’ mode of cognition.. w freud, he believes that the loss of self, and the sense of oneness, characteristic of the mystical experience – whether occasioned by chemistry or religion – return us to the psychological condition of the infant on its mother’s breast, a stage when it has yet to develop a sense of itself as a separate and bounded individual..


for carhart harris, the pinnacle of human development is the achievement of this differentiated self, or ego, and its imposition of order on the anarchy of a primitive mind buffeted by fears and wishes and given to various forms of magical thinking.


while he holds w aldous huxley that psychedelics throw open the doors of perception, he does not agree that everything that comes thru that opening.. is necessarily real.. yet.. he also believes there is genuine gold in the psychedelic experience..

too much entropy in the human brain may lead to atavistic thinking and, at the far end, madness, yet too little can cripple us as well..  the grip of an overbearing ego can enforce a rigidity in our thinking that is psychologically destructive..  it may be socially and politically destructive too, in that it closes the mind to info and alt points of view

in one of our convos.. robin speculated that a class of drugs w the power to overturn hierarchies in the mind and sponsor unconventional thinking has the potential to reshape users’ attitudes toward authority of all kinds; that is the compounds may have a political effect. many believe lsd played precisely that role in the political upheaval of the 1960s

‘was it that hippies gravitated to psychedelics, or do psychedelics create hippies..? nixon thought it was the latter.. he may have been right’ .. robin believes that psychedelics may also subtly shift people’s attitudes toward nature, which also underwent a sea change in the 60s..  when the influence of the dmn declines, so does our sense of separateness from our environ..


‘the brain operates w greater flexibility and interconnectedness under hallucinogens’ (a high entropy brain)


when the brain operates under the influence of psilocybin.. thousands of new connections form.. linking far flung brain regions that during normal waking consciousness don’t exchange much info..  in effect, traffic is rerouted from a relatively small number of interstate highways onto myriad smaller roads linking a great many more destinations..  the brain appears to become less specialized (ie: talking mostly w/in silos) and more globally interconnected, w considerably more intercourse, or ‘cross talk’ among its various neighborhoods..

the increase in entropy allows a thousand mental states to bloom, many of them bizarre and senseless, but some number of them revelatory, imaginative, and , at least potentially, transformative..


entropy in brain supplies diversity of raw material on which selection can then operate to solve problems and bring novelty in to the world.. .. and aid to creativity.. to thinking outside the box..


franz vollenweider has suggested that the psychedelic experience may facilitate ‘neuroplasticity’..  but so far .. all highly speculative

carhart harris argues in the entropy paper that even a temporary rewiring of the brain is potentially valuable.. esp for people suffering from disorders characterized by mental rigidity..  disrupting unhealthy patterns of thought and creating a space of flexibility – entropy – in which more salubrious (health giving) patterns and narratives have an opp to coalesce 

oh entropy et al.. antifragility.. et al

am thinking.. a case for no training.. ness

the idea that increasing the amount of entropy in the human brain might actually be good for us is surely counterintuitive

to me.. very intuitive..

carhart-harris entropy law:

most of us bring a negative connotation to the term: entropy suggests a gradual deterioration of a hard-won order,

the disintegration of a system over time.. certainly getting older feels like an entropic process – a gradual running down and disordering of the mind and body.. but maybe that’s the wrong way to think about it.. robin’s paper got me wondering if, at least for the mind, aging is really a process of declining entropy, the fading over time of what we should regard as a positive attribute of mental life

i don’t think it’s a process of aging.. i think it’s a process of living a life of supposed to’s.. only natural for ie: whales in sea world..


certainly by middle age, the sway of habitual thinking over the operations of the mind is nearly absolute; by now, i can count on past experience to propose quick and usually serviceable answers to just about any question reality poses, whether it’s about how to soothe a child or mollify a spouse, repair a sentence, accept a compliment, answer the next question, or make sense of whatever’s happening in the world..

who says any of those are working.. not to mention.. good even if they did

w experience and time, it gets easier to cut to the chase and leap to conclusions – clichés that imply a kind of agility but that in fact may signify precisely the opposite: a petrification of thought

indeed.. whales in sea world..

a flattering term for this regime of good enough prediction is ‘wisdom’

a false term

reading robin’s paper helped me better understand what i was looking for when i decided to explore psychedelics: to give my own snow globe a vigorous shaking, see if i could renovate my everyday mental life by introducing a great measure of entropy and uncertainty into it..  to see if it wasn’t too late to skip out of some of the deeper grooves of habit that the been-theres and done-thats of long experience had inscribed on my mind

today we can do it.. for/with 7 bn  – ie: 1 yr to be 5 ness..

entropy et al.. antifragility.. et al


one of the most interesting things about a psychedelic experience is that it sharpens one’s sensitivity to one’s own mental states, esp in the days immediately following..  the usual seamlessness of consciousness is disturbed in such a way as to make any given state – mind wandering, focused attention, rumination  – both more salient and somewhat easier to manipulate..


if the neuroscientists are right, what i’m observing in my mind (spectrum ranging from contraction to expansion)  has a  physical correlate in the brain: the default mode network is either online or off; entropy is either high or low.. what exactly to do w this info i’m not sure yet..

1 yr to be 5 ness..  wake us up.. more highs (so to speak).. meaning.. more wonder, wandering.. whimsy.. eudaimonia

by now, it may be lost to memory, but all of us, even the psychedelically naive, have had direct personal experience of an entropic brain and the novel type of consciousness it sponsors – as a young child..

1 yr to be 5 ness

baby consciousness is so diff from adult consciousness as to constitute a mental country of its own, one from which we are expelled sometime early in adolescence..

not yet scrambled.. too schooled.. not a natural/humane progression

is there a way back in?

yes.. today.. we can get to a nother way to live.. via mech (2 convers as infra..as it could be..).. that would allow all of us to leap at once.. to reset .. us..  in sync

talk to me man

the closest we can come to visiting that foreign land as adults may be during the psychedelic journey..  this at least is the startling hypothesis of alison gopnik.. who happens to be a colleague of mine at berkeley

there’s a nother way – (ie: findings from experimenting with it for 10+ years) –


alison and robin come at the problem of consciousness from what seem like completely diff directions and disciplines, but soon after they learned of each other.. they struck up a convo that has proven to be remarkably illuminating.. at least for me.. in april 2016.. their convo wound up on stage at a conference on consciousness in arizona..  where they met for the first time


both offer an ‘altered state’..  one that in a number of respects is strikingly similar..  she (alison) cautions that our thinking about the subject is usually constrained by our own restricted experience of consciousness, which we *naturally take to be the whole of it..

not naturally.. schooled to take it that way

what she calls ‘professor consciousness’ .. ‘the phenomenology of your avg middle aged prof’

all of us really..whales in sea world..

‘if you thought, as people often have, that this was all there was to consciousness, you might very well find yourself thinking that young children were actually less conscious than we are’.. because both focused attention and self reflection are absent in young children.. gopnik asks us to think about child consciousness in terms not of what’s missing from it or undeveloped but rather what is uniquely and wonderfully present – qualities that she believes psychedelics can help us to better appreciate and.. possibly.. re experience

cure ios city.. as detox

adults – spotlight/ego consciousness of adults.. w a point/goal..  vs lantern consciousness of children.. attention more widely diffused, allowing the child to take in info from virtually anywhere (by this measure, children are more conscious than adults, rather than less)..

being *inexperienced in the ways of the world, the mind of the young child has comparatively fewer priors, or preconceptions, to guide her perception down the predictable tracks. instead, the child approaches reality w the astonishment of an adult on psychedelics..

rather.. *inexperienced in the ways of sea world.. huge diff


gopnik believes that both the young child *(5 and under) and the adult on psychedelics have a stronger predilection for the high temp search; in the quest to make sense of things, their minds explore not just the nearby and most likely but ‘the entire space of possibilities’..

*before schooled.. hence.. the need for 1 yr to be 5 ness

these high temp searches might be inefficient.. higher rate of error.. require more time/energy.. yet there are times when they’re the only way to solve a problem
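the ‘high temperature’ metaphor comes from search/sampling algorithms (ie: simulated annealing, softmax sampling).. a minimal sketch (toy scores of my own choosing, not from gopnik’s work) showing how temperature trades off between picking the most likely option and keeping the entire space of possibilities live:

```python
import math

def softmax(scores, temperature):
    # higher temperature flattens the distribution -> more exploration
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# toy "prior plausibility" scores for five candidate explanations
scores = [5.0, 2.0, 1.0, 0.5, 0.1]

cold = softmax(scores, temperature=0.2)   # near-greedy: nearly all mass on option 0
hot = softmax(scores, temperature=10.0)   # near-uniform: every option stays in play

print([round(p, 3) for p in cold])
print([round(p, 3) for p in hot])
```

the cold search is the adult mind leaping to the most plausible prior; the hot search wastes effort on unlikely options but can land on answers the cold search never visits..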

actually.. if we let go of all the supposed to’s.. (this isn’t a mechanical/efficiency problem) .. we’d have the time/energy.. (not to mention the regenerating energy from living this way) to ie: follow our whimsy/wonder.. everyday

the ai humanity needs..augmenting interconnectedness.. of 7bn alive people

meadows undisturbed ecosystem

gopnik has tested this hypothesis on children in her lab and has found that there are learning problems that 4 yr olds are better at solving than adults.. these are precisely the kinds of problems that require thinking outside the box..

rather.. that require thinking.. once you have a box.. not so much thinking.. as looking for right fits

ie: kids getting calculus thinking.. ie: mathematical thinking .. more than hs/college/prof


the short summary is, babies and children are basically tripping all the time

high on life.. as we all should/could be


Shane Parrish (@farnamstreet) tweeted at 5:02 AM – 26 Nov 2018 :
Battling Entropy: Making Order of the Chaos in Our Lives https://t.co/a0DzGprbaQ (http://twitter.com/farnamstreet/status/1067025854855823360?s=17)

“The … ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy and carve out refuges of beneficial order.” — Steven Pinker

no wonder – it’s his purpose in life..

carhart-harris entropy law: it may be that some brains could stand to have a little more entropy, not less

let go – of that hard won order

Disorder is not a mistake; it is our default. Order is always artificial and temporary.

begs we embrace the uncertainty in cure ios city

we .. as some of the things that gain from disorder

The existence of entropy is what keeps us on our toes.

what we need most: the energy of 7bn alive people


from david graeber‘s anarchy in manner of speaking:


DG: This might sound silly, but I’ve always been a little suspicious of the second law of thermodynamics. I’m not denying that the principle of entropy applies within a closed system. Obviously it does. But it certainly doesn’t apply to any of the systems we care about the most. . t Neither the earth, since we have the sun feeding us energy continually, nor the universe as a whole, which has obviously become more complex and organized since the Big Bang. Okay, so self-contained chemical systems tend to become disorganized over time. So? What are we to make of a law where everything important that happens is an exception?.t

huge.. yeah that.. carhart-harris entropy law et al

DG: I’ve always felt the law of entropy was invented by depressed Victorians anticipating the inevitable decline of their empire. It’s the sigh of the not-particularly-oppressed creature, indignant that his power won’t last forever, since nothing does. You put your bird in a cage, then complain it’s going to die. Get over it!

But to get back to Bhaskar, since I didn’t quite finish my summary. What he’s saying is that you have these different emerging levels of complexity, and not only does each one have a greater degree of freedom (or arbitrariness, from the perspective of determination) but how they interact in an open system is inherently unpredictable, because you have causative mechanisms from different emergent levels interacting. That’s why you need to have a scientific experiment, therefore eliminating mechanisms from all but one emergent level, to understand how any one mechanism works. Closed systems are always human creations and they typically require an enormous amount of work..t

again on entropy ness and takes a lot of work ness