To arrive at the edge of the world’s knowledge, seek out the most complex and sophisticated minds, put them in a room together, and have them ask each other the questions they are asking themselves.

talk about the questions you ask yourself..

The online salon at Edge.org is a living document of millions of words charting the Edge conversation over the past fifteen years wherever it has gone. It is available, gratis, to the general public.

Edge.org was launched in 1996 as the online version of “The Reality Club,” an informal gathering of intellectuals that met from 1981-1996 in Chinese restaurants, artist lofts, the Board Rooms of Rockefeller University, the New York Academy of Sciences, and investment banking firms, ballrooms, museums, living rooms, and elsewhere. Though the venue is now in cyberspace, the spirit of the Reality Club lives on in the lively back-and-forth discussions on the hot-button ideas driving the discussion today.

In the words of the novelist Ian McEwan, Edge.org offers “open-minded, free ranging, intellectually playful … an unadorned pleasure in curiosity, a collective expression of wonder at the living and inanimate world … an ongoing and thrilling colloquium.”


As the late artist James Lee Byars and I once wrote: “To accomplish the extraordinary, you must seek extraordinary people.” At the center of every Edge publication and event are remarkable people and remarkable minds. Edge, at its core, consists of the scientists, artists, philosophers, technologists, and entrepreneurs who are at the center of today’s intellectual, technological, and scientific landscape.

Through the years, Edge.org has had a simple criterion for choosing contributors. We look for people whose creative work has expanded our notion of who and what we are. A few are bestselling authors or are famous in the mass culture. Most are not. Rather, we encourage work on the cutting edge of the culture, and the investigation of ideas that have not been generally exposed. We are interested in “thinking smart;” we are not interested in the anesthesiology of received “wisdom.” The motto is “to arrive at the edge of the world’s knowledge, seek out the most complex and sophisticated minds, put them in a room together, and have them ask each other the questions they are asking themselves.” In communications theory information is not defined as data or input but rather as “a difference that makes a difference.” It is this level we hope our contributors will achieve.

The Edge community consists of people who are out there doing it rather than talking about and analyzing the people who are doing it.

first intro to edge was via Joshua Knobe‘s talk (via Maria‘s post)


find/follow edge:

link twitter

founder: John Brockman


hmm. thought i recorded edge 2015 question.

here’s 2016:




Energy Of Nothing

Theoretical Physicist, Stanford; Father of Eternal Chaotic…
Back in 1998, two groups of astrophysicists studying supernovae made one of the most important experimental discoveries of the 20th century: They found that empty space, vacuum, is not entirely empty. Each cubic centimeter of empty space contains about 10⁻²⁹ grams of invisible matter, or, equivalently, vacuum energy. This is almost nothing, 29 orders of magnitude smaller than the mass of matter in a cubic centimeter of water, 5 orders of magnitude smaller than the proton mass. If the whole Earth were made of such matter, it would weigh less than a gram.
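the "less than a gram" claim checks out with a quick back-of-envelope calculation — a sketch, using only Earth's mean radius and the rounded density quoted above:

```python
import math

# vacuum energy density quoted above, expressed as a mass density
rho_vac = 1e-29            # grams per cubic centimeter

# Earth's mean radius in centimeters
r_earth = 6.371e8

# volume of a sphere: (4/3) * pi * r^3  ->  about 1.08e27 cm^3
v_earth = (4.0 / 3.0) * math.pi * r_earth ** 3

mass = rho_vac * v_earth
print(f"{mass:.3f} g")     # roughly 0.011 g -- indeed less than a gram
```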
Founder, The Whole Earth Catalog; Co-founder, The Well; Co-…
Most importantly, Anthony James at UC Irvine and colleagues showed that malaria mosquitoes could be altered with gene drive so that they no longer carry the disease. Kevin Esvelt is developing a project to do the same with white-footed mice, which are the wildlife reservoir for Lyme disease in humans; if they are cured, humans will be as well.
Hobbs Professor of Cognition and Education, Harvard…
As Marshall McLuhan argued, technology extends our senses—it does not fundamentally change them. Once one begins to alter human DNA—for example, through CRISPR—or the human nervous system—by inserting mechanical or digital devices—then we are challenging the very definition of what it means to be human. And once one cedes high level decisions to digital creations, or these artificially intelligent entities cease to follow the instructions that have been programmed into them and rewrite their own processes, our species will no longer be dominant on this planet.
Professor of Computer Science, MIT; Director, Human…

In 2014 a group of big data scientists (including myself), representatives of big data companies, and the heads of National Statistical Offices from nations in both the northern and southern hemispheres, met within the United Nations headquarters and plotted a revolution. We proposed that all of the nations of the world begin to measure poverty, inequality, injustice, and sustainability in a scientific, transparent, accountable, and comparable manner. Surprisingly, this proposal was approved by the UN General Assembly in 2015, as part of the 2030 Sustainable Development Goals.


As our UN Data Revolution report, titled “A World That Counts,” states:

Data are the lifeblood of decision-making and the raw material for accountability. Without high-quality data providing the right information on the right things at the right time, designing, monitoring and evaluating effective policies becomes almost impossible. New technologies are leading to an exponential increase in the volume and types of data available, creating unprecedented possibilities for informing and transforming society and protecting the environment. Governments, companies, researchers and citizen groups are in a ferment of experimentation, innovation and adaptation to the new world of data, a world in which data are bigger, faster and more detailed than ever before. This is the data revolution.

perhaps right data – self talk as data… app/chip ness.. for (blank)’s sake


It is not because anyone hopes that the UN will manage or fund the measurement process. Instead we believe that uniform, scientific measurement of human development will happen because international development donors are finally demanding scientifically sound data to guide aid dollars and trade relationships. 

or perhaps.. sound data to make measuring transactions irrelevant….


Historically we have always been blind to the living conditions of the rest of humanity; violence or disease could spread to pandemic proportions before the news would make it to the ears of central authorities. We are now beginning to be able to see the condition of all of humanity with unprecedented clarity. Never again should it be possible to say “we didn’t know.” No one should be invisible. This is the world we want—a world that counts.

or perhaps.. no one is invisible in a world that one doesn’t count..

Media Analyst; Documentary Writer; Author, Present Shock
To be sure, science has brought some of this on itself, by refusing to admit the possibility of any essence to existence, and by too often aligning with corporate efforts to profit off discoveries with little concern for their long-term impact on human well-being.
We have evolved into the closest things to gods this world has ever known, yet a majority of us have yet to acknowledge the actual processes that got us to this point.
..if these abilities are seized upon as something other than the fruits of science, and if they are applied with utter disregard to the scientific context through which they were developed, I fear we will lack the humility required to employ them responsibly.
Physicist, Director, MIT’s Center for Bits and Atoms;…
The unseen scientific story is to break the historical relationship between work and wealth by removing the boundary between the digital and physical worlds.
.. emerging research is replacing processes that continuously deposit or remove materials with ones that code the reversible construction of discrete building blocks. This is being done across disciplines and length scales, from atomically-precise manufacturing, to whole-genome synthesis of living cells, to the three-dimensional integration of functional electronics, to the robotic assembly of modular aircraft and spacecraft. Taken together, these add up to programming reality—turning data into things and things into data.
reversible..? England ness..? for living things…
Returning to the news stories from 2015, going to work commonly means leaving home to travel to somewhere you don’t want to be, to do something you don’t want to do, producing something for someone you’ll never see, to get money to pay for something that you want. What if you could instead just make what you want?
In the largest-ever gathering of heads of state, the Sustainable Development Goals were launched at the UN in 2015. These target worthy aims including ending poverty and hunger, ensuring access to healthcare and energy, building infrastructure, and reducing inequality. Left unsaid is how to accomplish these, with an assumption that it will require spending vast amounts of money to meet them. But development does not need to recapitulate the industrial revolution; just as developing countries have been able to skip over landlines and go right to mobile phones, mass manufacturing with global supply chains can be replaced with sustainable local on-demand fabrication of all of the ingredients of a technological civilization. This is a profound challenge, but it’s one with a clear research roadmap, and is the scientific story behind the news.
Author; The Cancer Chronicles, The Ten Most Beautiful…

Cancer is often described as a sped-up version of Darwinian evolution. ..becomes fitter and fitter within the ecosystem of your body. Some of the mutations are inherited while others are environmental—the result of a confusion of outside influences. Much less talked about is a third category: the mutations that arise spontaneously from the random copying errors occurring every time a cell divides.

In a paper this year in Science, Cristian Tomasetti and Bert Vogelstein calculated that two-thirds of the overall risk of cancer may come from these errors—entropic “bad luck.” The paper set off a storm of outrage among environmentalists and public health officials, many of whom seem to have misunderstood the work or deliberately misrepresented it.

carcinogen – a substance capable of causing cancer in living tissue.
entropy – 1/ physics – a thermodynamic quantity representing the unavailability of a system’s thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system. 2/ lack of order or predictability; gradual decline into disorder.

As epidemiology marches on, the link between cancer and carcinogen seems ever fuzzier. The powerful and unambiguous link between smoking and lung cancer almost seems like a fluke.

It will be interesting to see how this plays out. But meanwhile I hope that more of the public is beginning to understand that getting cancer usually doesn’t mean you did something wrong or that something bad was done to you. Some cancer can be prevented and some can be successfully treated. But for multicellular creatures living in an entropic world, a threshold amount of cancer is probably inevitable.

Physics writer; Author, Trespassing on Einstein’s Lawn
Physicists have spent the last 100 years attempting to reconcile Einstein’s theory of general relativity, which describes the large-scale geometry of spacetime, with quantum mechanics, which deals with the small-scale behavior of particles. It’s been slow going for a century, but now, suddenly, things are happening.
Computational complexity allows general relativity and quantum mechanics to peacefully coexist.
general relativity, also known as the general theory of relativity, is the geometric theory of gravitation published by Albert Einstein in 1915 and the current description of gravitation in modern physics.
quantum mechanics – the branch of mechanics that deals with the mathematical description of the motion and interaction of subatomic particles, incorporating the concepts of quantization of energy, wave-particle duality, the uncertainty principle, and the correspondence principle.

Hayden and Harlow’s work connects physics and computer science in a totally unprecedented way. Physicists have long speculated that information plays a fundamental role in physics. It’s an idea that dates back to Konrad Zuse, the German engineer who built the world’s first programmable computer in his parents’ living room in 1938, and the first universal Turing machine three years later. In 1967, Zuse wrote a book called Calculating Space, in which he argued that the universe itself is a giant digital computer. In the 1970s, the physicist John Wheeler began advocating for “it from bit”—the notion that the physical, material world is, at bottom, made of information. Wheeler’s influence drove the burgeoning field of quantum information theory and led to quantum computing, cryptography and teleportation. But the idea that computational complexity might not only describe the laws of physics, but actually uphold the laws of physics, is entirely new.


That such constraints were at the heart of both theories led thinkers such as Arthur Stanley Eddington to suggest that at its deepest roots, physics is epistemology.

the theory of knowledge, especially with regard to its methods, validity, and scope. Epistemology is the investigation of what distinguishes justified belief from opinion.

Professor Emerita, George Mason University; Visiting…
the more significant achievement of cybernetics was a new way of thinking about causation, now more generally referred to as systems theory. Listen to the speeches of politicians proclaiming their intent to solve problems like terrorism—it’s like asking for a single pill that will “cure” old age—if you don’t like x, you look for an action that will eliminate it, without regard to the side effects (bombing ISIL increases hostility for example) or the effects on the user (consider torture). Decisions made with overly simple models of cause and effect are both dangerous and unethical.
Why do we believe that violence is a solution?
a nother way – deep/systemic enough – for (blank)’s sake
Molecular Biologist
Perhaps unsurprisingly, these markers of the neuroimmune system are disrupted in disorders such as depression, anxiety, stroke, Alzheimer’s disease, Parkinson’s disease, and multiple sclerosis. Cytokine levels have been shown to vastly increase during depressive episodes, and—in people with bipolar disorder—to drop off in periods of remission. Even the stress of social rejection or isolation causes inflammation, leading to the fascinating idea that depression could be viewed as a physiological allergic reaction, rather than simply a psychological condition.
With this knowledge comes power: modulating the immune system to our advantage is a burgeoning field of research, particularly for cancer
Siddhartha ness..ie: perhaps preventative & detoxative – chip as rna ish – getting us back to a healthy us
Director, MIT Media Lab
We don’t know exactly how FMTs work, other than that the introduction of microbiota (poop) from a healthy individual somehow causes the gut of an afflicted patient to regain its microbial diversity and rein in the rampant Clostridium difficile.
Siddhartha ness..ie: perhaps preventative & detoxative – chip as rna ish
It appears that our gut microbes produce a wide variety of neurotransmitters that influence our brains, and vice versa, much more than previously believed. There is evidence that, in addition to mood, a number of brain disorders may be caused by microbial imbalance. The evidence is so strong that FMT banks such as OpenBiome have started screening donors for psychiatric problems in addition to a wide variety of health issues. Consequently, it is now harder to qualify as a donor to a fecal bank than it is to get into MIT or Harvard.
microbes more abundant in the human body than human cells,
There is increasing evidence that allergies and many modern ailments have come into existence only after the invention of modern hygiene.
microbes in the soil appear to be an essential part of the system
csu guy on dirt et al
systemic change. deep/quiet enough.
Research affiliate, MIT Media Lab
newly dominant approach, originally known as “neural networks,” is now branded “deep learning,” to emphasize a qualitative advance over the neural nets of the past. Its recent success is often attributed to the availability of larger datasets and more powerful computing systems,
imagining deep learning w/deep datasets… (small-world networks that matter – per choice – per whimsy)
with 7 billion people.
everyday. as the day.
So what is the magic that separates deep learning from the rest, and can crack problems for which no group of humans has ever been able to program a solution? The first ingredient, from the early days of neural nets, is a timeless algorithm, rediscovered again and again, known in this field as “back-propagation”. It’s really just the chain rule—a simple calculus trick—applied in a very elegant way. It’s a deep integration of continuous and discrete math, enabling complex families of potential solutions to be autonomously improved with vector calculus.
The key is to organize the template of potential solutions as a directed graph (e.g., from a photo to a generated caption, with many nodes in between). Traversing this graph in reverse enables the algorithm to automatically compute a “gradient vector,” which directs the search for increasingly better solutions. You have to squint at most modern deep learning techniques to see any structural similarity to traditional neural networks, but behind the scenes, this back-propagation algorithm is crucial to both old and new architectures.
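the "chain rule traversed in reverse over a directed graph" idea above can be sketched in a few lines. this is a toy reverse-mode autodiff, not any particular framework's API; the `Node` class and operator names are illustrative:

```python
# Minimal reverse-mode automatic differentiation ("back-propagation"):
# each node records its inputs together with a local gradient, and
# walking the graph in reverse accumulates d(output)/d(input).

class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # list of (node, local_gradient) pairs
        self.grad = 0.0

def add(a, b):
    return Node(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

def backprop(output):
    # topological order ensures each node's gradient is complete
    # before it is propagated to its parents
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(output)
    output.grad = 1.0
    for node in reversed(order):
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad   # the chain rule

# f(x, y) = (x + y) * y  =>  df/dx = y,  df/dy = x + 2y
x, y = Node(2.0), Node(3.0)
f = mul(add(x, y), y)
backprop(f)
print(f.value, x.grad, y.grad)   # 15.0 3.0 8.0
```

note how `y` feeds two nodes and its gradient accumulates from both paths — the same mechanism that makes the component reuse described below work.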
is that not locking us in .. to an agenda.. even if it’s ours..? ie: the need to change your mind everyday..
also thinking England‘s – can’t go in reverse if alive
The other key piece of magic in every modern architecture is another deceptively simple idea: components of a network can be used in more than one place at the same time.

Many of the most successful architectures of the past couple years reuse components in exactly the same patterns of composition generated by common “higher-order functions” in functional programming. This suggests that other well-known operators from functional programming might be a good source of ideas for deep learning architectures.

great if fractal ing… not so great if perpetuate/ing

so begs datasets be.. deeper.. deep enough

imagine…. deep enough for all of us.. ie: self talk as data… app/chip ness


The most natural playground for exploring functional structures trained as deep learning networks would be a new language that can run back-propagation directly on functional programs. … Grefenstette et al. recently published differentiable constructions of a few simple data structures (stack, queue, and deque), which suggests that further differentiable implementations are probably just a matter of clever math. Further work in this area may open up a new programming paradigm—differentiable programming. Writing a program in such a language would be like sketching a functional structure with the details left to the optimizer; the language would use back-propagation to automatically learn the details according to an objective for the whole program.

a new language.. or 7 bill idiosyncratic jargon/languages.. stacked et al.. the not knowing keeping us alive.. the having to get to know to know.. keeping us human/kind/antifragilite
Philosophy and Economic Theory, the New School for Social…
good on experimenting with decentralization.. toward a diff way to be..
but if focus is still on measured transactions et al.. not new really. no?
adding more here (in may) from Michel share:

RT @CaterinaRindi: @LaBlogga piece: #Crypto Enlightenment and the Social Theory of #Blockchains https://t.co/quhPQWdKCf #socialtheorists

Original Tweet: https://twitter.com/mbauwens/status/738277788969537536

allowing us to re-explore our reality, and specify it as more internally-determined
blockchain technology, which is a distributed ledger, a decentralized *computational memory of human interactions
not sure it has to be computational.. perhaps what’s messing us up..
The real invitation and potentiality of blockchain technology is to radically rethink reality – what is it to decentralize everything we do and reconstitute life through a frame of abundance and immanence, attending to what is possible and desirable mindfully, not merely a reaction to a reality which seems determined by scarcity.
we as individuals now taking self-responsibility for many activities such as *deciding what and how we consume news media, entertainment, financial services, (stock-trading, credit services, portfolio management), and health services. Next is economic and governance systems.
or deciding to engage from such things as stock trading credit services and portfolio management
Immanence is the idea of self-determination from within; everything comes from within in a system, world, or person; structure and content are emergent and not pre-specified. Immanence contrasts with transcendence where everything comes from outside a system, world, or person; pre-determining the system externally per fixed specifications.
Scientist; Inventor; Entrepreneur; Investor

Examining these advances collectively, the average elapsed time between key algorithm proposals and corresponding advances was about eighteen years, whereas the average elapsed time between key dataset availabilities and corresponding advances was less than three years, or about six times faster, suggesting that datasets might have been limiting factors in the advances. In particular, one might hypothesize that the key algorithms underlying AI breakthroughs are often latent, simply needing to be mined out of the existing literature by large, high-quality datasets and then optimized for the available hardware of the day. Certainly, in a tragedy of the research commons, attention, funding, and career advancement have historically been associated more with algorithmic than dataset advances.

or perhaps imagine.. .. we think more deeply about what data sets to focus on… deep enough for all of us.. ie: self talk as data… app/chip ness

.. For example, we might already possess the algorithms and hardware that will enable machines in a few years to author human-level long-form creative compositions, complete standardized human examinations, or even pass the Turing Test, if only we trained them with the right writing, examination, and conversational datasets. Additionally, the nascent problem of ensuring AI friendliness might be addressed by focusing on dataset rather than algorithmic friendliness—a potentially simpler approach.

indeed. simpler. and perhaps.. toward a better us.. rather than a machine passing turing test .. ness

Although new algorithms receive much of the public credit for ending the last AI winter, the real news might be that prioritizing the cultivation of new datasets and research communities around them could be essential to extending the present AI summer.

Archaeologist; Journalist; Author, Artifacts, Past Poetic
Psychologist & Computer Scientist; Engines for…

We are in the keyword stage of advertising. We are being told that this is science; IBM’s Watson is doing deep learning. Don’t be fooled. It is all keyword search and there is no science behind it. Directed advertising is all about keywords. Anything you type online is being tracked by a machine that can count. No science going on.

So what is the good news?

Having someone (or something) track you might not be such a bad thing. We like it when a map program knows where we are and we can figure out how to get where we are going. Many people like hook-up sites that tell you who is nearby whom you might like. But, here again, no science. There could be science. Hook-up sites might figure out whom you might like who is nearby and tell you what you have in common to discuss. Will this happen? We are not that far away from it. We would need a computer that knew about you the way a friend does (as opposed to your web-surfing habits).

or maybe better today… since most of us are not us..

perhaps just focusing on – self talk as data… app/chip ness

Let’s move on to something more serious. My stomach hurts. I tell this to my wife and she suggests a medicine in the cabinet that she remembers I have used before and reminds me that it helped. Now, suppose that this was not my wife but a computer? Is it an ad? Does it matter? Can we do this? Yes. AI technology could easily employ models of people and their needs. (But, today, we are busy with keywords.)

… It requires indexing stories the way people do to get reminded. We have programs that do this already. (But, sad to say, this is not on the agenda of commercial entities in AI just yet.)

Very soon AI programs will be good enough (not because they analyze keywords or do “deep learning”) but because they can model situations and can match situations to what people have said about those situations. Imagine a video database of hundreds of thousands of experts. Well, “how would I search through all those stories?” is the natural question. We ask that question because searching is an everyday activity now and it has taught us to believe in search, and everyone selling AI espouses the usefulness of keywords.

But it is not keywords that will cause this breakthrough. There is too much information to search through and often what we need isn’t there in the first place. But this is not actually a search problem. It is a problem not unlike getting the right ad to the right person at the right time. It is a question of getting computers to have a model of what you are doing, what your goals are, and matching that to what help they might have to offer.

rev of everyday life.. a story about people grokking what matters

… And, of course, they don’t have to actually be your friends. They can be the best and the brightest, pre-recorded and found with no effort just in time. We understand enough of the science to do this now. Maybe soon we will get tired of ads and start working on important things in AI.


The Dematerialization Of Consumption

Executive Creative Director and Vice-Chairman, OgilvyOne London; Vice-Chairman, Ogilvy & Mather UK; Columnist, the Spectator

More and more economic value is becoming entirely divorced from the physical attributes of a thing, and resides in intangibles.

The great thing about intangible value, I suppose, is that its creation involves very little environmental damage. It may help disabuse people of the belief that the only way to save the planet is for us to impoverish ourselves. What it may mean is that those same human qualities of status-rivalry and novelty-seeking which can be so destructive might be redirected even if they cannot be eliminated.


hope that’s not our best


Open Water–The Internet Of Visible Thought

Through the work of Mary Lou Jepsen, I was introduced to the potential of brain-reading devices and to the idea that patterns generated while watching a succession of very varied videos would provide the fundamental elements to connect thought to image.

And so, here we are: our thoughts themselves are about to take a leap out of our heads: from our brains to computer, to the Internet and to the world. We are entering the Age of Visible (and Audible) Thought. This will surely affect human life as deeply as any technology our imagination has yet devised or any evolutionary advance.

The essence of who we are is contained in our thoughts and memories, which are about to be opened like tin cans and poured onto a sleeping world.


The emergence of this suite of technologies will have enormous impact on the everyday ways we live and interact and can clearly transform, positively and negatively, our relationships, aspirations, work, creativity, techniques for extracting information.

Those not comfortable swimming in these transparent waters are not going to flourish. Perhaps we will need to create “swimming lessons” to teach us how to be comfortable being open, honest and exposed—that we can be ready to float and navigate in these waters of visible thought.

One major difference is that as thought becomes closer and closer to action, with shorter feedback loops accelerating change, time scales collapse and the cosy security blanket of a familiar slowness evaporates.
shorten time between intention and action – lag time – everyday. as the day.

Barriers between imagination and reality are about to burst open.

even more so.. and more human.. if we do this first: free art-ists.

for (blank)’s sake

a nother way


Datasets Over Algorithms

Scientist; Inventor; Entrepreneur; Investor


if only we trained them with the right writing, examination, and conversational datasets.

indeed, ie: self-talk as data




Joi Ito — Q: What scientific term or concept ought to be more widely known? A: Neurodiversity joi.ito.com/weblog/2017/01…


Physicist, UC Berkeley; Author, Physics for Future…
Writer; Speaker; Thinker, Copenhagen, Denmark

The word allostasis means a changing state, where homeostasis means staying in about the same state. The idea of allostasis is that the organism will change its inner milieu to meet the challenge from the outside. Blood pressure is not constant, but will be higher if the organism has to be very active and lower if it does not have to.

Constancy is not the ideal. The ideal is to have the relevant inner state for the particular outer state.

Associate Professor of Psychology, UC Berkeley
Theoretical physicist
This hierarchical sequence of strata, from low to high, is not exact or linear—other fields, such as computer science and environmental science, branch in and out depending on their relevance, and mathematics and the constraints of physics apply throughout. But the general pattern of emergence in a sequence is clear: at each higher level, new behavior and properties appear which are not obvious from the interactions of the constituent entities in the level below, but do arise from them.
Professor of Environmental Engineering, UNIST; Director,…
It is said in the Doctrine of the Mean, written by the grandson of Confucius, that the greatest knowledge, including both scientific concepts and human realizations, comes only from the everyday lives with an empty mind
Global Publishing Director, SAGE; Author, Intimacy:…
Professor of Physics, University of California, Santa Cruz
This creation of indexical information by pointing to what is important to us underlies many creative endeavors.
deep address ness.. toward hosting-life-bits via self-talk as data
Physicist, Director, MIT’s Center for Bits and Atoms;…

Research progress is commonly expected to meet milestones. But a milestone is a marker that measures distance along a highway. To find something that’s not already on the map, you need to leave the road and wander about in the woods beside it. The technical term for that is a biased random walk, which is how bacteria search for gradients in chemical concentrations. The historical lesson is just how reliable that random process of discovery is.
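the bacterial search strategy mentioned above (run-and-tumble chemotaxis) is easy to simulate; a minimal sketch, with the step size, bias rule, and concentration function all chosen for illustration:

```python
import random

def biased_random_walk(concentration, start=0.0, steps=1000, seed=1):
    """Run-and-tumble search: keep heading while the signal improves,
    tumble (pick a random direction) when it worsens."""
    rng = random.Random(seed)
    x = start
    direction = rng.choice([-1.0, 1.0])
    for _ in range(steps):
        new_x = x + 0.1 * direction
        if concentration(new_x) < concentration(x):
            direction = rng.choice([-1.0, 1.0])   # tumble
        x = new_x
    return x

# concentration peaked at x = 5: the walker drifts toward the peak
# without ever computing the gradient explicitly
peak = biased_random_walk(lambda x: -abs(x - 5.0))
print(round(peak, 1))
```

no step knows where the peak is; the bias emerges purely from tumbling more often when things get worse — which is the reliability of random discovery the paragraph above points to.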

The essential misunderstanding between scientists and non-scientists is the perception that scientific knowledge emerges in a stately procession of received knowledge. As a result, ambiguity isn’t tolerated, and changing conclusions are seen as a sign of weakness. Conversely, scientists shouldn’t defend their beliefs as privileged; what matters is not where they come from, but how they’re tested.

Science only appears to be goal-directed after the fact; while it’s unfolding it’s more like a chaotic dance of improvisation than a victory march. Fire away with your guesses, then be sure to aim.

Archaeologist; Journalist; Author, Artifacts, Past Poetic

So the potency of the edge of things, the not-quite-ness, appears to dwell in the poetry of ambiguity, but it chimes with so much of science, which dwells in the periphery and the stunning space of almost-ness.

As Turner suggested: “Prophets and artists tend to be liminal and marginal people, ‘edgemen,’ who strive with a passionate sincerity to rid themselves of the clichés associated with status incumbency and role-playing and to enter into vital relations with other men in fact or imagination. In their productions we may catch glimpses of that unused evolutionary potential in mankind which has not yet been externalized and fixed in structure.”

Media Analyst; Documentary Writer; Author, Throwing Rocks…

Chronobiology is the science of the biological clocks, circadian rhythms, and internal cycles that regulate our organs, hormones, and neurotransmitters.

Divorced from these natural cues, we experience the dis-ease of organ systems that have no way to sync up, and an increased dependence on artificial signals for when to do what. We become more at the mercy of artificial cues—from news alerts to the cool light of our computer screens—for a sense of temporality.

If we were to become more aware of chronobiology, we wouldn’t necessarily have to obey all of our evolutionary biases. Unlike our ancestors, we do have light to read at night, heat and air-conditioning to insulate ourselves from the cycle of the seasons, and 24-7 businesses that cater to people on irregular schedules. But we would have the opportunity to reacquaint ourselves with the natural rhythms of our world and the grounding, orienting sensibilities that come with operating in sync or harmony with them.

A rediscovery and wider acknowledgment of chronobiology would also go a long way toward restoring the solidarity and interpersonal connection so many of us are lacking. As we all became more aware and respectful of our shared chronobiology, we would be more likely to sync up or even “phase lock” with one another as well. This would allow us to recover some of the peer-to-peer solidarity and social cohesiveness that we’ve lost to a culture that treats time like a set of flashing numbers instead of the rhythm of life.
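The “phase lock” invoked here has a standard mathematical picture: the Kuramoto model of coupled oscillators, in which each oscillator is nudged toward the group's mean phase. The sketch below is a minimal illustration, not from the essay — the number of oscillators, frequency spread, and coupling strengths are assumed — showing that uncoupled oscillators drift independently while strong enough coupling pulls them into near-synchrony, as measured by the order parameter r (0 = incoherent, 1 = locked):

```python
import math
import random

def kuramoto(K, n=50, steps=4000, dt=0.01, seed=0):
    """Simulate n mean-field-coupled oscillators; return final coherence r."""
    rng = random.Random(seed)
    omega = [rng.gauss(0.0, 0.5) for _ in range(n)]          # natural frequencies
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    for _ in range(steps):
        # Mean field: r is the coherence, psi the average phase.
        sx = sum(math.cos(t) for t in theta) / n
        sy = sum(math.sin(t) for t in theta) / n
        r, psi = math.hypot(sx, sy), math.atan2(sy, sx)
        # Each oscillator is pulled toward the mean phase with strength K*r.
        theta = [t + dt * (w + K * r * math.sin(psi - t))
                 for t, w in zip(theta, omega)]
    sx = sum(math.cos(t) for t in theta) / n
    sy = sum(math.sin(t) for t in theta) / n
    return math.hypot(sx, sy)
```

With coupling K = 0 the coherence stays low; raise K well past the critical value and the population phase-locks — a cartoon of what shared temporal cues do for a community.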


Technology Forecaster; Consulting Associate Professor,…
Managing Director, Excel Venture Management; Co-author (…
The fundamental question paleoneurology seeks to address, “How do brains change over time?,” goes straight to the core of why we are human.
Paleoneurology should retool itself to focus on changes occurring in far shorter timespans: on the rapid rewiring that may underlie explosions of autism, and on the impacts of drastic changes in diet, size, and weight. We need a historical context for the evolution that occurs as our core brain inputs shift from observing nature to reading pages and then digital screens. We have to understand what happens when brains that evolved around contemplation, observation, and boredom, interrupted by sudden violence, are now bombarded from every direction as our phones, computers, tablets, TVs, tickers, ads, and masses of humans demand immediate assessment and response. We are de facto outsourcing and melding parts of our memories with external devices, like our PDAs.
Chan Soon-Shiong Professor of Medicine, Columbia University…
Author and Essayist, New York Times. New Yorker, Slate;…

Science is supposed to be about an objective world. Yet our observations are inherently subjective, made from a particular frame of reference, a particular point of view. How do we get beyond this subjectivity to see the world as it truly is?

Through the idea of invariance. To have a chance of being objective, our theory of the world must at least be intersubjectively valid: It must take the same form for all observers, regardless of where they happen to be located, or when they happen to be doing their experiment, or how they happen to be moving, or how their lab happens to be oriented in space (or whether they are male or female, or Martian or Earthling, or…). Only by comparing observations from all possible perspectives can we distinguish what is real from what is mere appearance or projection.
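The invariance at stake is concrete. Under a Lorentz boost, an event's time and position coordinates both change from observer to observer, but the spacetime interval s² = c²t² − x² comes out the same for all of them. A quick numerical check, sketched in units where c = 1 (the particular event and velocities are arbitrary choices for illustration):

```python
import math

def boost(t, x, v):
    """Lorentz boost along x with velocity v, in units where c = 1."""
    g = 1.0 / math.sqrt(1.0 - v * v)   # Lorentz factor gamma
    return g * (t - v * x), g * (x - v * t)

def interval(t, x):
    """Spacetime interval s^2 = t^2 - x^2 (c = 1): the invariant quantity."""
    return t * t - x * x

# One event, as seen from frames moving at different velocities:
t, x = 3.0, 1.0
s2 = interval(t, x)
boosted_intervals = [interval(*boost(t, x, v)) for v in (0.0, 0.5, 0.9, -0.99)]
```

Each boost reshuffles t and x — simultaneity is frame-dependent — yet every entry in `boosted_intervals` equals s² to within floating-point error. That agreement across all perspectives is what marks the interval as real rather than mere appearance.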

deep enough ness – problem that would resonate with 7 bn today..

So why aren’t we hearing constantly about Einstein’s theory of invariance? Well, “invariant theory” is what he later said he wished he had called it. And that’s what it should have been called, since invariance is its very essence. The speed of light, the laws of physics are the same for all observers. They’re objective, absolute—invariant. Simultaneity is relative, unreal.

But no. Einstein had to go and talk about the “principle of relativity.” So “relativity”—and not its opposite, “invariance”—is what his revolutionary theory ended up getting labeled. Einstein’s “greatest blunder” was not (as he believed) the cosmological constant after all. Rather, it was a blunder of branding—one that has confused the public for over a century now and empowered a rum lot of moral relativists and lit-crit Nietzscheans.

Associate Professor, MIT Media Lab; Author, Why Information…

In physics we say a system is in a critical state when it is ripe for a phase transition.
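A standard illustration of a critical state is site percolation on a square grid: below a critical occupation probability (about 0.593 for the 2-D square lattice) almost no open cluster spans the grid, and above it one almost always does. The toy simulation below is a sketch — grid size and trial count are arbitrary choices — showing the sharpness of the transition:

```python
import random

def percolates(grid):
    """Does an open cluster connect the top row to the bottom row? (DFS)"""
    n = len(grid)
    stack = [(0, c) for c in range(n) if grid[0][c]]
    seen = set(stack)
    while stack:
        r, c = stack.pop()
        if r == n - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False

def spanning_prob(p, n=30, trials=40, seed=0):
    """Fraction of random n-by-n grids (sites open with prob. p) that span."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += percolates(grid)
    return hits / trials
```

Well below the threshold the spanning probability is essentially 0; well above it, essentially 1. Near p ≈ 0.593 the system is critical: a tiny change in p flips the global behavior, which is exactly what "ripe for a phase transition" means.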

As Richard Feynman said repeatedly, “The imagination of nature is often larger than that of man.” So maybe our obsession with individual narratives is nothing but a reflection of our limited imagination. Going forward, we need to remember that systems often make individuals irrelevant. Just as none of your cells can claim to control your body, society also works in systemic ways.

Physician and Social Scientist, Yale University; Co-author…

My reason for thinking that this concept ought to be more widely known is that equipoise carries with it aspects of science that remain sorely needed these days. It connotes judgment—for it asks what problems are worthy of consideration. It connotes humility—for we do not know what lies ahead. It connotes open vistas—because it looks out at the unknown. It connotes discovery—because, whatever way forward we choose, we will learn something. And it connotes risk—because it is sometimes dangerous to embark on such a journey.

Equipoise is a state of hopeful ignorance, the quiet before the storm of discovery.

Author, The Most Human Human; Coauthor (with Tom Griffiths…

Computer scientists speak of the “explore/exploit” tradeoff between spending your energy experimenting with new possibilities and spending it on the surest bets you’ve found to date. One of the critical results in this area is that, in problems of this type, few things matter so much as where you find yourself on the interval of time available to you.

The odds of making a great new discovery are highest the greener you are—and the value of a great discovery is highest when you’ve got the most time to savor it. Conversely, the value of playing to your strengths, going with the sure thing, only goes up over time, both as a function of your experience and as a function of time growing scarce. This naturally puts all of us, then, on an inevitable trajectory: from play to excellence, from craving novelty to preferring what we know and love. *The decision-making of the young—whether it’s who to spend time with, where to eat, or how to work and play—really should differ from the decision-making of the old.
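That trajectory can be wired directly into an algorithm. In the multi-armed-bandit sketch below — a minimal illustration; the arms, noise level, and the linear exploration schedule are all assumptions, not anything from the essay — the exploration rate is simply the fraction of time remaining, so the agent samples widely when "young" and exploits its best-known option as the horizon closes:

```python
import random

def run_bandit(arm_means, horizon, seed=0):
    """Epsilon-greedy bandit whose exploration rate decays with time remaining.

    Returns the average reward per pull over the whole horizon.
    """
    rng = random.Random(seed)
    counts = [0] * len(arm_means)
    estimates = [0.0] * len(arm_means)
    total = 0.0
    for t in range(horizon):
        eps = (horizon - t) / horizon                  # explore early, exploit late
        if rng.random() < eps:
            arm = rng.randrange(len(arm_means))        # explore: try anything
        else:
            arm = estimates.index(max(estimates))      # exploit: best known arm
        reward = arm_means[arm] + rng.gauss(0.0, 0.1)  # noisy payoff
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total += reward
    return total / horizon
```

Early pulls look wasteful in isolation, but they buy the accurate estimates that make late-stage exploitation pay — the formal version of moving from craving novelty to preferring what we know and love.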

huge because we’ve perpetuated a system to suck out the benefits of neotony.. so we get neither of *these..

Artist; Composer; Recording Producer: U2, Coldplay, Talking…

The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway.