intro’d to Robert here:
via fb share by Sugata
Jay found this. I think this article is very important. Whether the premise is right or wrong. It is also… fb.me/5UXxftn15
The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.
Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer.
Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.
Senses, reflexes and learning mechanisms – this is what we start with, …
But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.
We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.
Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organised into small chunks (‘bytes’).
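A minimal sketch (Python, my choice; the example string is illustrative) of the encoding the excerpt describes: text becomes bytes, and each byte is a pattern of eight ones and zeroes.

```python
# The string as a sequence of bytes, then each byte as eight bits.
text = "dollar"
data = text.encode("ascii")                 # bytes: small chunks of 8 bits
bits = " ".join(f"{b:08b}" for b in data)   # each byte shown as ones and zeroes
print(list(data))   # the byte values
print(bits)         # the same bytes as bit patterns
```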
Computers, quite literally, move these patterns from place to place in different physical storage areas etched into electronic components. … The rules computers follow for moving, copying and operating on these arrays of data are also stored inside the computer. Together, a set of rules is called a ‘program’ or an ‘algorithm’. A group of algorithms that work together to help us do something (like buy stocks or find a date online) is called an ‘application’ – what most people now call an ‘app’.
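A toy sketch of the excerpt's vocabulary, with names of my own choosing: ‘storage areas’ as lists, a ‘rule’ for copying a pattern between them, applied by a stored program.

```python
# Two 'storage areas' holding bit patterns.
source_store = [1, 0, 1, 1]
target_store = []

def copy_pattern(src, dst):
    """Rule: place an exact, bit-for-bit copy of the pattern in another store."""
    dst.extend(src)

copy_pattern(source_store, target_store)
print(target_store == source_store)  # True: the copy is exact
```

The point the article leans on later is exactly this exactness: the copy is identical and stays identical until something overwrites it.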
Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.
Each metaphor reflected the most advanced thinking of the era that spawned it. Predictably, just a few years after the dawn of computer technology in the 1940s, the brain was said to operate like a computer, with the role of physical hardware played by the brain itself and our thoughts serving as software. The landmark event that launched what is now broadly called ‘cognitive science’
Propelled by subsequent advances in both computer technology and brain research, an ambitious multidisciplinary effort to understand human intelligence gradually developed, firmly rooted in the idea that humans are, like computers, information processors. This effort now involves thousands of researchers, consumes billions of dollars in funding, and has generated a vast literature consisting of both technical and mainstream articles and books.
The information processing (IP) metaphor of human intelligence now dominates human thinking, both on the street and in the sciences. There is virtually no form of discourse about intelligent human behaviour that proceeds without employing this metaphor, just as no form of discourse about intelligent human behaviour could proceed in certain eras and cultures without reference to a spirit or deity. The validity of the IP metaphor in today’s world is generally assumed without question.
But the IP metaphor is, after all, just another metaphor – a story we tell to make sense of something we don’t actually understand. And like all the metaphors that preceded it, it will certainly be cast aside at some point – either replaced by another metaphor or, in the end, replaced by actual knowledge.
Specifically, her brain was changed in a way that allowed her to visualise a dollar bill – that is, to re-experience seeing a dollar bill, at least to some extent.
Chemero and others describe another way of understanding intelligent behaviour – as a direct interaction between organisms and their world.
not algorithmic.. not predictable.. so best to facilitate/listen to map.. already in each one..
Because neither ‘memory banks’ nor ‘representations’ of stimuli exist in the brain, and because all that is required for us to function in the world is for the brain to change in an orderly way as a result of our experiences, there is no reason to believe that any two of us are changed the same way by the same experience.
Those changes, whatever they are, are built on the unique neural structure that already exists, each structure having developed over a lifetime of unique experiences.
This is perhaps the most egregious way in which the IP metaphor has distorted our thinking about human functioning. Whereas computers do store exact copies of data – copies that can persist unchanged for long periods of time, even if the power has been turned off – the brain maintains our intellect only as long as it remains alive.
Think how difficult this problem is. To understand even the basics of how the brain maintains the human intellect, we might need to know not just the current state of all 86 billion neurons and their 100 trillion interconnections, not just the varying strengths with which they are connected, and not just the states of more than 1,000 proteins that exist at each connection point, but how the moment-to-moment activity of the brain contributes to the integrity of the system.
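Back-of-envelope arithmetic on the numbers in that passage. The bytes-per-connection figure below is my illustrative assumption, not a measurement; the point is only the order of magnitude.

```python
# Rough scale of the data in the passage: 86 billion neurons,
# 100 trillion connections, >1,000 proteins per connection point.
neurons = 86_000_000_000
synapses = 100_000_000_000_000
bytes_per_synapse = 4  # assume just one 4-byte 'strength' value per connection

total_bytes = synapses * bytes_per_synapse
print(total_bytes / 1e12, "terabytes just to record one number per connection")
```

Even this deliberately minimal accounting (ignoring the proteins and the moment-to-moment dynamics entirely) lands in the hundreds of terabytes.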
In a recent op-ed in The New York Times, the neuroscientist Kenneth Miller suggested it will take ‘centuries’ just to figure out basic neuronal connectivity.
perhaps that’s not something doable.. and perhaps it’s not even helpful .. if we could.. because if we figure out a pattern.. then in essence… brain is dead..
like what a raised eyebrow does to a person..
in critical response to above article – yes brain is computer.. by Jeffrey Shallit
Here are just a few of the silly claims by Epstein, with my commentary:
“But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently.”
— Well, Epstein is wrong. We, like all living things, are certainly born with “information”. To name just one obvious example, there is an awful lot of DNA in our cells. Not only is this coded information, it is even coded in base 4, whereas modern digital computers use base 2 — the analogy is clear. We are certainly born with “rules” and “algorithms” and “programs”, as Francis Crick explains in detail about the human visual system in The Astonishing Hypothesis.
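A sketch of Shallit's base-4 point: each DNA base is one of four symbols, so it carries exactly two bits, and a strand maps directly onto binary. The particular base-to-bits mapping below is an arbitrary illustrative choice.

```python
# Four symbols (base 4) -> two bits each (base 2).
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}

def dna_to_binary(strand):
    """Re-encode a DNA strand as a binary string, two bits per base."""
    return "".join(BASE_TO_BITS[b] for b in strand.upper())

print(dna_to_binary("GATTACA"))  # 14 bits: two per base
```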
“We don’t store words or the rules that tell us how to manipulate them.”
— We certainly do store words in some form. When we are born, we are unable to pronounce or remember the word “Epstein”, but eventually, after being exposed to enough of his silly essays, suddenly we gain that capability. From where did this ability come? Something must have changed in the structure of the brain (not the arm or the foot or the stomach) that allows us to retrieve “Epstein” and pronounce it whenever something sufficiently stupid is experienced. The thing that is changed can reasonably be said to “store” the word.
“Computers do all of these things, but organisms do not.”
— No, organisms certainly do. They just don’t do it in exactly the same way that modern digital computers do. I think this is the root of Epstein’s confusion.
engineering.mit.edu/…/can-computer-generate-t… – Massachusetts Institute of Technology
Nov 1, 2011 – The results may be sufficiently complex to make the pattern difficult to identify, but because it is ruled by a carefully defined and consistently repeated algorithm, the numbers it produces are not truly random. “They are what we call ‘pseudo-random’ numbers,” Ward says.
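What the quote calls ‘pseudo-random’ can be shown in a few lines: a linear congruential generator, a fixed rule whose output only looks random. The constants are the widely used Numerical Recipes parameters; the seed value is arbitrary.

```python
# A deterministic rule: the same seed always yields the same 'random' sequence.
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    state = seed
    while True:
        state = (a * state + c) % m
        yield state

gen1, gen2 = lcg(42), lcg(42)
first = [next(gen1) for _ in range(3)]
second = [next(gen2) for _ in range(3)]
print(first == second)  # True: identical seeds, identical sequences
```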
2016 – Google Will Steal This Election & How – Dr. Robert Epstein Interview [ep 7]
i have more reading to do here
Robert Epstein (born June 19, 1953) is an American psychologist, professor, author, and journalist. He earned his Ph.D. in psychology at Harvard University in 1981, was editor in chief of Psychology Today, a visiting scholar at the University of California, San Diego, and the founder and director emeritus of the Cambridge Center for Behavioral Studies in Concord, MA.
Epstein is also a scholar in the field of psychological maturity, and once published an online maturity test. He is a strident critic of what he sees as the “artificial extension of childhood” over the past century, arguing that what society sees as the “teen brain” is often the result of Western cultural factors and infantilization, rather than a set of brain characteristics that are inherent in all humans throughout their teen years. In certain essays, he has cited studies which found that some teenagers are in some ways more developmentally mature than most adults, and advocates giving young people more adult responsibility, as well as placing them in environments in which they will not be prone to socializing simply with other teenagers.
indeed – artist ness