For most of the 20th century, business leaders relied on “scientific” studies and “statistical significance” to determine what information they could trust. Now, technology is making those assumptions obsolete and the practice of management will never be the same.
In business, we often run into very similar problems. We need to make decisions based on incomplete information in a rapidly changing context. So, not surprisingly, Gauss’s work has formed the basis of many of the statistical techniques, such as regression analysis, that modern-day management employs to make sense of a messy world.
Before long, Fisher’s methods were adopted by business, culminating in the Six Sigma movement, which purported to achieve stable and predictable results. Much as in Fisher’s earlier efforts, it was thought that by controlling every aspect of the process, uncertainty could be tamed and management could be transformed from an art into a science.
Rules For Control
Underlying his methods was an emphasis on controls. Put good data in and you would get good answers out.
Yet all was not well. Many, Nassim Taleb in particular, argued that control was a dangerous illusion. Anything that met a basic standard of statistical significance (usually 95% confidence) was treated as fact. False certainty led managers to discard inconvenient information as “outliers,” often with disastrous results.
Hovering in the background all this time was an alternative approach called Bayesian inference, which allowed you to simply make a guess and then revise your judgement as new information came in. It was, in many ways, the polar opposite of Fisher’s approach. No specific controls, no rules about significance, just an updating of probabilities.
Although Bayesian methods were successful in some important cases where controlled studies weren’t an option, such as hunting German subs during World War II, they weren’t widely deployed. Part of the reason was that Fisher and his followers fought hard against them, but mostly it was because they were impractical. It was hard to gather enough data to make them work.
That’s what big data is starting to change. The combination of accelerating returns in storage and processing power, along with a sea of data from the Web of Things and increasingly efficient algorithms, is making Bayesian methods not only practical, but faster, cheaper and more accurate than the traditional approach.
Like any new technology, there is a lot of confusion surrounding big data. There are endless debates about what is and isn’t big data, armies of consultants eager to muddy the waters in return for a hefty retainer fee, and the usual hype and alphabet soup of acronyms and buzzwords.
But what you really need to know about big data is this: It represents a fundamental shift in how we do things. In effect, big data opens the door to a Bayesian approach to strategy where…
we no longer try to be “right” based on controlled research and small samples, but rather become less wrong over time as real world information floods in.
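That “less wrong over time” loop is just Bayesian updating. Here is a minimal sketch in plain Python (the beta-binomial case, with invented observations, not any particular company’s data): start with a guess, fold in each new piece of evidence, and let the estimate sharpen.

```python
# Start with a uniform Beta(1, 1) prior over an unknown success rate,
# then revise it as each real-world observation arrives.
def update(alpha, beta, observations):
    for success in observations:
        if success:
            alpha += 1
        else:
            beta += 1
    return alpha, beta

def posterior_mean(alpha, beta):
    # The point estimate implied by the current state of belief.
    return alpha / (alpha + beta)

# A guess first, then revision as data floods in: no fixed sample size,
# no significance threshold, just probabilities that sharpen over time.
a, b = update(1, 1, [True, False, True, True, False, True])
print(posterior_mean(a, b))  # 0.625
```

No controls, no 95% cutoff; each observation simply shifts the probabilities, and being wrong early is cheap because the next data point corrects it.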
unless perhaps we do the unthinkably ridiculous… and take money (in whatever form) out of the picture entirely. it seems it would take a thunderclap of sync in an entire ecosystem.. which could then perpetuate – quite rapidly – to the entire world.
sunday, august 5, 201
want to look through these..
Indy Johar (@indy_johar)
great wrk by @Digitaltonto congrats! “@timekord: Collection of great posts by @Digitaltonto vsb.li/kY0wxo – who is 3 years old today”
saturday, august 18, 2012
sunday, august 26, 2012
Saul Kaplan (@skap5)
Disruption is relative. Nice post by @Digitaltonto ow.ly/1OsOdt
Jose Baldaia (@Jabaldaia)
RT @Digitaltonto: 5 Super Cool Future Technologies p.ost.im/p/dRsVSf
Greg Satell (@Digitaltonto)
RT @mylesbristowe: “I know why the average lifespan of a CMO is only about 22 months.” gag.gl/wmHr
very good points here
_______________
tuesday, november 27, 2012
now imagine .. we unleashed the *impoverished in school bldgs. everywhere, ny, la, Kampala, Jackson, Vegas, Perth, …
and come to find out… every one of them has some neglected genius. every one.
now innovation, crazy, unthinkable, … is the norm.
isn’t that how learning goes..? authentic learning, by every person ..unleashed to their authenticity.
*ie: those labeled unfit.. or labeled.. exceptional, exceedingly bright…
yet if you listen to each one of them… each one… they are craving more.
imagine.. city as school… where no genius hides or dies, because we’re finally listening to every soul..
then, perhaps, we cease to create/manufacture problems we seem to have no time to solve.. and in good health, we do/ be/become work/play that matters.
I think we could do that on a multi trillion dollar budget. no?
problem today… we’re using that money to perpetuate and /or bandaid-ize problems, rather than not create them in the first place.
how to get from here to there…?
we have brilliant people in school blds, 7 hrs a day, 5 days a week… they could pull us out… back to us… if we set them free.. even just for one year.
we should try.
we should not not try.
wednesday, june 27, 2012
Greg Satell (@Digitaltonto)
3 Fundamental Shifts in the Basis of Competition p.ost.im/p/egMU8k
incredible insight.. want to dig into the links more..
absolutely love – that it’s more about emergence than deduction..
that rings of perpetual beta to me.. prototyping.. doing.
becoming – ongoingly.
the iterations creating the emergence – as the gold.
Greg Satell (@Digitaltonto)
The Changing Game of Strategy p.ost.im/p/esp3a8
love the quote from zuckerberg in this one:
it’s not about working to get the money – it’s about getting the money to work.
smacks of gansky’s mesh – future of business is sharing.
talking with kids – some of them see the term fundraising – as truly – time/talent share.. where most of us older people see it as selling butterbraids or whatever – things we don’t really need – and you don’t really get all the money for.
we are in a hotbed currently.. the timing is just so.. the tech, the youth, the dissatisfaction – we can turn this on a dime if we start asking the right questions.. if we start being about freeing people up – to start using their heads.. thinking for themselves… being mindful.. being..
no?
thank you greg..
_________
Platforms Are Eating The World http://t.co/eUTQLUXLTC
Original Tweet: https://twitter.com/Digitaltonto/status/556477912682340352
thinking of this while reading commons transition.. and wondering about replacement vs abandonment.. of ie: policy, thinking, platforms..
july 2015 – on america as exceptional
Normally, a letter from some immigrant scientists would not reach the President’s desk, but Einstein’s signature carried a lot of weight. The President ordered the idea studied and determined that it required action. As luck would have it, an engineer from MIT, named Vannevar Bush, was also in the process of selling Roosevelt an idea.
Bush’s brainchild was the OSRD (Office of Scientific Research and Development), which would capitalize on America’s newfound talent to conduct scientific research to support the war effort. Bush would run it and report only to the President. His proposal was approved and given almost unlimited resources and funding.
The OSRD was an unparalleled success. In addition to the Manhattan Project that built the atomic bomb, it also developed a number of other innovations that contributed greatly to the war effort, including the proximity fuze and radar. Perhaps most importantly, it forever changed how science was funded and undertaken in the United States.
perhaps how we could/should treat Neil g ness et al.
but also… as the day
Today, virtually every digital device in the world is based on the von Neumann architecture and America dominates the global market for digital technology, estimated to be worth nearly $4 trillion a year. We are also leaders in every other advanced area conceivable, from medical research to nanotechnology to energy.
Throughout our history it has been our openness—to new people as well as new ideas—that made it all happen.
You can see why Lepore is concerned. A “pack of ravenous hyenas” wildly intent on “breaking shit” certainly does sound menacing, especially one empowered by a staid Harvard professor. Visions of the 60’s counterculture abound. However, Lepore does Christensen’s work a tremendous disservice.
A more thorough reading of his work would reveal that he wasn’t, in fact, advocating for the destruction of the corporate order, but trying to save it. His research showed that once-successful firms often failed not because they lacked competence or conscientiousness, but because they were operating according to a defective model.
In essence, Christensen’s point was that businesses that fail are often not the feckless bumblers they’re made out to be. Rather, by diligently following the precepts of an incomplete model, they fall prey to assumptions that do not apply. In other words, he was helping managers recognize shifts in the marketplace and make better judgments about how to respond.
So Christensen’s work was in no way an attack, but in fact an acknowledgement of the shortcomings of his own profession. He argued that by diligently following precepts of incomplete models taught in business schools, managers often fall prey to assumptions that do not apply.[..]
Today, as Moisés Naím points out in The End of Power, we see a great many disruptions in politics, religion, and military affairs. At first glance these may seem to have little to do with disruptive business models, but if we look closer we find that many of the same forces are at work: small groups, loosely connected and united by a shared purpose, can challenge even the most powerful among us.
In the past, bureaucratic institutions played a crucial coordinating role. It was through their vast control of assets that we were able to mobilize resources on a massive scale. Large institutions dominated because they could do what others could not. Yet now it is not control of resources that is important, but access to them.
Digital technology enables relatively small actors to synchronize their actions through networks. That, in a nutshell, is how disruption happens. It is also why we see disruption happening with increasing frequency, all around us. Power, as Naím puts it, has become easier to get, but harder to use or keep.
ok. so reading naim‘s book
Consider the case of the Occupy movement, which sought to disrupt the existing order by advocating for the “99%” against the “1%.” Yet despite the populist theme, their rhetoric and tactics were far too extreme for most people. Perhaps not surprisingly, the movement quickly died out, although its basic theme about inequality lives on.
Now look at Otpor, a similar movement that overthrew Milošević in Serbia and served as the prototype for modern political revolutions across Europe, the Middle East and Central Asia. They designed their protests to appeal to the masses and garner support. Instead of insisting on ideological purity, they celebrated diversity in approaches to shared values and purpose.
a nother way… 7 bill in charge of the day
Successful disruptors might break old models, but they build better ones that benefit us all, which is why we embrace, rather than fear, them.
So what really needs to change is not how we describe our organizations, but the role of leaders within them. Whereas before, it was the role of managers to direct work, in a connected age we need to instill passion and purpose around a shared mission. The networking, if encouraged and not inhibited, will take care of itself.
It is through learning the stories of technology that new chapters are written.
[..]We don’t revere Steve Jobs because he “made” technology, but for what he revealed that no one else saw. While others would see features, he saw stories about the people who use technology.[..]Technology, when properly understood, is far more than the product of algorithms, microscopes, test tubes and other apparatus, but the revealing of truths in the service of human life.
..now it is not control of resources that is important, but access to them.
Consider the case of the Occupy movement, which sought to disrupt the existing order by advocating for the “99%” against the “1%.” Yet despite the populist theme, their rhetoric and tactics were far too extreme for most people. Perhaps not surprisingly, the movement quickly died out, although its basic theme about inequality lives on.
oh the irony… 99 and 1..is extreme… about most people…
Apple strives to make products that are “insanely great.” Google wants to “organize the world’s information.” Tesla aims to “accelerate the advent of sustainable transport.” …
So the question is not whether disruption itself is good or bad, but disruption in the service of what? …Successful disruptors might break old models, but they build better ones that benefit us all, which is why we embrace, rather than fear, them.
jan 2016 – on accelerating innovation
For most of its history, the United States was a backwater. At the turn of the 20th century, a promising young student would often need to go to Europe to get an advanced degree in science from a world class institution. Perhaps not surprisingly, the inventions that drove the industrial age (the steam engine, internal combustion and electricity) all came from Europe.
The balance began to tip in the runup to World War II. As the backlash to “Jewish physics” in Europe grew, top minds like Einstein, von Neumann and Fermi migrated to the US. It was our openness—to new people and new ideas—that made America exceptional. With Europe in turmoil, America attracted the greatest collection of scientific talent the world had ever seen.
and prior to… needing academia, govt/public, and private corp/capital to come together… which often takes yrs
In 1940, after Germany invaded France, Vannevar Bush went to President Roosevelt with a vision to combine government funding with private enterprise to mobilize America’s latent scientific resources. Under his plan, public grants for defense research would be issued to private institutions in order to accelerate progress.
Bush’s plan was, of course, a historic success and, as the war was coming to a close, Roosevelt asked him to issue a report recommending how the wartime efforts could be replicated in peacetime. That report, Science, the Endless Frontier, became the blueprint for America’s technological dominance.
It was a unique vision. An economist might say that Bush was addressing the problem of appropriability. The benefits of basic science, which have no immediate application, can only be appropriated by society as a whole. However, the practical applications that discoveries make possible represent a clear profit motive that is best pursued by private enterprise.
Yet still, despite the success of the Bush architecture, it’s hard for many to get their heads around it. It lacks strategy and direction. That’s probably why other countries have consistently gone another way.
To understand how different the United States is, it’s helpful to look at how computer technology was developed. For example, the first digital computer was not, as many believe, invented in the US, but in the United Kingdom during World War II. Alan Turing, a British scientist, is still considered the father of modern computing.
However, considering the machine a military secret, Winston Churchill ordered it destroyed. Then they locked Turing away in a physics lab to work in relative isolation. Later, after a conviction for homosexual acts, he had to endure a harsh sentence of chemical castration, which led to his suicide. That’s how the British killed their technology industry.
The French, for their part, recognized the value of computer technology and launched a national effort in the sixties called Plan calcul to develop advanced mainframes. The Japanese, through its legendary Ministry of International Trade and Industry (MITI), invested heavily in semiconductors. Both programs had early successes and then faded away.
The American approach was far different. Government funding led to the IAS machine, but the technology was widely shared and largely driven by industry. Later, government grants helped lead to microchips, the Internet and other advances, but the application of those discoveries was largely left up to entrepreneurs, rather than bureaucrats. The results speak for themselves.
whoa to history.. and to turing
So we desperately need to update the initial vision. We can no longer assume that we can separate things into neat little categories like basic and applied research and hope that they meet up somewhere down the line. At the same time, with the world moving as fast as it does now, private industry needs help to stay abreast of it all.
Perhaps most of all, we need to recognize that people like Steve Jobs and Elon Musk don’t succeed on their own. Today, we live in a world of the visceral abstract, in which our most important products come from the most unlikely ideas. While we lionize our great entrepreneurs—and rightly so—we cannot neglect the science that makes them possible.
The results are clear. We now rank 25th and 17th, respectively, in math and science. We are 34th in infant mortality, 38th in life expectancy and 16th in higher education. We’ve fallen to 4th in manufacturing competitiveness and 22,000 Americans die annually due to lack of health insurance. These are facts, they are undeniable and they matter.
using some of these metrics as metrics is perpetuate\ing not us ness..
The modern world is a place where ideas are far more than mere personal beliefs, they affect our lives. Ignorance is not a condition, but a willful choice and it has its price.
imagining a nother way ..
@digitaltonto
Marketers Need To Shift From Collecting Eyeballs To Building Interfaces – digitaltonto.com/2016/marketers…
The One Thing Nobody Ever Tells You About What It Takes To Succeed – digitaltonto.com/2016/the-one-t…
Another way in which IBM collaborates is through its longstanding commitment to open technology. While it often develops new technology on its own, it actively participates in open source communities like Linux and Apache, so that tens of thousands of outside developers can improve on them, freeing up resources that IBM can then devote to solving its clients’ problems directly.
The company also often contributes patents to open source foundations, to protect the technology. So IBM’s patent leadership has both an offensive and defensive function. It helps create new businesses, but also helps give it freedom of action and avoid patent infringement dustups like the one that’s been raging between Apple and Samsung.
When you start counting patents in the thousands, it’s easy to get lost in the numbers, but the important thing to remember is that behind each patent is a new discovery that has the potential to lead to a new business. “We want to define the future, rather than become a victim of it,” says Meyerson.
That’s easier said than done, but history would seem to be on their side.
then … history…?
that’s why inequity still exists… even more distinct…?.
from link embedded above:
Alas, it wasn’t a completely happy story for Big Blue. One of the companies that IBM outsourced to, Microsoft, was able to dominate the industry through its control of the operating system. While IBM created the PC revolution, it was Microsoft that ended up taking the bulk of the profits.
That was a hard lesson, but IBM seems to have learned it.
under heading – evolution of big data:
Data, it’s been said, is the plural of anecdote. People naturally catalogue specific events until they become noticeable trends. Unfortunately, the scope of human experience is relatively small. We can be in only one place at a time and there are only 24 hours in a day. So it’s hard for a manager of an office in Boston to know if his colleague in San Diego sees what he sees.
That’s essentially the problem that computerized databases are designed to solve. At first, they could only store fairly simple information. Yet soon, more sophisticated relational databases were developed that could retrieve and analyze data much more efficiently, enabling data mining and more sophisticated analysis. Still, challenges remained.
One was that data needed to be housed in a single location. Another was that while databases worked well with information that was formally structured, like point of sale records, they couldn’t handle unstructured data, like Word documents or scientific papers. In effect, anecdotes could only become data if they were properly processed.
Hadoop, an open source technology created in 2005, solved both of these problems. It allows us to store and pull data, whether structured or unstructured, from many locations at once. That’s what enabled the era of big data, creating an integrated environment from which events—or anecdotes if you will—can be widely aggregated and analyzed.
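This isn’t Hadoop’s actual API, but the map/reduce pattern it is built on can be sketched in a few lines of plain Python (the shard contents here are invented examples): independent map work over pieces of raw text, wherever they happen to be stored, then a reduce step that aggregates the partial results.

```python
from collections import Counter

# Map/reduce in miniature: each shard of raw, unstructured text is
# processed independently (the map step), and the per-shard results
# are then merged into one view (the reduce step).
shards = [
    "the proximity fuze and radar",
    "radar changed the war",
]

def map_shard(text):
    # Count words in one shard in isolation.
    return Counter(text.split())

def reduce_counts(partials):
    # Merge partial counts from every shard into a single total.
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

counts = reduce_counts(map_shard(s) for s in shards)
print(counts["radar"])  # 2
```

The point of the pattern is that the map step never needs the whole data set in one place, which is exactly what lets anecdotes scattered across many locations be aggregated into data.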
fine… unless data itself is not real.. esp if it’s esp not real
ie:from… science of people in schools et al
this (embedded) article from 2015 – under heading – os for data:
As transformational as Hadoop has been, issues remain. Although it is an incredibly effective filing system for storing data, it is less adept at retrieving it. To analyze information in Hadoop, you must go through the entire data set, which takes time. Further, you can’t analyze data continuously, only in batches.
That’s a real problem for machine learning. Imagine if you had to go through an entire day and only be able to process experiences when you got home. Without the ability to continuously process —in effect to turn anecdotes into data to be analyzed—you couldn’t react to changes in your immediate environment. Every decision would be a day late.
Spark, an open source technology created at UC Berkeley in 2009, effectively solves that problem. It can pull relevant data in from Hadoop, or any other source, and analyze it continuously, updating insights in real time. That makes it a boon for machine learning. Just like a human, Spark allows systems to see the world as it happens and update analysis.
That’s why Rob Thomas, a Vice President at IBM and author of Big Data Revolution, calls Spark an operating system for data. Much like Microsoft’s operating system for the PC, it allows machine learning systems and analytical engines to pull resources from anywhere in the system and deliver those resources to applications.
Put another way, the data revolution is like the PC revolution all over again. Except this time, instead of Microsoft, we basically have the equivalent of Linux—an open source, rather than a proprietary, operating system.
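The batch-versus-streaming distinction is easy to see in miniature. This sketch is plain Python rather than Spark itself, and the `StreamingMean` class is a hypothetical name: a batch job must rescan the whole data set to refresh an answer, while a streaming update folds each new event in as it arrives.

```python
def batch_mean(events):
    # The batch model: re-read everything to refresh the answer,
    # so every insight is "a day late".
    return sum(events) / len(events)

class StreamingMean:
    # The streaming model: constant work per event, answer always current.
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, value):
        # Fold one new event into the running insight.
        self.count += 1
        self.total += value
        return self.total / self.count

stream = StreamingMean()
for event in [3.0, 5.0, 10.0]:
    latest = stream.update(event)

print(latest)                         # 6.0
print(batch_mean([3.0, 5.0, 10.0]))  # 6.0
```

Both arrive at the same answer; the difference is that the streaming version had a usable (if provisional) answer after every single event, which is what machine learning systems reacting to their environment actually need.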
biggest problems – feb 2016
In 1935, the year Social Security was enacted, life expectancy in the US was little more than 60 years. Today, it stands closer to 80. Global trends are similar. Even in Africa, life expectancy has gone from 36 in 1950 to 58 today. That of course, is a very good thing, but it is also creating a new set of challenges quite unlike anything we’ve faced before.
First, the obvious. Longer lifespans contribute to population growth, which causes strains on the environment and increases the risk of climate change. The UN predicts global population will top 9 billion people by 2050, which may well exceed the planet’s carrying capacity. So we’re going to have to markedly reduce the environmental impact of each person.
cool off the chip..
using waste to cool…
Another impact is on healthcare. A generation ago, the medical profession was focused on acute events, like heart attacks and car accidents. Yet today, an increasing share of healthcare spending goes toward the more chronic conditions associated with an older population, such as cancer, diabetes and Alzheimer’s, which are far more costly to treat.
Finally, increasing lifespans will cause the worker to retiree ratio to plummet. The McKinsey Global Institute warns that falling birth rates combined with increasing lifespans will result in a demographic deficit, limiting our ability to produce capital for investment to solve future problems we will face.
Even taking terrorism out of the equation, the fact is that over the next few decades we are going to see literally billions of people enter civil society who, through the Internet, will be politically aware and far more demanding than their predecessors. We’re going to have to find ways to make room for them and their demands.
These challenges are not in any way insurmountable. Surely, we’ve overcome much greater ones in the past. Yet they are fundamentally different in nature and will require vastly different approaches than before. We can’t solve them with greater prosperity, technology or education because those are, in large degree, the underlying causes.
As Rishad Tobaccowala has put it, the future does not fit in the containers of the past. The challenges the next generation will face are unlike anything we’ve ever had to deal with before. We will need to come up with new approaches that are not only innovative in their conception, but collective in their execution.
In other words, the most profound problems we face today will not be solved through economics or even by technological advancement, but through the political process—both globally and domestically. Unfortunately we haven’t even really begun.
dator ridiculous law et al
feb 2016 – on genius of recognizing genius
All too often, we miss out on opportunities not through a lack of intelligence, but a lack of imagination.
We are simply focused on doing other things and, when presented with an idea that doesn’t fit with the particular problem on our mind at the time, we fail to take note. The truth is, seemingly useless things can turn out to be useful indeed.
mar 2016 – to adapt you need to evolve
When scientists decoded the human genome in 2001, they found something astounding. While our DNA provides the blueprint for everything about us—from how we develop in the womb to eye color and personality traits—it takes only 20,000 genes to do so, less than one fifth of what had previously been thought.
What was even more mindblowing was the reason that they had been so off the mark. While our genome would seem to be the model of efficiency, squeezing all that information into a microscopic nucleus, 98% of our DNA is “junk” that doesn’t code for anything. How could our biology be so wasteful?
In The Selfish Gene, the eminent biologist Richard Dawkins explains that the confusion arises because we assume that DNA exists for our sakes rather than the other way around. We, he argues, are mere vehicles to propagate genes.
Unfortunately, where McColough saw recombinant DNA that could help his enterprise adapt, the rest of the organization saw an antigen—a foreign species that threatened the biology of the organism—and rejected it.
We tend to think of ourselves as one homogenous unit with DNA at the center, but that’s not really true. In fact, our bodies house ten times as many bacterial cells as human ones. Scientists have very little idea what all those little guys are doing, but clearly a lot of it is essential to keep our bodies functioning well.
We call non-coding DNA “junk” because it doesn’t contribute to our body’s function. Some of it plays a regulatory role, but most of it just sits there, surviving because of the work of its neighbors. When cells prosper and grow, they divide, propagating the essential genes along with all of the other random code.
Yet scientists are beginning to suspect that our junk DNA performs an evolutionary function. All that raw material sitting in our chromosomes is, in effect, a genome in waiting, ready to help us adapt to changes in our environment. Sure, it doesn’t do much for us now, but our junk DNA is just a few random mutations away from becoming something important.
perhaps entropy ness as protection program
we need to listen to what it is telling us.
“The next big thing will start out looking like a toy,” writes Chris Dixon. In other words, it’ll look like junk.
march 2016 – science behind political correctness
Political correctness, all too often, is in the eye of the beholder. One person’s empathy is another’s insensitivity, or so it would seem. But whatever your opinion of the merits and demerits of political correctness, it is, at least in part, a technological phenomenon. So perhaps instead of the usual vicious cycle of recriminations, we should look for deeper roots.
or.. perhaps just spend our energy living deeper roots.. ie: a and a.. no other rules..
Clearly, political correctness is a product of that cultural *collision. Arguments that cannot be decided by rational debate—and cultural beliefs never are—can only be won by cultural dominance. What makes the stakes even higher is that, as we will see, majorities don’t merely legislate, they also influence, making activism around cultural issues a dire political necessity.
unless.. we free ourselves up to change all that. ie: no more need to collide..
Clearly, we are heavily influenced by our immediate surroundings, but more recent research has found that the effect extends to three degrees of social distance. [..] Asch also showed that including one or two dissenters in the group drastically brought down the pressure to conform. So the practical implications of political correctness are immense. If what we see as moral and appropriate can be affected by the views prevalent in our surroundings, silencing even minor dissent becomes hugely important.
So to propagate any further, it is absolutely essential to silence dissent.
So the political correctness game must be played subtly. Dissent must be discouraged, but not bludgeoned in such a way that perpetrators appear to become victims.
3 Paradigm Shifts That Will Drive How We Compete In The 21st Century – digitaltonto.com/2016/3-paradig…
great post/insight… but perhaps missing ginormously small element.. what if the drive ness comes from disengaging from competition..
@digitaltonto
These 4 Major Paradigm Shifts Will Transform The Future Of Technology – digitaltonto.com/2016/these-4-m…
1\ chip to system
Yet Moore’s law is now nearing its end. The problem is twofold. First, there are only so many transistors you can squeeze onto a chip before quantum effects cause them to malfunction. Second is the problem known as the von Neumann bottleneck: simply put, it doesn’t matter how fast chips can process if they need to wait too long to communicate with each other.
So we have to shift our approach from the chip to the system. One approach, called 3D stacking, would simply combine integrated circuits into a single three dimensional chip. This is harder than it sounds, because entirely new chip designs have to be devised, but it could increase speeds significantly and allow progress to continue.[..]
2\ applications to architectures[..]Soon, when we choose to use a specific application, our devices will automatically be switched to the architecture—often, but not always, made available through the cloud—that can run it best.[..]
3\ products to platforms[..]Platforms are important because they allow us to access ecosystems. Amazon’s platform connects ecosystems of retailers to ecosystems of consumers. The App Store connects ecosystems of developers to ecosystems of end users. IBM has learned to embrace open technology platforms, because they give it access to capabilities far beyond its own engineers.[..]
4\ bits to atoms[..]When progress is powered by chip performance and the increased capabilities of computer software, we tend to judge the future by those same standards. What we often miss is that paradigms shift and the challenges—and opportunities—of the future are likely to be vastly different.
@DigitaltontoBitcoin May Not Survive, But The Technology Behind It Will Live On – digitaltonto.com/2016/bitcoin-m…
One way to solve the Byzantine Generals Problem is by establishing a trusted third party, which is the role that governments and other institutions traditionally play in financial transactions. Yet a third party is not a true solution, because there’s always the possibility that the third party can be corrupted as well. In effect, it merely assumes away the problem.
Blockchains solve the trust problem by creating a distributed ledger, almost as if the generals could constantly refer to an encrypted Reddit post that they all had access to. What makes the technology so exciting is that there is a wide variety of areas beyond financial transactions where trust is important.
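The core mechanism behind a distributed ledger is simpler than it sounds: each entry is hashed together with the hash of the entry before it, so tampering with any past record invalidates everything that follows. Here is a minimal Python sketch of just that tamper-evident chaining idea (no consensus, mining or networking; all names and sample data are illustrative):

```python
import hashlib
import json

def block_hash(record, prev_hash):
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Link each record to the one before it via its hash."""
    chain, prev = [], "0" * 64  # genesis value
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain):
    """Tampering with any earlier record breaks every later hash."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(block["record"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

ledger = build_chain([{"from": "A", "to": "B", "amount": 10},
                      {"from": "B", "to": "C", "amount": 4}])
assert verify_chain(ledger)            # everyone sees a consistent ledger
ledger[0]["record"]["amount"] = 9999   # one "general" tries to cheat
assert not verify_chain(ledger)        # the tampering is immediately visible
```

This is why a trusted third party isn't needed: anyone holding a copy of the chain can detect a forged history on their own.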
so first para… we go ginormous small and trust each individual… in the moment…
but huge key.. that many miss/fear… is that we are trusting humanity.. ie: are you human..?.. ok.. i trust you… not ie: here’s the measurement/validation of that transaction… trust that
perhaps our hearts were made to trust people.. but not so much to trust measurement… that’s man-made… and very subjective.
also.. transaction cant tell us their story.. even on ledgered blockchain… because its algo…
but.. people can tell us their story.. (if we care enough to listen… otherwise… no grounds to judge.. so save energy and just carry on.. trusting that people are good)
i know you ness.. is about people… as is.. enough… not about measuring transactions
Traditionally, one major way that we have legislated trust in our society is through contracts, which codify obligations, penalties and the jurisdiction whose laws will enforce the agreement. These can be incredibly cumbersome documents, often running to hundreds of pages.
Consider the case of a building project, in which a general contractor must sign agreements with hundreds of subcontractors. To enforce these contracts, work must be inspected and if it passes muster, it goes to an accounts payable department, which authorizes payment and instructs a bank to wire the proper amount. The process usually takes at least a few weeks.
However, a smart contract powered by a blockchain can streamline the process through automation. Using a simple tablet computer, an inspector can instantly activate the smart contract, which has all of the provisions of the agreement embedded within it, to arrange settlement, including payment.
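The inspection-triggers-payment flow can be sketched as plain code. This is only a toy illustration of the idea, not how real smart-contract platforms work; the `SmartContract` class, its fields and the subcontractor name are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SmartContract:
    subcontractor: str
    amount: int        # payment owed once work passes inspection
    paid: bool = False

    def settle(self, inspection_passed):
        """Release payment automatically when the embedded condition is met."""
        if self.paid:
            return "already settled"
        if not inspection_passed:
            return "payment withheld: inspection failed"
        self.paid = True
        return f"paid {self.amount} to {self.subcontractor}"

contract = SmartContract("Acme Plumbing", 25_000)
print(contract.settle(inspection_passed=False))  # payment withheld: inspection failed
print(contract.settle(inspection_passed=True))   # paid 25000 to Acme Plumbing
```

The point is that the condition and the settlement live in one place, so no accounts payable department or bank instruction sits between inspection and payment.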
*share – @jhagel: Just the beginning – global law firm announced it has hired a robot lawyer to work in the firm’s bankruptcy practice – for.tn/1Wwrywt
and this ..
*share – Racist AI putting people in prison. Superb reporting by @JuliaAngwin @ProPublica https://t.co/qSj694uFKM
Original Tweet: https://twitter.com/trevorpaglen/status/735150486865608704
the tech we have means.. time to let go… and leap…
This is just one example, but the possibilities are endless. Another exciting area is rights management for intellectual property. By integrating settlement, enforcement and other aspects of agreements with smart contracts, we can not only create enormous efficiencies, but also make our lives a lot easier.
Blockchain technology has the potential to bring the hiring process into the 21st century by embedding authentication within the resume itself.
Blockchains can greatly improve on passwords because they represent an entire digital identity. So a hacker would have to create an entire history of legitimate activity in order to gain authorization for access. That wouldn’t be impossible, but it would be much harder to do and a clean blockchain would become tainted after a single intrusion.
As Alex Tapscott has put it, The Internet of Things needs a Ledger of Things.
As it turned out, Hardy was one of the only people on the planet who could. It took him some time to go over the papers, and even more time to get over his utter disbelief, but by the end of the day he recognized that Ramanujan, despite his obscurity and complete lack of credentials, was one of the greatest mathematical minds who ever lived.
In effect, we all have a lesser version of Ramanujan’s problem in that *we need recognition to get things that we want. Up till now, that’s been done by centralized institutions, such as banks, credit bureaus, and universities, all of which wielded enormous power. The Internet has helped to decentralize many important functions, but in many ways it also eroded trust.
That, in a nutshell, is the **role that blockchains, in the form of a trust protocol, can play in the years to come.
The only real path forward is to define the problems you seek to solve and build your own innovation playbook.
And Tim Cook, Apple’s CEO, has very clear ideas about what it takes to create breakthrough products. “It’s people who care enough to keep thinking about something until they find the simplest way to do it,” he says. “They keep thinking about something until they find the best way to do it.” Sounds like good advice.
He also has very clear ideas about what not to do, such as creating innovation labs, which he thinks is a really bad idea, going as far as to say, “A lot of companies have innovation departments, and this is always a sign that something is wrong when you have a VP of innovation or something. You know, put a for-sale sign on the door,” he says.
That sounds like it makes sense, but then you look at Google and that’s exactly what they’ve done with Google X. Microsoft and IBM also have research divisions and have successfully innovated for decades, across multiple technology cycles. Apple, meanwhile, still relies on the iPhone, launched in 2007, for roughly two thirds of its revenue.
revenue as success measure… any measure for that matter… won’t yield us … a nother go.. everyday… for all… ( which might just be the way)
Henry Ford offered cars in any color “as long as it’s black.” Steve Jobs said that “A lot of times, people don’t know what they want until you show it to them.” Many great innovators are iconoclasts, who forge their own path. After all, how can you create something truly new by asking people their opinions about what exists today?
Yet IBM’s Chief Innovation Officer, Bernard Meyerson, believes customers are an important part of the innovation process. “You’re never certain as to what’s going to be commercially fantastic,” he told me. “That’s why we take an unconstrained approach to research and innovation. We want to know about everything that can help us solve a problem.”
“Our customers can’t tell us about a future that doesn’t exist yet,” he continued. “But they can tell us about unresolved problems and we can get to work on them. Addressing a really grand challenge like Watson can begin 5 or 10 years before the result is seen in public. It was a science project, but with business problems in mind.”
These widely divergent views create a dilemma for anyone looking to innovate. Henry Ford and Steve Jobs built revolutionary new products. IBM has innovated for over a century, consistently creating new businesses to replace old ones that run out of steam. All, in their own way, have been enormously successful. So which should we follow?
again…biggest problem is that our jaded lens is: commercially fantastic.. et al
people do decide 7 bill plus diff ways.. & changing those 7 bill plus ways … ongoingly… if… set 100% free
The tricky thing about disruptive innovations is that they rarely fit into existing business models and so the value they create isn’t immediately clear. Kodak made money by selling film, so was slow to adopt the digital cameras that the company had itself invented. Yahoo’s business was focused on keeping users on its site, so passed on the chance to acquire Google.
It’s not just products that we have to innovate, but business models as well
perhaps today… business model as irrelevant
The premise of the rule is simple. Focus 70% of your resources on improving existing technology (i.e. search), 20% on adjacent markets (i.e. Gmail, Google Drive, etc.) and 10% on completely new markets (i.e. self-driving cars).
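As arithmetic, the rule is just a fixed split of whatever resources are available. A trivial sketch (the function name and category labels here are my own, not Google's):

```python
def split_70_20_10(budget):
    """Allocate resources per the 70/20/10 rule."""
    return {
        "core (existing technology)": round(budget * 0.70, 2),
        "adjacent markets":           round(budget * 0.20, 2),
        "new markets":                round(budget * 0.10, 2),
    }

allocation = split_70_20_10(1_000_000)
# core gets 700000.0, adjacent markets 200000.0, new markets 100000.0
```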
In a networked world, the surest path to success is not acquiring and controlling assets, but widening and deepening connections.
That’s why now collaboration itself is becoming a competitive advantage.
Increasingly, we’re finding that to solve really tough problems, we need to work harder to integrate people with diverse talents.
Take a slightly broader view and it becomes clear that innovation today goes far beyond research labs, Silicon Valley pitch meetings and large corporate initiatives. We all have something to offer and can add to the world’s knowledge in a way that may differ in degree, but not in kind, to the giants of the past.
The truth is that some of the problems we face today are simply too big and complex to be solved by any one organization.[..]valley of death – when there is promise but no proof[..]
Every year, tens of thousands of papers are published in scientific journals and any one of them, potentially, could hold the key to the next big thing. But for a private firm to invest millions of dollars in a new idea, there has to be more to go on.
Another problem is that the research institutions themselves—government labs and research driven universities—have become so large that they’ve become notoriously hard to navigate. At the same time, the marketplace has become so fiercely competitive—and investors so demanding—that few are willing to take a flyer on an unproven technology.
However, the problems we need to solve today are far more complex than before. The journal Nature recently noted that the average scientific paper today has four times as many authors as one did in 1950, and the work they are doing is far more interdisciplinary and done at greater distances than in the past. That greater complexity means that we need to design new approaches to address market failures like the “Valley of Death.”
Many innovative enterprises have learned the value of instilling this type of iterative process across integrated, multidisciplinary teams within their organizations. As it turns out, if we are to solve our biggest and toughest problems, we need to learn how to implement that same level of collaboration across our entire society.
oy.. yes to across entire society.. no thanks to same level of collab..
we can do so much better… if.. we let go..
Greg Satell (@Digitaltonto) tweeted at 5:58 AM – 31 Aug 2016 :
We Need To Switch Our Mental Models From Hierarchies To Networks – https://t.co/YP9txEcheR (http://twitter.com/Digitaltonto/status/770953854502498304?s=17)
Once you start thinking in terms of networks rather than hierarchies, it becomes clear that we must change how we do things; and not just within organizations, but also in how we *approach a competitive marketplace. As it turns out, firms that seek to strengthen industrial networks have a big advantage over those that seek to preserve hierarchies.
*how we approach competitive markets – i’d agree if the approach is to disengage from them..
McChrystal transformed his culture
war is all good..?
not all movements are successful. Consider the case of Occupy Wall Street. While their powerful rhetoric about “the 99% vs. the 1%” gained them attention, they had no larger plan. What’s more, they weren’t funny or endearing, but angry and provocative. Unlike Otpor, they were back home in a few months. Wall Street still thrives.
occupy was bad..?
perhaps not listening deep enough.. to hearts/souls..
which is ironic with closing para:
In The End of Power, Moisés Naím pointed out that “power is easier to get but harder to use or keep.” That is undeniably true, but I also think it misses the point. The greater truth is that in a world connected by digital technology, power no longer lies at the top of hierarchies, but at the center of networks.
there’s so much unseen power (i’d call it energy.. because it’s not binary.. it can’t be or it won’t work.. a nonrivalrous ness.. nothing to do with wars or getting to the top or money et al) at the center of networks (via occupy ness)
Innovation Needs To Shift From Disrupting Markets To Tackling Grand Challenges – digitaltonto.com/2016/innovatio…
We need fundamental new discoveries to make it all work.
a nother way.. where 7 bill are working/dancing.. on all of it at once..
We need to develop fundamentally new computing architectures
Still, while these show enormous promise, they are far from ready for commercial applications. First, while workable neuromorphic chips exist, quantum computers are still under development, albeit at a fairly advanced stage. Second, because these are fundamentally new architectures, nobody really knows how to design applications for them.
heading: collab is new competitive advantage
competition is key to what’s blocking us from the change our souls crave..
aka: binary ness is killing us
Agility has been the mantra for the digital age. Yet the “iterate, adapt and pivot” model will only take us so far. It’s great for progressing within well known paradigms, but absolutely useless for making the fundamental new discoveries that, as Vannevar Bush put it, “turn the wheels of private and public enterprise.”
“We need computer scientists working with cancer scientists, with climate scientists and with experts in many other fields to tackle grand challenges and make large impacts on the world.”
Rather than looking for markets to disrupt, we need to look for human endeavors that we can empower.
reimagine the realm of the possible.
Greg Satell (@Digitaltonto) tweeted at 5:45 AM – 17 Oct 2016 :
4 Things Managers Need To Know About Data – https://t.co/l1vWlG3Er9 (http://twitter.com/Digitaltonto/status/787982908212416512?s=17)
[..]an Excel spreadsheet. That’s why Tambay emphasizes the need to break data down into factors that are mutually exclusive and collectively exhaustive (MECE), so that root causes can be identified and dealt with.
A more noteworthy example occurred in 2010, when two Harvard economists, Carmen Reinhart and Kenneth Rogoff, published a working paper that warned that US debt was approaching a critical level. Their work greatly influenced the political debate around the federal budget but, as it turned out, they had made a simple Excel error and their fears were found to be baseless.
These are not isolated examples. In fact, data errors have led to what scientists are calling a replication crisis, in which many scientific papers are later found to be invalid. Tambay suggests that managers apply a simple “sanity test” to see if the data make sense. For example, in the case of Reinhart and Rogoff, it was clear that many countries ran high debt levels with little or no adverse effects.
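A MECE breakdown can be checked mechanically: no item may fall into two buckets, and together the buckets must cover everything. A small Python sketch of that check (the helper name and sample order data are invented for illustration):

```python
def is_mece(categories, universe):
    """Mutually Exclusive: no item falls in two buckets.
    Collectively Exhaustive: the buckets cover every item."""
    groups = list(categories.values())
    exclusive = all(groups[i].isdisjoint(groups[j])
                    for i in range(len(groups))
                    for j in range(i + 1, len(groups)))
    exhaustive = set().union(*groups) == universe
    return exclusive and exhaustive

orders = {"O1", "O2", "O3", "O4"}
clean = {"web": {"O1", "O2"}, "retail": {"O3", "O4"}}
assert is_mece(clean, orders)       # a valid breakdown by channel
messy = {"web": {"O1", "O2"}, "retail": {"O2", "O3"}}
assert not is_mece(messy, orders)   # O2 is double-counted and O4 is missing
```

A failed check is exactly the kind of “sanity test” Tambay describes: double-counted or missing rows surface before anyone builds a story on top of them.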
on following the papers (why call them working papers..?) .. that got funded..so.. they couldn’t change their minds… only copy a copy a copy….RNA ness
“We teach our students to tell stories with data because that’s what’s most likely to affect decision making,” Tambay told me. “And that gets to the core of what we want our students to achieve — *enable better business decisions through telling compelling data stories.”
let’s go deeper
nov 20 2016 post
teach algos right/wrong
Every parent worries about what influences their children are exposed to. What TV shows are they watching? What video games are they playing? Are they hanging out with the wrong crowd? We try not to overly shelter our kids, because we want them to learn about the world, but don’t want to expose them to too much before they have the maturity to process it.
yet we force them to go to school.. 12+ yrs…which for many… most unsafe place… beyond bullying (all ages.. via all levels of institution)… rips most of their curiosity/shell
Francesca Rossi, a researcher at IBM, points out that we often encode principles regarding influences into societal norms, such as at what age a child can watch an R-rated movie or whether they should learn evolution in school. “We need to decide whether, and to what extent, the legal principles that we use to regulate humans can be used for machines,” she told me.
or perhaps that they’re not good for either
As pervasive as artificial intelligence is set to become in the near future, the responsibility rests with society as a whole. Put simply, we need to treat the standards by which artificial intelligences will operate just as seriously as those that govern our legal systems and how we educate our children.
It is a responsibility that we cannot shirk.
Greg Satell (@Digitaltonto) tweeted at 4:24 AM – 20 Nov 2016 :
Platforms Are Transforming How We Need To Compete – https://t.co/9u6x5ZpQlZ (http://twitter.com/Digitaltonto/status/800298776988622849?s=17)
oh holy my
Today, Innocentive has over 100,000 solvers who work on hundreds of problems so tough that even the smartest companies can’t crack them on their own.
Greg Satell (@Digitaltonto) tweeted at 5:44 AM – 26 Nov 2016 :
Now, Anyone Who Wants Can Access The World’s Most Advanced Technology – https://t.co/HNWYX9PHK4 (http://twitter.com/Digitaltonto/status/802493285075808256?s=17)
[..]In fact, it’s so user friendly that people like marketing managers can actively participate in developing applications. Perhaps not surprisingly, demand is hot and the company is[..]
this is not really collab
and this is why we still have to use words like.. competition
@Digitaltonto: My TED Talk about how to start a revolution that actually succeeds is finally up! youtube.com/watch?v=IOt1dL…
planning, organizing and discipline is what sets successful movements apart from unsuccessful ones
successful movements inspire.. they bring people in.. because power will not fall simply because you oppose it.. but it crumbles if you bring those who support it over to your side
Greg Satell (@Digitaltonto) tweeted at 5:38 AM – 5 May 2018 :
It’s Not Just Facebook and Cambridge Analytica, The Internet Is Broken, But We Can Fix It – https://t.co/JKJlb1Sko0 (http://twitter.com/Digitaltonto/status/992730362139873280?s=17)
It was Codd’s innovation that helped power the data economy as we know it today. Data could be stored centrally, but used remotely by whoever was given access to it. This gave it a value independent of the purpose for which it was originally stored, because *query languages could be **used to establish relationships that weren’t initially obvious or planned for..t
It also led to the problems with data security we have today.. t
Distributed computing requires distributed security. .what’s really needed is to truly secure our data infrastructure. .t
Josh Sutton: “Data is unique as an asset in that its *power to create value depends on how it can be *combined with other data. Once we can do that ***securely, we make data a far more liquid asset that will create far more value for everyone.”
Greg Satell (@Digitaltonto) tweeted at 6:18 AM – 26 Dec 2018 :
If you’re looking for some holiday reading, McGraw-Hill recently released an excerpt (Preface + Introduction) for my upcoming book, Cascades: How to Create a Movement that Drives Transformational Change.
robert sutton: his simple formula—small groups, loosely connected, and united by a common purpose—
helen bevan: We are joining the ranks of Gandhi and the salt march, votes for women, and the same-sex marriage movement in finding a common cause that a diverse group of people can unite