intro’d to Aaron via Thomas.
here he is in 2013 intro’ing singularity u (he starts at 6 min):
turn a domain into an information-based environment..
leverage info in powerful ways..
*this way might be that it’s a way to get 7 bill involved.. right away
turning dna into a programming language
38 min – black swan
Aaron’s postings at singularity hub. one on eyewire
Steven Kotler (@steven_kotler) tweeted at 5:12 AM – 27 Dec 2017 :
My friend @aarondfrank with a great piece on VR and the future of education—this is the revolution we’ve been waiting for! https://t.co/hhdqOTQRE1 (http://twitter.com/steven_kotler/status/945990767969034240?s=17)
With VR, users can learn by doing. And that’s a big deal. Learning in this way may be far more effective than anything else out there.
When I spoke with the project’s lead, Kyle Nel, executive director of Lowe’s Innovation Labs, he told me customers who learned in VR instead of watching YouTube videos saw an almost 40% increase in their ability to recall the correct steps in the process.
As more VR companies like Labster and Tribe experiment in this area, expect VR to unlock the human capacity for learning by doing..t
Machines Teaching Each Other Could Be the Biggest Exponential Trend in AI by Aaron jan 2018
What we might call “machine teaching”—when devices communicate gained knowledge to one another—is a radical step up in the speed at which these systems improve. https://t.co/mWAqCjiyhw
Original Tweet: https://twitter.com/singularityhub/status/955155287824060416
“I think that this (self driving cars teaching themselves) is perhaps the biggest exponential trend in AI,” said Hod Lipson, ..t.. professor of mechanical engineering and data science at Columbia University, in a recent interview.
trend maybe.. but change we need..?
Lipson believes this way of developing AI is a big deal, in part, because it can bypass the need for training data..t
imagine a mech that can bypass our (assumed) need for training humans
“Data is the fuel of machine learning, but even for machines, some data is hard to get—it may be risky, slow, rare, or expensive. In those cases, machines can share experiences or create synthetic experiences for each other to augment or replace data. It turns out that this is not a minor effect, it actually is self-amplifying, and therefore exponential.”
sounds like automating ineq..
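the sharing Lipson describes can be sketched in a few lines.. a toy sketch (all names hypothetical, not from any real system): each agent gathers a little experience on its own, then everyone pools it, so each one ends up learning from the whole fleet’s data instead of just its own.

```python
# illustrative sketch of "machine teaching" as experience pooling;
# Agent, explore, and share are made-up names for this example only.

class Agent:
    def __init__(self, name):
        self.name = name
        self.experience = []          # what this agent saw itself

    def explore(self, steps):
        # stand-in for real interaction with an environment
        self.experience.extend((self.name, i) for i in range(steps))

def share(agents):
    # merge every agent's experience into one pool,
    # then hand the whole pool back to each agent
    pool = [e for a in agents for e in a.experience]
    for a in agents:
        a.experience = list(pool)
    return pool

agents = [Agent(f"car{i}") for i in range(100)]
for a in agents:
    a.explore(steps=10)               # 10 experiences each, alone

pool = share(agents)
print(len(agents[0].experience))      # 1000: 100x what one agent gathered alone
```

the self-amplifying part is just that one round of sharing multiplies every agent’s effective data by the size of the fleet.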
Lipson sees the recent breakthrough from Google’s DeepMind, a project called AlphaGo Zero, as a stunning example of an AI learning without training data.
imagine a breakthru.. a nother way.. as a stunning ie of people learning/living w/o training
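the "no training data" idea can be shown on a toy game.. a sketch of learning purely from self-play (nothing like AlphaGo Zero’s actual deep-network + tree-search method; the game, the value table, and the update rule here are all illustrative): an agent plays take-1-or-2 Nim against itself and learns which positions are losing, with no examples given to it.

```python
import random

# toy self-play: learn a value per position purely from games against
# yourself; no external data. in this game (take 1 or 2 stones, taker
# of the last stone wins) positions with stones % 3 == 0 are losing
# for the player to move -- the learner should discover that.

random.seed(0)
N = 10
value = {s: 0.0 for s in range(N + 1)}   # learned value of each position (for the mover)

def best_move(stones, eps=0.2):
    moves = [m for m in (1, 2) if m <= stones]
    if random.random() < eps:
        return random.choice(moves)      # explore sometimes
    # a move is good for us if the position it leaves is bad for the opponent
    return min(moves, key=lambda m: value[stones - m])

for game in range(5000):                 # both sides are the same learner
    stones, history = N, []
    while stones > 0:
        history.append(stones)
        stones -= best_move(stones)
    # the last mover took the last stone and won; walk the game
    # backwards, crediting alternating winners and losers
    reward = 1.0
    for s in reversed(history):
        value[s] += 0.1 * (reward - value[s])
        reward = -reward

print(sorted(s for s in range(1, N + 1) if value[s] < 0))
```

after a few thousand self-played games, position 1 (immediate win) scores high and position 3 (a known losing spot) scores negative, with no one ever telling the agent the rule.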
Where it may take one driverless car significant time to learn to navigate a particular city—one hundred driverless cars navigating that same city together, all sharing what they learn..t..—can improve their algorithms in far less time.
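the hundred-cars idea can be sketched as a federated-averaging-style step (a minimal sketch under that assumption; no real carmaker’s system, and every name here is illustrative): each car improves its own copy of a shared model from its own driving, then the fleet averages the copies so every car starts from the combined learning.

```python
# hedged sketch of fleet learning via parameter averaging;
# local_update and fleet_average are made-up names for this example.

def local_update(weights, local_gradient, lr=0.1):
    # each car nudges its own copy of the model from its own driving
    return [w - lr * g for w, g in zip(weights, local_gradient)]

def fleet_average(all_weights):
    # the fleet shares what it learned by averaging parameters
    n = len(all_weights)
    return [sum(ws) / n for ws in zip(*all_weights)]

shared = [0.0, 0.0]                       # the model every car starts from
fleet = []
for car in range(100):
    # stand-in gradient: each car sees slightly different roads
    grad = [0.5 + 0.01 * car, -0.2]
    fleet.append(local_update(shared, grad))

shared = fleet_average(fleet)
print(shared)   # every car now benefits from all 100 cars' driving at once
```

one averaging round gives each car the effect of everyone’s experience, which is the "far less time" claim in miniature.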
at singularity u: