Azeem Azhar and Gerd Leonhard had a good chat last week. It was a wide-ranging conversation on AI and its potential dangers to humanity – so I thought I would share a few notes on the talk. You can listen to the full podcast here.
The discussion was not about the usual killer robots and singularity stuff, but rather about a couple of existential questions on a more philosophical plane: what should we worry about in the next five, ten, fifteen years? And what should we be rightfully excited about? What could be the less good or potentially detrimental impact on people’s daily lives, if and when AI becomes the core layer of, or interface to, everything we do?
In Gerd’s view, AI is neither artificial (all tech is ‘artificial’) nor intelligent (in the human sense of that word), as there are many different types of intelligence: computational / intellectual intelligence, emotional intelligence, social intelligence and kinaesthetic intelligence, just to name a few (see pic below). AI should be a tool, not a purpose – yet tools always shape how humans think, create and express themselves, from Renaissance painters using paint and wooden panels to AI in mental health and therapeutic contexts.
Clearly, there is a growing danger of reductionism and abdication in all this, and of course these issues also mirror the reductionist priorities of our economic system today: efficiency, productivity, and profit RULE. Dinner first, then morals.
In an overly capitalist society driven by technological gains, efficiency will clearly become more important than humanity.
If we want to avoid the dystopian digital scenarios, we need to address this root cause. We have to start thinking about the coming post-capitalist era, where we use technology not just for efficiency but for re-invention, and where we move from a reductionist focus on efficiency to one on human agency, relationships and experiences.
We realize that emotional resonance in relationships is often ritualistic and inefficient, and that it takes time. It’s like putting the presents under the Christmas tree: for sure there must be a more efficient way to distribute presents, but do we want that?
Marvin Minsky is quoted as saying: “What is very hard for a machine is easy for a human, and what is very hard for a human is easy for a machine.” Efficiency is for robots. We are not robots. We want awesome humans on top of amazing technology, not one replacing the other.
Guest post by The Futures Agency content curator Petervan
- Gerd and Azeem now can be booked as a duo package
- The book “Technology vs Humanity”, now available in 10+ languages
- Exponential View: newsletter, podcast season http://www.exponentialview.co/