In the second half of the 20th century, we saw computers as powerful machines for doing calculations. They were faster and more accurate than humans, and could handle more variables and solve more complex problems.
A short half-century later, we are on the threshold of co-evolution of humans with digital machines. We are not quite codependent -- for a while longer we'll still be able to live without machines, and the machines will still largely depend on us. But we are heading for a world where each relies on the other to stay alive.
Watching my grandchildren play with handheld digital devices before they learned to talk helped me understand that learning itself was changing. When I was growing up, we had to learn "material" and master "fields." Now, the critical skill is "finding" the information you need and "navigating" fields of activity. Today's key skill set -- and what will set people apart -- is knowing how to "search" and "find" information, which matters more than mastering content that seems to grow in volume every second.
I read the MIT Technology Review regularly because it helps me understand modern technology and where it's going. One development described in the latest issue is customized, 3D-printed heart implants -- such as pacemakers or defibrillators -- that are computer-designed to work with your specific heart. Another development is genome editing: the ability to target specific genes with the aim of introducing or withholding a specific genetic capacity. (This was successfully done last year with monkeys.)
A host of startups are working on wearable sensors that can monitor your health and alert you when action is required to protect it. And scientists at Boston Dynamics, a Google-owned company, have created robots that can recover their balance rapidly and thus have the agility to walk or run across uneven terrain, bringing us closer to having a "robot army." But before we get there, we will probably get the "robot home health aide," which will dispense medicine, prevent an elderly person from falling, and accompany people outside the home.
My last example is a robot named Pioneer, built by Qualcomm in San Diego, that operates on neuromorphic chips. These allow the robot to learn immediately from someone else's gestures or instructions, rather than requiring comprehensive programming. Pioneer can recognize and deal with people or objects it hasn't seen before because of their resemblance to what it has seen previously. That's getting pretty similar to the human nervous system -- which is why, I suppose, the chips are called neuromorphic.
The trends are clear. We live in a time when we are heavily dependent on computers for the security of our energy systems; for tracking our finances; and for managing much of our medical care, from preparing treatment plans to custom-designing drugs. More and more we will rely on computers to design and monitor complex societal tasks: detecting and identifying disease, designing military weapons, and fighting terrorism, including protecting us from other computers.
Today, robots and computers need us to design, update and maintain them -- but gradually we will see supercomputers and superrobots that can share in designing and caring for their successors. And then we will need them to help design more advanced generations of themselves.