The merging of humans and machines is happening now
Her organisation invented the internet. It gave us the self-driving car. And now DARPA’s former boss sees us crossing a new technological boundary
By ARATI PRABHAKAR
Jan 28 2017
http://www.wired.co.uk/article/darpa-arati-prabhakar-humans-machines

The merging of machine capability and human consciousness is already happening. Writing exclusively for WIRED, DARPA director Arati Prabhakar outlines the potential rewards – and the risks – we face in the future

Peter Sorger and Ben Gyori are brainstorming with a computer in a laboratory at Harvard Medical School. Their goal is to figure out why a powerful melanoma drug stops helping patients after a few months. But if their approach to human-computer collaboration succeeds, it could yield a fundamentally new way of understanding complex systems, one that may change not only how cancer patients are treated, but also how innovation and discovery are pursued in countless other domains.

At the heart of their challenge is the crazily complicated hairball of activity going on inside a cancer cell – or in any cell. Untold thousands of interacting biochemical processes, constantly morphing, depending on which genes are most active and what’s going on around them. Sorger and Gyori know from studies of cells taken from treated patients that the melanoma drug’s loss of efficacy over time correlates with increased activity of two genes. But with so many factors directly or indirectly affecting those genes, and only a relatively crude model of those global interactions available, it’s impossible to determine which actors in the cell they might want to target with additional drugs.

That’s where the team’s novel computer system comes in. All Sorger and Gyori have to do is type in a new idea they have about the interactions among three proteins, based on a mix of clinical evidence, their deep scientific expertise, and good old human intuition. The system instantly considers the team’s thinking and generates hundreds of new differential equations, enriching and improving its previous analytical model of the myriad activities inside drug-treated cells. And then it spits out new results.
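To make that step a little more concrete, here is a minimal, purely illustrative Python sketch of the underlying idea: a stated interaction between proteins is compiled into mass-action differential equations and simulated, so that a new hypothesis immediately becomes a set of testable predictions. The protein names, rate constants and starting levels below are assumptions for illustration only; they are not taken from the Harvard system or the melanoma study.

# A toy "what if": suppose protein A activates protein B, and B in turn
# inhibits protein C. Compile that statement into mass-action ODEs.
from scipy.integrate import solve_ivp

def model(t, y, k_act, k_inh, k_decay):
    A, B, C = y
    dA = -k_decay * A              # A is slowly removed from the cell
    dB = k_act * A - k_decay * B   # A drives production of B
    dC = -k_inh * B * C            # B degrades (inhibits) C
    return [dA, dB, dC]

# Illustrative starting levels and rate constants (hypothetical values).
y0 = [1.0, 0.1, 1.0]
rates = (0.3, 0.5, 0.05)

sol = solve_ivp(model, (0.0, 50.0), y0, args=rates, t_eval=[0, 10, 25, 50])

# The simulated trajectory of C is what would be compared against
# measurements from drug-treated cells; a mismatch prompts the next "what if".
for t, c in zip(sol.t, sol.y[2]):
    print(f"t={t:5.1f}  C={c:.3f}")

The real system iterates over hundreds of such equations at once; the point of the sketch is only to show how a single stated interaction becomes something a computer can solve and hold up against data.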

These results don’t predict all the relevant observations from tumour cells, but they give the researchers another idea involving two more proteins – which they shoot back on their keyboard. The computer churns and responds with a new round of analysis, producing a model that, it turns out, predicts exactly what happens in patients and offers new clues about how to prevent some cases of melanoma recurrence.

In a sense, Sorger and Gyori do what scientists have done for centuries with one another: engage in ideation and a series of what-ifs. But in this case, their intellectual partner is a machine that builds, stores, computes and iterates on all those hundreds of equations and connections.

The combination of insights from the researchers and their computer creates a model that does not simply document correlations – “When you see more of this, you’ll likely see more of that” – but rather starts to unveil the all-important middle steps and linkages of cause and effect, the how and why of molecular interactions, instead of just the what. In doing so, they make a jump from big data to deep understanding.

More than 3,220km away, another kind of human-machine collaboration unfolds at the University of Utah as Greg Clark asks Doug Fleenor to reach out and touch the image of a wooden door on a computer monitor.

Clark knows that Fleenor cannot physically touch this or any other object; Fleenor lost both his hands in a near-fatal electrical accident 25 years ago. But Fleenor’s arm has a chip in it that communicates with the computer, so when he moves his arm the image of a hand on the monitor also moves. He’s done this before – raising his arm, watching the cartoon hand move in sync and seemingly stroke the face of the door – but this time it’s different. He lurches back and gasps. “That is so cool!” he blurts.

What’s so cool is that as he guides his virtual hand across that virtual plank, he literally, biologically and neurologically, feels its wooden surface. Thanks to some new software and an array of fine electrical connections between another embedded chip and the nerves running up his arm to his brain, he experiences a synthesised sensation of touch and texture indistinguishable from a tactile event.

For someone who hasn’t actually touched anything with his hands for a quarter of a century, this is a transcendent moment – one that points to a remarkable future that is now becoming real… and in Fleenor’s case, even tangible.

In ways as diverse as the shared understanding of causal complexity in Peter Sorger’s lab and the seamless commingling of software and wetware in Greg Clark’s lab, it’s a future in which humans and machines will not just work side by side, but will interact and collaborate with such a degree of intimacy that the distinction between us and them will become almost imperceptible.

[snip]
