How AI could be used to make life-and-death decisions
In the 2000s, an algorithm was developed in the US to identify recipients for donor kidneys. But some people were unhappy with how the algorithm was designed. In 2007, Clive Grawe, a kidney transplant candidate from Los Angeles, told a room full of medical professionals that their algorithm was biased against older people like him. The algorithm had been designed to allocate kidneys in a way that maximized the number of life years saved. Grawe and other patients argued that this favored younger, richer, and whiter patients.
Such bias in algorithms is common. What is less common is for the designers of those algorithms to agree that there is a problem. After years of consulting with ordinary people like Grawe, the designers found a less biased way to maximize the number of life years saved — in part by considering overall health alongside age. One important change is that donors, who are often people who died young, are no longer matched only with recipients of the same age. Some of those kidneys can now go to older recipients if they are otherwise healthy. As with Scribner’s committee, the algorithm still won’t make decisions that everyone agrees with. But the process by which it was developed is harder to fault.
“I don’t want to sit there and inject drugs. If you want it, you press the button.”
Nitschke is also asking tough questions.
A former doctor who burned his medical license after a years-long legal dispute with the Australian Medical Council, Nitschke is distinguished as the first person to legally administer a voluntary lethal injection to another person. During the nine months between July 1996, when Australia’s Northern Territory enacted a law legalizing euthanasia, and March 1997, when Australia’s federal government overturned it, Nitschke helped four of his own patients end their lives.
The first, a 66-year-old carpenter named Bob Dent, who had lived with prostate cancer for five years, explained his decision in an open letter: “If I were to keep a pet animal in the same condition I am in, I would be prosecuted.”
Nitschke wanted to support his patients’ decisions. Even so, he was uncomfortable with the role they were asking him to play. So he created a machine to take his place. “I didn’t want to sit there and inject drugs,” he said. “If you want it, you press the button.”
The machine wasn’t much to look at: it was essentially a laptop computer hooked up to a syringe. But it achieved its purpose. The Sarco is an iteration of that original device, which was later acquired by the Science Museum in London. Nitschke hopes that an algorithm capable of performing a psychiatric assessment will be the next step.
But those hopes may well go up in smoke. Creating a program that can assess someone’s mental health is an unsolved problem — and a controversial one. As Nitschke himself notes, doctors disagree on what it means for a person of sound mind to choose death. “You can get a dozen different answers from a dozen different psychiatrists,” he said. In other words, there is no common ground on which such an algorithm could be built.