One evening last November, a fifty-four-year-old woman from the Bronx arrived at the emergency room at Columbia University’s medical center with a grinding headache. Her vision had become blurry, she told the E.R. doctors, and her left hand felt numb and weak. The doctors examined her and ordered a CT scan of her head.
A few months later, on a morning this January, a team of four radiologists-in-training huddled in front of a computer in a third-floor room of the hospital. The room was windowless and dark, aside from the light from the screen, which looked as if it had been filtered through seawater. The residents filled a cubicle, and Angela Lignelli-Dipple, the chief of neuroradiology at Columbia, stood behind them with a pencil and pad. She was training them to read CT scans.
“It’s easy to diagnose a stroke once the brain is dead and gray,” she said. “The trick is to diagnose the stroke before too many nerve cells begin to die.” Strokes are usually caused by blockages or bleeds, and a neuroradiologist has about a forty-five-minute window to make a diagnosis, so that doctors might be able to intervene—to dissolve a growing clot, say. “Imagine you are in the E.R.,” Lignelli-Dipple continued, raising the ante. “Every minute that passes, some part of the brain is dying. Time lost is brain lost.”
For roughly three decades, medical robots have assisted surgeons in the operating theater. They provide a steady hand and can make tiny incisions with pinpoint accuracy. But as robotics improve, a new question has emerged: How should autonomous robots be treated? The US Food and Drug Administration (FDA) approves medical devices, while medical societies monitor doctors. A robot that can operate on its own falls somewhere in between. To help, Science Robotics has produced a scale for grading autonomy in robot-assisted surgery. If adopted, it could help regulators decide when and how machines should be treated like humans.
IBM chose December’s annual meeting of the Radiological Society of North America to showcase the ability of its Watson supercomputer to rapidly analyse medical images and suggest a diagnosis. Mark Griffiths, a clinical radiologist at University Hospital Southampton NHS Foundation Trust, who attended RSNA, says he saw some “stunning demonstrations” of the technology, including chest X-rays being “reported in milliseconds.”
Watson is an example of a technology that IBM refers to as “cognitive computing”. Using a form of artificial intelligence known as natural language processing, Watson, a cloud-based system, is able to analyse vast stores of scholarly articles, patient records and medical images. (When IBM acquired Merge Healthcare in 2015, it gained access to the company’s database of 30 billion images.) This ability to interpret written language is what marks Watson out as different from other computer-based tools used to aid diagnosis.
Robots are entering the hospital room, surgery ward and doctor’s office at an increasing rate. Shipments of medical robotics used for surgery, rehabilitation and hospital tasks will triple over the next five years, with revenues jumping to $2.8 billion from $1.7 billion, a recent report from Tractica predicts.
While robotics already enhances procedures such as spine surgery, universities and healthcare technology companies alike are pushing the limits of what’s possible, developing ever more innovative ways to bring robotics into healthcare in the coming years.
As I start to look at the emerging trends of 2017 from the vantage of IndieBio, where we see hundreds of biotech startup applications and technologies per year, a few key themes are already emerging. Even as political landscapes change, science and technology continue to push forward.
From cloud platforms for medical data and hospital smart rooms to artificial intelligence and patient-engagement technologies, the giants of the digital world are threatening to disrupt healthcare.
Leading the pack is IBM and its centerpiece offering, Watson Health. In just the last six months, the company has announced major healthcare initiatives, including a partnership with clinical consultation provider Best Doctors to add Watson’s cancer suite to employee benefits packages, a population health management alliance with Siemens Healthineers, and an effort linking IBM’s PowerAI deep learning software toolkit with NVIDIA’s NVLink interconnect technology. PowerAI is already being used to improve diagnoses and care plans by sifting through patient data.
The sooner a disease is diagnosed, the more likely it is to be well managed or cured. The challenge in finding a disease early is that most of us don’t seek treatment until we have symptoms, by which point the disease has already progressed.
But breakthroughs in nanobiotechnology techniques mean that in five years we will be able to examine and filter bodily fluids for tiny bioparticles that reveal signs of diseases such as cancer before we have any symptoms, letting us know immediately whether we should consult a doctor.