
Warnings of a Dark Side to A.I. in Health Care

Futurizonte Editor’s Note: What? AI has a dark side? What is this, Star Wars? Seriously, if AI is taking over health care and almost everything else, AI having a dark side could be a real problem.


Scientists worry that with just tiny tweaks to data, neural networks can be fooled by so-called “adversarial attacks” that mislead rather than help.
Credit: Joan Cros/NurPhoto, via Getty Images

By Cade Metz and Craig S. Smith
March 21, 2019

Last year, the Food and Drug Administration approved a device that can capture an image of your retina and automatically detect signs of diabetic blindness.

This new breed of artificial intelligence technology is rapidly spreading across the medical field, as scientists develop systems that can identify signs of illness and disease in a wide variety of images, from X-rays of the lungs to C.A.T. scans of the brain. These systems promise to help doctors evaluate patients more efficiently, and less expensively, than in the past.

Samuel Finlayson, a researcher at Harvard Medical School and M.I.T. and one of the authors of a new paper in the journal Science on these adversarial attacks, warned that because so much money changes hands across the health care industry, stakeholders are already bilking the system by subtly changing billing codes and other data in computer systems that track health care visits. A.I. could exacerbate the problem.

“The inherent ambiguity in medical information, coupled with often-competing financial incentives, allows for high-stakes decisions to swing on very subtle bits of information,” he said.

The new paper adds to a growing sense of concern about the possibility of such attacks, which could be aimed at everything from face recognition services and driverless cars to iris scanners and fingerprint readers.
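The “tiny tweaks” researchers describe are typically computed from a model’s own gradients rather than chosen by hand. As a minimal sketch of the idea (not the paper’s code), here is the fast gradient sign method of Goodfellow et al., one standard way such perturbations are crafted, written in PyTorch; the model, input tensors, and epsilon value are illustrative assumptions.

```python
import torch
import torch.nn as nn

def fgsm_perturb(model: nn.Module, x: torch.Tensor, label: torch.Tensor,
                 epsilon: float = 0.01) -> torch.Tensor:
    """Return an adversarially perturbed copy of image batch `x`.

    Illustrative sketch: assumes `model` is an image classifier and
    pixel values lie in [0, 1].
    """
    x_adv = x.clone().detach().requires_grad_(True)
    # Compute the classification loss for the true label.
    loss = nn.functional.cross_entropy(model(x_adv), label)
    loss.backward()
    # Nudge every pixel by at most `epsilon` in the direction that
    # increases the loss, then clamp back to the valid pixel range.
    return (x_adv + epsilon * x_adv.grad.sign()).clamp(0, 1).detach()
```

With epsilon set to 0.01, no pixel changes by more than one percent of its range, so the altered image looks unchanged to a human viewer; in published demonstrations, including those in the researchers’ paper, perturbations of this kind have been enough to flip a medical classifier’s output.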

The New York Times

Read the complete original article here.