Bioethics Panel Discusses Smart Device Disease Diagnosis

The Harvard Medical School is located in Boston. By Melanie Y. Fu
By Paul E. Alexis and Krishi Kishore, Crimson Staff Writers

Researchers, lawyers, and physicians discussed the ethical implications of using smart devices to collect data for diagnosing medical conditions in a virtual panel hosted by the Harvard Medical School Center for Bioethics on Friday.

The panel — which consisted of University of Florida law professor Barbara J. Evans and University of Pennsylvania medicine professor Jason H. T. Karlawish — was moderated by David A. Simon, a research fellow at the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School.

Karlawish opened by describing the traditional clinical routine for gathering medical history from patients with neurological diseases like Alzheimer’s. He said a patient is typically accompanied by a spouse, child, in-law, or even a friend, who provides a second account of the patient’s medical history.

“We call that person the ‘knowledgeable informant,’ with the presumption that this individual has witnessed the day-to-day life of a patient with sufficient vividness and detail,” Karlawish said.

“The history we’ll get from them will inform us about whether the person or the patient has cognitive impairment,” he added.

However, Karlawish said a smart device could serve as an alternative to this “knowledgeable informant.”

“Alexa is just one example of technologies that begin to slip into the role of the knowledgeable informant. What Alexa is able to do is what that spouse does, which is watch the person, monitor their daily function,” Karlawish said.

Karlawish acknowledged that such devices can also threaten a patient’s independence, pointing to the ethical concerns raised by Alexa monitoring patients.

“You begin to see how these technologies become means to threaten someone’s independence, someone’s self-determination, their ability to live the way they want to live because the technology is being used to detect there’s some problems,” Karlawish said.

Evans said there is a trade-off between protecting patient privacy and reaping the potential benefits of data sharing, and that the balance varies by setting. Sharing carries a higher value in clinical health care because of the urgency of the situation, according to Evans.

“The purpose of the encounter is to keep someone alive or protect their health, so it's vital interests that are at stake,” she said. “There's a higher value in sharing the data than in other settings.”

Evans also discussed the utilitarian nature of medical privacy law, which she said often surprises people because it does not closely resemble research ethics.

“It’s very utilitarian and that surprises people because we hope medical privacy law for clinical health care looks like research ethics, and it doesn't,” said Evans.

Nonetheless, Evans said data collected from consumer devices may be neither reliable nor accessible, given that diagnosis is not their primary function.

“I worry about a diagnostic tool that might be learning what normal cognitive function is from a sample of people who use digital assistants,” Evans said. “I’m also concerned how many poor people are in the sample of data collected from these devices.”

Despite these concerns, Karlawish highlighted the crucial role technology has played in treating his neurological patients.

“I have patients who hold up their smartphone and say, ‘Without this, I’d be lost quite literally because the smartphone gives me the directions to get to someplace,’” he said.

Evans also said voice-powered digital assistants can promote “autonomy” in addition to collecting data.

“A digital assistant could help families pick up on signs that a new care arrangement may be needed,” Evans said. “You can envision a technology that would assess the competency of the person using a device and maybe disable some spending and functions when it’s not appropriate.”

Karlawish said he sees the utility of technology in future disease diagnosis, though not without modifications.

“I’m bullish about technology for detection and monitoring. I’d love it integrated into the healthcare system, but I want that healthcare system that humans can use and use it well.”

—Staff writer Paul E. Alexis can be reached at paul.alexis@thecrimson.com.

—Staff writer Krishi Kishore can be reached at krishi.kishore@thecrimson.com.


Tags: Harvard Law School, Harvard Medical School, Harvard in the World