AI and Facial Recognition Dive Into Global Health Care
In a rapidly digitizing world, our faces (wrinkles, pimples, beauty marks, and all) have become one of our most valuable digital identifiers. Facial recognition technology (FRT), which uses artificial intelligence (AI) to match facial images, has made identifying a person easier than ever. But beyond identification, FRT is slowly making its mark in health care, from scanning faces to control who comes in and out of health facilities to analyzing facial expressions to gauge a person's health.
As health-care organizations around the world embrace FRT, concerns about privacy, data security, and bias in its algorithms require a deeper dive to understand whether institutions are ready for what it brings.
FRT in Health-Care Settings
Patient safety is one emerging application of FRT. “It’s being used for monitoring . . . for example, [long-term care homes] for older people to monitor comings and goings,” says Nicole Martinez-Martin, assistant professor at the Stanford Center for Biomedical Ethics. The technology can identify patients, match medical records, and secure and audit people’s access to certain areas within a facility. In Los Angeles, Martin Luther King Jr. Community Hospital and the company Alcatraz AI implemented an FRT system to enhance security in server rooms where private data and technology are stored.
“There’s also been work around using it for genetic diagnostic purposes,” says Martinez-Martin. She points out that FRT can detect rare genetic diseases with recognizable facial patterns and features more quickly. A popular example used by health-care professionals globally is the Face2Gene app. The AI program can compare facial features with those linked to certain conditions, suggesting the most likely matches. A Japanese study found that Face2Gene had an 85.7% accuracy rate for a cohort of individuals with congenital dysmorphic syndromes. Beyond genetic screening, FRT can also detect emotions and behaviors that are associated with conditions such as autism, says Martinez-Martin.
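Conceptually, tools of this kind rank candidate conditions by how closely a face's measured features resemble reference patterns for each condition. The sketch below illustrates that general idea with cosine similarity; the condition names, feature vectors, and matching method are hypothetical stand-ins, not Face2Gene's actual algorithm.

```python
# Conceptual sketch of condition matching from facial features.
# All names and numbers are hypothetical; this is NOT Face2Gene's method.
import numpy as np

# Hypothetical reference feature vectors, one per condition.
condition_profiles = {
    "Syndrome A": np.array([0.9, 0.1, 0.3]),
    "Syndrome B": np.array([0.2, 0.8, 0.5]),
    "Syndrome C": np.array([0.4, 0.4, 0.9]),
}

def rank_conditions(face_vector: np.ndarray) -> list[tuple[str, float]]:
    """Rank conditions by cosine similarity between the face's features
    and each condition's reference profile (highest similarity first)."""
    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scores = [(name, cosine(face_vector, ref))
              for name, ref in condition_profiles.items()]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

# A face whose features lean toward the "Syndrome A" profile.
print(rank_conditions(np.array([0.85, 0.15, 0.35])))
```

In a real system, the feature vectors would come from a model trained on labeled patient photos, and the ranking would feed a clinician-facing suggestion list rather than a diagnosis.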
“Pain screening is another area where there is discussion,” she adds. PainChek, an Australian FRT app, can detect pain in those living with dementia by tracking facial muscle movements. The company stated last summer that it planned to seek approval from the U.S. Food and Drug Administration.
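In broad terms, this class of tool detects facial muscle movements (action units, in the Facial Action Coding System) and converts them into a pain score. The sketch below shows that general idea; the chosen action units and the simple count are illustrative assumptions, not PainChek's actual algorithm.

```python
# Conceptual sketch: scoring pain from detected facial action units (AUs).
# AU codes follow the Facial Action Coding System; treating each detected
# pain-related AU as one point is an illustrative simplification.
PAIN_RELATED_AUS = {
    "AU4": "brow lowering",
    "AU6": "cheek raising",
    "AU7": "eyelid tightening",
    "AU9": "nose wrinkling",
    "AU10": "upper lip raising",
    "AU43": "eye closure",
}

def pain_score(detected_aus: set[str]) -> int:
    """Count how many pain-related action units appear in a face scan."""
    return sum(1 for au in detected_aus if au in PAIN_RELATED_AUS)

# A scan detecting brow lowering, eyelid tightening, and a smile (AU12):
# only the two pain-related movements contribute to the score.
print(pain_score({"AU4", "AU7", "AU12"}))  # -> 2
```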
“The idea is, let’s take away clinical subjectivity,” says David Allsopp, head of business development at PainChek. St. Michael’s Health Group (SMHG), one of two Canadian long-term care homes in Alberta that piloted the technology, has been using it for less than a year and has seen major benefits.
Tatsiana Haidukevich, director of care at SMHG, says that one resident with behavioral concerns was screened and found to have high pain levels. With consistent monitoring, staff replaced the resident's behavioral medication with pain medication, alleviating the resident's symptoms.
“I do [see this technology expanding] . . . not only in nursing care homes but hospitals as well,” she says.
Racial and Gender Bias in FRT
A 2018 study on FRT shed light on a serious problem with the technology: its racial and gender bias. The study showed that FRT is 34% less accurate in identifying darker-skinned female faces than lighter-skinned male faces. This disparity is often traced to a lack of diverse data in training and testing and to biases embedded within FRT algorithms.
“I do not see myself advising any agency using it on darker-skinned faces,” says Gideon Christian, assistant professor in AI and law at the University of Calgary. “It’s a problem that has to be addressed at the design stage, not at the deployment stage.”
In 2020, two Somali women in Canada had their refugee status revoked after they were mistakenly identified as Kenyan based on border officials’ alleged use of FRT. In 2023, a pregnant Black woman living in Detroit, Michigan, was mistakenly arrested in front of her children based on false identification through FRT used by police.
The shortcomings of FRT, particularly for Black women, can mean the difference between life and death, especially in institutions and systems that serve the public, including law enforcement and health care. The increased use of masks in health-care settings since the COVID-19 pandemic presents an added challenge for FRT. A study analyzing the performance of the FaceVACS FRT software found that its accuracy dropped from 99.7% to 33.5% when it was tested on facial images in which the areas covered by a mask were blocked out.
“We’re doing more research with African American and Latino cohorts,” says Allsopp, regarding the training of the PainChek algorithm. “The validity of the tool has been fairly even between Indigenous Australians and Anglo-Saxon people.”
In addition to facial identification, health-focused FRT applications such as PainChek can interpret cues such as body movements and speech patterns to paint a fuller picture of an individual’s well-being.
“The way machine learning is trained relies on subjective judgments . . . how [people] express emotions is part of the process,” says Martinez-Martin. A study that used FRT to interpret the emotions of basketball players in photographs found that negative emotions were attributed to Black men’s faces more often than to white men’s, even when the men were smiling.
“People react differently . . . some are more stoic,” says Haidukevich. SMHG staff found that some residents refrain from expressing emotions with their faces because of cultural norms and upbringing.
To compensate for this, SMHG uses PainChek as just one component of pain assessments rather than relying solely on the tool.
Facial Recognition Technology and Privacy Concerns
Recent scandals involving misuse of FRT have raised alarm bells about data sharing and privacy. Clearview AI, an FRT startup, infamously came under fire in 2021 for illegally scraping facial photos from social media websites to build a database used by governments and police departments globally, resulting in multiple lawsuits and restrictions against the company. In 2023, the Federal Trade Commission charged the U.S. pharmacy chain Rite Aid over its use of FRT to surveil shoplifters in predominantly low-income, non-white neighborhoods. “Lack of knowledge as to how intrusive these technologies are has created a societal tolerance of the technology,” says Christian.
The transformation of faces into data may seem harmless, but the security risks associated with data sharing cannot be ignored. “It’s the same reason you don’t leave your house open for strangers . . . you have the right to autonomy within that space,” says Christian.
Data is a resource, and public education is critical to ensuring that individuals are aware of how their data is being used. FRT goes beyond just taking a photo: the algorithms convert that photo into a mathematical map of the face, which is classified as biometric information. Christian warns that consenting to a photo or video does not amount to consenting to biometric capture. This issue was highlighted during a 2023 investigation of the Canadian Tire retail chain, which failed to obtain legal consent from customers when it used FRT to capture their biometrics.
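To make that concrete, here is a minimal sketch of the photo-to-faceprint conversion using the open-source face_recognition library; the file names are hypothetical, and the code illustrates the general technique rather than the software used by any organization mentioned here.

```python
# Minimal sketch of how a photo becomes biometric data, using the
# open-source face_recognition library (pip install face_recognition).
# File names are hypothetical; this illustrates the general technique only.
import face_recognition

# Load an enrolled photo and a newly captured one.
known_image = face_recognition.load_image_file("enrolled_photo.jpg")
new_image = face_recognition.load_image_file("captured_photo.jpg")

# Each face is reduced to a 128-number vector: the "mathematical map"
# of the face. This vector, not the photo itself, is the biometric record.
# (face_encodings returns one encoding per face found in the image.)
known_encoding = face_recognition.face_encodings(known_image)[0]
new_encoding = face_recognition.face_encodings(new_image)[0]

# Identification is a distance comparison between vectors against a
# tunable threshold; threshold choice is one source of false matches.
distance = face_recognition.face_distance([known_encoding], new_encoding)[0]
is_match = face_recognition.compare_faces([known_encoding], new_encoding,
                                          tolerance=0.6)[0]
print(f"match={is_match}, distance={distance:.3f}")
```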
“There was a concern about privacy . . . we wanted to make sure that everyone knows what we’re doing,” says Haidukevich.
SMHG initially faced questions from residents and families about whether pictures were being taken and stored within PainChek. Education efforts helped curb fears and clarify how the tool would be used.
“There [are] no photos taken, there [are] no videos taken . . . we take a very limited set of data in regard to looking at [personal information],” says Allsopp. In its privacy policy, PainChek outlines that data may be used to “verify an individual’s identity” or when it is necessary for the company’s legitimate interests or those of a third party, including legal obligations.
FRT has the potential to transform the way that health care is delivered through enhanced communication, better diagnoses, and improved safety. But a critical perspective is necessary to ensure that patients and health-care professionals are fully aware of FRT’s limitations and risks, and to advocate for a technology that will truly work for all.