Mass General Brigham findings suggest FaceAge tool could provide objective data to help inform treatment decisions in cancer care and other chronic diseases.
Eyes may be the window to the soul, but a person’s biological age could be reflected in their facial characteristics. Investigators from Mass General Brigham developed a deep learning algorithm called FaceAge that uses a photo of a person’s face to predict biological age and survival outcomes for patients with cancer. They found that patients with cancer, on average, had a higher FaceAge than those without and appeared about five years older than their chronological age. Older FaceAge predictions were associated with worse overall survival outcomes across multiple cancer types. They also found that FaceAge outperformed clinicians in predicting short-term life expectancies of patients receiving palliative radiotherapy. Their results are published in The Lancet Digital Health.
"We can use artificial intelligence (AI) to estimate a person’s biological age from face pictures, and our study shows that information can be clinically meaningful,” said co-senior and corresponding author Hugo Aerts, PhD, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham. “This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians. How old someone looks compared to their chronological age really matters—individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy.”
When patients walk into exam rooms, their appearance may give physicians clues about their overall health and vitality. Those intuitive assessments, combined with chronological age and many other biological measures, may help determine the best course of treatment. However, like anyone, physicians can carry biases about how old a person looks, fueling the need for more objective, predictive measures to inform care decisions.
With that goal in mind, Mass General Brigham investigators leveraged deep learning and facial recognition technologies to train FaceAge. The tool was trained on 58,851 photos of presumed healthy individuals from public datasets. The team tested the algorithm in a cohort of 6,196 cancer patients from two centers, using photographs routinely taken at the start of radiotherapy treatment.
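The release does not specify the model internals, but the setup it describes—a deep network that regresses age from a face photograph—follows a standard transfer-learning pattern. The sketch below is a minimal, hypothetical illustration in PyTorch; the ResNet-50 backbone, L1 loss, image size, and learning rate are assumptions made for illustration, not the study’s actual implementation.

```python
# Hypothetical sketch of an age-regression pipeline of the kind the study
# describes: a pretrained CNN fine-tuned to predict age from face photos.
# All architectural and training choices here are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models, transforms

# Typical face-photo preprocessing (illustrative values).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Replace the classifier head of a pretrained backbone with a single
# regression output: the predicted biological age in years.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()  # mean absolute error, in years

def train_step(images: torch.Tensor, ages: torch.Tensor) -> float:
    """One gradient step on a batch of (face image, age label) pairs."""
    model.train()
    optimizer.zero_grad()
    predictions = model(images).squeeze(1)  # shape: (batch,)
    loss = loss_fn(predictions, ages)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Trained this way on tens of thousands of labeled face photos, the network’s output for a new photo is a FaceAge-style estimate; the gap between that estimate and chronological age is the quantity the study relates to outcomes.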
Results showed that patients with cancer looked significantly older than those without: on average, their FaceAge was about five years above their chronological age. In the cancer cohort, an older FaceAge was associated with worse overall survival, especially in individuals whose FaceAge was older than 85, even after adjusting for chronological age, sex, and cancer type.
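Adjusting a survival association for covariates such as chronological age, sex, and cancer type is conventionally done with a Cox proportional-hazards model. A minimal sketch using the lifelines library follows; the column names and toy data are invented for illustration and are not the study’s data or code.

```python
# Illustrative Cox proportional-hazards analysis: does FaceAge predict
# survival after adjusting for chronological age, sex, and cancer type?
# Data and column names are hypothetical stand-ins for the study cohort.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "survival_months":   [14.0, 32.5, 7.2, 48.1, 21.3, 9.8, 27.4, 5.1],
    "event_observed":    [1, 0, 1, 0, 1, 1, 0, 1],   # 1 = death observed
    "face_age":          [71.2, 58.4, 83.9, 62.0, 77.5, 88.1, 64.3, 90.2],
    "chronological_age": [66, 60, 75, 64, 70, 80, 63, 82],
    "sex_male":          [1, 0, 1, 0, 1, 0, 1, 1],
    # In a real analysis, cancer type enters as one-hot columns; one shown.
    "cancer_type_lung":  [1, 0, 1, 0, 0, 1, 1, 0],
})

# A small penalizer keeps the fit stable on this tiny toy dataset.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="survival_months", event_col="event_observed")

# The hazard ratio exp(coef) for face_age, adjusted for the other
# covariates, tests whether a higher FaceAge predicts worse survival.
print(cph.summary[["coef", "exp(coef)", "p"]])
```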
Survival time near the end of life is difficult to estimate, yet it has important treatment implications in cancer care. The team asked 10 clinicians and researchers to predict short-term life expectancy from 100 photos of patients receiving palliative radiotherapy. Although performance varied widely, the clinicians’ predictions were, overall, only slightly better than a coin flip, even when they were given clinical context such as the patient’s chronological age and cancer status. When clinicians were additionally provided with the patient’s FaceAge, their predictions improved significantly.
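“Slightly better than a coin flip” suggests discrimination near chance, which is conventionally quantified with the area under the ROC curve (AUC), where 0.5 is chance performance. The toy sketch below shows how such predictions are scored; the outcome window and all numbers are invented, not the study’s results.

```python
# Scoring short-term life-expectancy predictions with AUC (0.5 = chance).
# Ground truth and prediction scores below are invented toy values.
from sklearn.metrics import roc_auc_score

died_within_6mo = [1, 0, 1, 1, 0, 0, 1, 0]  # hypothetical outcomes

clinician_only = [0.6, 0.5, 0.4, 0.5, 0.6, 0.4, 0.5, 0.5]
with_faceage   = [0.8, 0.3, 0.7, 0.9, 0.2, 0.4, 0.7, 0.3]

print(roc_auc_score(died_within_6mo, clinician_only))  # 0.5, chance level
print(roc_auc_score(died_within_6mo, with_faceage))    # 1.0 on this toy data
```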
Further research is needed before this technology could be considered for use in a real-world clinical setting. The research team is testing the technology to predict diseases, general health status, and lifespan. Follow-up studies include expanding this work across different hospitals, looking at patients at different stages of cancer, tracking FaceAge estimates over time, and testing its accuracy against plastic surgery and makeup datasets.
“This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age,” said co-senior author Ray Mak, MD, a faculty member in the AIM program at Mass General Brigham. “As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual’s aging trajectory. I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives.”
Authorship: Additional Mass General Brigham authors include Dennis Bontempi, Osbert Zalay, Danielle S. Bitterman, Fridolin Haugg, Jack M. Qian, Hannah Roberts, Subha Perni, Vasco Prudente, Suraj Pai, Christian Guthier, Tracy Balboni, Laura Warren, Monica Krishan, and Benjamin H. Kann.
Disclosures: Mass General Brigham has filed provisional patents on two next-generation facial health algorithms.
Funding: This project received financial support from the National Institutes of Health (HA: NIH-USA U24CA194354, NIH-USA U01CA190234, NIH-USA U01CA209414, and NIH-USA R35CA22052; BHK: NIH-USA K08DE030216-01), and the European Union – European Research Council (HA: 866504).
Paper cited: Bontempi et al. “Decoding biological age from face photographs using deep learning.” The Lancet Digital Health, DOI: 10.1016/j.landig.2025.03.002
Mass General Brigham is an integrated academic health care system, uniting great minds to solve the hardest problems in medicine for our communities and the world. Mass General Brigham connects a full continuum of care across a system of academic medical centers, community and specialty hospitals, a health insurance plan, physician networks, community health centers, home care, and long-term care services. Mass General Brigham is a nonprofit organization committed to patient care, research, teaching, and service to the community. In addition, Mass General Brigham is one of the nation’s leading biomedical research organizations with several Harvard Medical School teaching hospitals. For more information, please visit massgeneralbrigham.org.