Telehealth visits are limited because they lack in-office capabilities such as tools to measure vital signs. A team of researchers from the University of Washington in Seattle and Microsoft has developed a tool to change that.
The researchers developed a method that uses a smartphone or computer camera to capture pulse and respiration signals from real-time video of patients' faces, according to an April 1 news release.
Six things to know about the system:
- The method had previously been presented at a conference in December, but has since been updated to work better with a variety of cameras, lighting conditions and skin colors.
- The system runs on the device rather than in the cloud, which helps preserve patients' privacy.
- It uses machine learning to capture subtle changes in how light reflects off a person's face, which correlate with changes in blood flow.
- It uses that blood flow data to compute a pulse rate and a respiration rate (a simplified sketch of this kind of signal processing follows the list).
- Even with the updates, the team said the system is less accurate on darker skin tones because light reflects differently off them, producing a weaker signal for the camera to pick up. The researchers noted they are working on new methods to solve this issue.
- The researchers are also collaborating with doctors to see how the system performs in the clinic.
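The news release does not describe the underlying algorithm in detail, but the general approach, often called remote photoplethysmography, extracts a brightness signal from video of the face and looks for its dominant periodic components. The sketch below is a simplified, hypothetical illustration of that idea using a plain FFT peak search rather than the team's machine learning models; the `face_brightness` input, frame rate and frequency bands are assumptions for demonstration only.

```python
import numpy as np

def dominant_frequency_per_minute(signal, fps, low_hz, high_hz):
    """Return the strongest frequency within [low_hz, high_hz],
    expressed per minute, using a simple FFT peak search."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()            # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))     # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                      # convert Hz to per-minute

# Hypothetical input: mean brightness of a face region in each video
# frame, sampled at an assumed 30 frames per second for 20 seconds.
fps = 30
t = np.arange(0, 20, 1.0 / fps)
# Synthetic stand-in signal: a ~72 bpm pulse component (1.2 Hz), a
# ~15 breaths/min component (0.25 Hz) and a little camera noise.
face_brightness = (0.02 * np.sin(2 * np.pi * 1.2 * t)
                   + 0.05 * np.sin(2 * np.pi * 0.25 * t)
                   + 0.01 * np.random.randn(len(t)))

pulse_bpm = dominant_frequency_per_minute(face_brightness, fps, 0.7, 4.0)
resp_rate = dominant_frequency_per_minute(face_brightness, fps, 0.1, 0.5)
print(f"Estimated pulse: {pulse_bpm:.0f} bpm, respiration: {resp_rate:.0f} breaths/min")
```

In practice, the researchers' on-device machine learning replaces this hand-tuned step, which is part of what lets their system adapt to different cameras, lighting conditions and skin tones.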