Patterns of facial weakness in patients with myasthenia gravis (MG) can be detected with facial recognition software, according to a new study published in the journal Annals of Clinical and Translational Neurology.

The study aimed to establish whether facial weakness in MG can be automatically quantified and used to diagnose and monitor the disease.


For the study, a team led by Martijn R. Tannemaat, MD, PhD, from the Leiden University Medical Center in the Netherlands, analyzed video recordings from 70 patients with MG and 69 healthy volunteers using 2 different methods. 

The researchers first quantified facial weakness using software that recognizes facial expressions. Then, they trained a deep-learning computer model to classify disease diagnosis and severity using multiple cross-validations on videos of 50 healthy controls and 50 patients.

The researchers reported that the expression of fear, happiness, and anger was significantly decreased in patients with MG compared with healthy controls. Furthermore, they could detect specific patterns of decreased facial movement associated with each emotion. 

The area under the curve (AUC) of the receiver operating characteristic (ROC) curve for the deep learning model was 0.75 for diagnosis, with a sensitivity and specificity of 0.75 and an accuracy of 76%. For disease severity, the AUC was also 0.75, with a sensitivity of 0.93, specificity of 0.63, and accuracy of 80%. 

For the validation set, the AUC for diagnosis was 0.82, with a sensitivity of 1.0, specificity of 0.74, and accuracy of 87%, while for disease severity, the AUC was 0.88 with a sensitivity of 1.0, specificity of 0.86, and accuracy of 94%.

“This study delivers a ‘proof of concept’ for a [deep learning] model that can distinguish MG from [healthy controls] and classifies disease severity,” the researchers concluded.


Ruiter AM, Wang Z, Yin Z, et al. Assessing facial weakness in myasthenia gravis with facial recognition software and deep learning. Ann Clin Transl Neurol. Published online June 9, 2023. doi:10.1002/acn3.51823