A five percent decrease in measles vaccination rates could cause three times as many U.S. children to catch the virus annually, according to a study published in JAMA Pediatrics.
For the study, researchers used public data from the CDC to simulate vaccination rates for the measles, mumps and rubella (MMR) vaccine in children ages 2 to 11 years. They created a mathematical model of infectious disease transmission to estimate outbreak size and distribution in relation to vaccine coverage.
MMR coverage among U.S. children ages 2 to 11 years is about 93 percent, the researchers note. If this rate dropped to 88 percent, the number of children infected with measles would triple, based on the model's estimates. This increase in infections would cost federal health programs $2.1 million, not including hospital bills.
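The study's model is not published as code here, but a toy transmission model makes the mechanics concrete. The sketch below is a simple deterministic SIR simulation, not the researchers' actual model; the population size of 1 million, R0 of 15, eight-day infectious period, 10 seed cases and the function name `outbreak_size` are all illustrative assumptions, so its numbers will differ from the study's.

```python
# A minimal sketch of how a transmission model links vaccine coverage to
# outbreak size. This is NOT the study's model: population size, R0 and
# infectious period are illustrative assumptions.

def outbreak_size(coverage, population=1_000_000, r0=15.0,
                  infectious_days=8.0, seed_cases=10.0):
    """Run a simple deterministic SIR epidemic; return cumulative infections."""
    gamma = 1.0 / infectious_days      # daily recovery rate
    beta = r0 * gamma                  # daily transmission rate
    s = population * (1.0 - coverage) - seed_cases  # susceptibles at day 0
    i = seed_cases                     # currently infectious
    total = seed_cases                 # cumulative infections

    while i > 0.5:                     # step one day at a time until it dies out
        new_cases = min(beta * s * i / population, s)
        s -= new_cases
        i += new_cases - gamma * i
        total += new_cases
    return total

if __name__ == "__main__":
    base = outbreak_size(0.93)         # roughly current MMR coverage
    drop = outbreak_size(0.88)         # the 5-point decline the study modeled
    print(f"Infections at 93% coverage: {base:>10,.0f}")
    print(f"Infections at 88% coverage: {drop:>10,.0f}")
    print(f"Relative increase:          {drop / base:>10.1f}x")
```

This toy version will not reproduce the study's exact figures, since the researchers' model was fit to CDC data and is more detailed, but it shows the same threshold behavior: at 93 percent coverage the effective reproduction number sits barely above 1, so even a 5-point drop produces a disproportionately larger outbreak.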
"Given increasing parental decisions to not vaccinate their children, we wanted to understand the effect of small reductions in vaccine coverage on overall measles cases," study co-author Nathan Lo of Stanford (Calif.) University School of Medicine told Reuters via email. "We found that small declines in vaccine coverage can really reduce the 'herd immunity' effect and result in more frequent and larger outbreaks of measles."
Researchers note the spike in measles cases and related costs would be even larger if the model included figures for infants, teens and adults.