COVID-19 pandemic: California announces that the state will mandate COVID-19 vaccines for healthcare workers, becoming the first U.S. state to do so. Source: Los Angeles Daily Times