“Well over 300,000 Americans are in graveyards today because of the misinformation, the doubt, the suspicion, the distrust that caused them to say that vaccine is not safe for me. And it continues,” Dr. Francis Collins, the former head of the National Institutes of Health, said in September 2022, citing a KFF estimate.

Disinformation not only continues; it is getting worse. Public trust is eroding, driven by social media and, increasingly, by artificial intelligence (AI). As the COVID-19 pandemic showed, distrust can lead to preventable deaths.

Confidence continues to decline

Last month, the Florida Surgeon General called for a halt to the use of COVID-19 vaccines, saying they can cause permanent harm. The FDA and CDC refuted his claim, but the damage is done. Last September, Florida Governor Ron DeSantis, who backs his Surgeon General, said, “I will not stand by and let the FDA and CDC use healthy Floridians as guinea pigs for booster shots that have not proven to be safe or effective.”

A 2023 UNICEF report warned that confidence in childhood vaccines continues to decline in many countries, and a 2023 Pew Research Center survey found that 28% of Americans said parents should be able to decide not to vaccinate their kids, up 12 percentage points from 2019.

Recently, the U.K. had to launch a campaign to persuade parents to have their kids vaccinated against measles, mumps, and rubella, following a rise in cases and a drop in vaccination rates.

Worse, politicians are among the spreaders of disinformation. People who oppose vaccine requirements are winning seats in state legislatures. In Louisiana, 29 candidates endorsed by a national group that works to defeat vaccine mandates won state elections last fall.

Some groups appear to be more vulnerable. A survey taken a few years ago indicated that Black respondents anticipated discriminatory treatment when seeking care for COVID-19. And the roots of Black distrust of health care go back well before that. As Karen Bullock, a professor at Boston College, told me, “Why should Black people believe that a health care system is going to honor their wishes when, for your lifetime, the system hasn’t honored your wishes?”

Shifting tactics

The Coalition for Trust in Health & Science, an alliance of more than 90 organizations, is working to combat disinformation. Its chairman, Dr. Reed Tuckson, told me, “We live in an age of manufactured mistrust, and it works. People are drowning in…articles, videos, memes and posts. They don’t have a firm grasp on what to believe.”

Social media contributes to the problem. “It’s not clear that social media platforms can or even want to limit disinformation,” Lorien Abroms of George Washington University told me.

Her research found that Facebook tried to remove vaccine misinformation but failed to do so in a way that would have lasting effects. “They may want to limit disinformation, at least for some topics, but actually don’t know how given the dynamics of their system; it’s a moving target,” Abroms said.

Even Big Tech CEOs like Marc Benioff of Salesforce agree that regulators have not done their jobs regarding social media.

Another health emergency is climate change. The most vulnerable to its effects are children, seniors, pregnant women, those with chronic illness, and people living near toxic substances.

And climate disinformation appears to be evolving. Ed Maibach of George Mason University, an expert on climate communication, told me that most disinformation previously denied there was a problem at all. (A certain U.S. president famously claimed that climate change is a “hoax.”)

Now, Maibach explained, since outright climate denialism is less persuasive, “disinformation agents are instead trying to undermine public confidence in alternatives to fossil energy, like wind and solar power, and EVs.”

Adding AI to the mix threatens not just trust in health and science but also trust in public institutions and even democracy itself. The World Economic Forum in Davos recently released a survey of nearly 1,500 experts, business leaders, and policymakers who said that AI-fueled disinformation poses a near-term threat to the global economy.

AI-generated fake news and propaganda could make it easier to sway elections and inflame social conflict. At Davos, Bill Gates predicted that with AI tools, “the bad guys will be more productive.” And Irish Prime Minister Leo Varadkar described AI-generated fake videos of himself peddling cryptocurrency online.

Companies have a big stake in communicating truthfully and transparently with their many stakeholders, including employees, customers, and regulators. Healthy workplaces help employers thrive and compete. Employee health benefits are among their largest expenditures, and disinformation could drive up health care costs and absenteeism.

Susceptibility to disinformation threatens everyone, from school kids to voters to corporate CEOs. We are past the point of simply applying healthy skepticism; we all need to do more to protect ourselves, our families, and our constituents.

Bill Novelli is a professor emeritus at the McDonough School of Business at Georgetown University. He was CEO of AARP and a co-founder of Porter Novelli, the global PR firm. His latest book is Good Business: The Talk, Fight, Win Way to Change the World.

The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.
