American Association for Physician Leadership

Medical Misinformation, Tech Platforms, and the Threat to Our Public Health

Michael J. Sacopulos, JD


Kara Swisher


July 1, 2022


Physician Leadership Journal


Volume 9, Issue 4, Pages 52-56


https://doi.org/10.55834/plj.7355438898


Abstract

This engaging SoundPractice episode is a tutorial for physicians on the tech landscape, the lack of accountability for tech companies, and an overview of Section 230 of the Communications Decency Act. Host Mike Sacopulos and Kara Swisher also discuss how social media has turbocharged anti-vax messaging, putting physicians on the front lines with patients; how platforms such as Facebook, YouTube, Amazon, and Reddit proliferate disinformation; and what physicians and healthcare professionals can do to help stem the false information.




Incoming FDA Commissioner Robert Califf recently announced his concern about how social media magnifies and disseminates false information about science. This engaging episode of SoundPractice is a tutorial for physicians on the tech landscape, the lack of accountability for tech companies, and an overview of Section 230 of the Communications Decency Act.

Host Mike Sacopulos and journalist Kara Swisher also discuss how social media has turbocharged the anti-vax messaging; how platforms such as Facebook, YouTube, Amazon, and Reddit proliferate disinformation; and what physicians and healthcare professionals can do to help stem the false information.

This transcript of their discussion has been edited for clarity and length.

Mike Sacopulos: Recently, incoming FDA Commissioner Robert Califf announced his concern that misinformation about science is increasingly prevalent. Social media, whether intentionally or unintentionally, magnifies, propagates, and disseminates misinformation quickly. Mark Twain famously said, “A lie can travel halfway around the world before the truth puts on its shoes.” If Twain had Facebook, that lie would’ve made it farther.

In this episode, we speak with a national expert on the impact of social media. Kara Swisher is a thought leader and journalist par excellence prepared to tackle one of the vexing issues of our age. Author, technology columnist, and podcast host Kara Swisher has been on the tech scene since the early ’90s. She has written for The Wall Street Journal and The Washington Post. Welcome, Kara.

Kara Swisher: Thanks very much.

Sacopulos: SoundPractice is a podcast to educate physicians on issues that have an impact on their patient care and their positions as physician leaders. Certainly, platforms such as Facebook, Twitter, and Google have substantial control over information Americans see. And I know you’ve been vocal about the lack of regulation surrounding tech companies and the content that they are producing.

Of course, what comes to mind is COVID-19 misinformation, right? We know the playbook: COVID isn’t dangerous, the vaccine is dangerous, and you can’t trust doctors or scientists. How did we end up here?

Swisher: Well, I think it’s been part of a long stretch of the ability of these companies to act as platforms and not as media companies. They are media companies in many ways, and they get the protections of platforms, which come from a law that was passed many, many years ago called Section 230.

The companies that are running these platforms have no responsibility to put out accurate information, and they become de facto news organizations; it’s how people get their information. Whether they like it or not, these tech companies are media companies. You’ve seen it just recently around Joe Rogan and Spotify. In the case of Facebook, they don’t pay most of the contributors, and they certainly do make editorial decisions. It’s been growing for a long time.

Lenny Pozner, who’s a parent of one of the kids murdered at Sandy Hook Elementary School, called what was happening “the canary in the coal mine.” It was Alex Jones who posited that this tragedy was a hoax and that the kids were child actors. Most of that happened online and Jones was joined by a lot of people in what Pozner called a giant party to do detective work on something fake, including putting up fake memes.

Going from there, the platforms did nothing. Pozner has struggled to get the false information taken down. He was continually bullied, demonized, and attacked. The tragedy just added onto itself, and these platforms did nothing to stop it. It should have been a warning signal to us what was happening there.

It took them forever to get Alex Jones off the platform, pretending it was a free speech issue, which it is not — it is a false information issue. It’s about a lot of things, but the companies like to wrap themselves in the First Amendment.

So here we are today, same thing, except they’re better at it. Different forces are doing it; this time it’s anti-vax people. You can argue about the efficacy of masks. You can argue about mandates. That’s certainly a worthy thing for a democracy to argue about. You can argue about the importance of public health over individual responsibility, over individual freedom. That’s a worthy thing for a democracy to discuss.

But what happens is the platforms flood the zone with all kinds of misinformation or give equal time to patently false things and then do nothing about it. Spotify is the perfect example of this. They’re trying to pretend they’re just a platform, but they have a media relationship with someone. Rogan put forward some speakers that he didn’t push back on; he just allowed them to say whatever they pleased.

Rogan has a very powerful platform. The issue is with the platform itself and not necessarily with Joe Rogan. The continuum of these platforms allowing this to happen is a really fascinating way that we allow the technology to take over and control us.

Sacopulos: Facebook, now Meta, has a market cap of $800–$900 billion, although that’s dropped a little bit recently. Facebook and other tech companies have reported that they just can’t stay on top of all of the disinformation. It seems they would have the resources, or be able to deploy solutions, to stay on top of it. Is the situation that they can’t, or that they just don’t want to?

Swisher: Well, it’s complex, obviously. The fact of the matter is Facebook has never been more lucrative, but they don’t want to pay for it. It’s costly to do what they need to do. At one point, Mark Zuckerberg, one of the founders of Facebook, said, “You can’t control this. It’s very hard to do. You can’t moderate it as easily.” And I remember saying to him, “Why did you build a platform you can’t moderate? Too bad; it’s your responsibility, it’s on you.”

I don’t think they don’t want to control it; they’ve made all kinds of efforts to do so. Ultimately, it’s the architecture of the situation, which allows no editorial control except when they feel like it. They edit all the time. That’s the irony. They say, “We’re not editors.” I say, “Well, why did you take this out? Why did you take that out?”

Zuckerberg changed his mind on Holocaust deniers. One day they were fine, the next day they weren’t. They are at the mercy of this guy who just changes his mind, but never cares to be accurate.

Under the laws that we have, there are no ways for anybody, including someone like Lenny Pozner, to fight back against misuse. There’s the inability to sue the platforms, no liability, free and unfettered power, and no regulation. We’re at the mercy of one or two people who try to tell us what our society should do. I think that’s a prescription for disaster.

Sacopulos: You alluded earlier to Section 230. [Section 230 is part of the Communications Decency Act that was passed in 1996.] That is where we’re headed right now. Can you tell our audience a little bit about what that section says and why it’s problematic?

Swisher: It was part of the larger act, much of which was declared unconstitutional by the Supreme Court on the grounds it infringed on the right of free speech. But this particular part was to protect online platforms that host or republish third-party content against laws that otherwise might hold them responsible for that content.

It’s very short and was amended by the Stop Enabling Sex Traffickers Act to require removal of material that violates federal and state sex trafficking laws. It seems like they may try to also revoke protection for health “misinformation.”

Sacopulos: One suggestion has been to say that algorithms are not covered under Section 230, correct? That an online platform that uses an algorithm to recommend content to a user based on their personal information is liable if the content contributes to a physical or emotional injury.

Swisher: Yes!

Sacopulos: Maybe the content that is placed onto a platform by a third party would be covered, but how it is displayed or prioritized by the platform would not come under the 230 exemptions. How do you feel about that?

Swisher: Like any law, there can be real problems. One of the laws that I have a problem with is the one where one agency of the government, I think it is Health and Human Services, gets to decide what’s science. Well, the last administration recommended bleach as a cure, so we don’t want that to happen, right? I don’t want to put it in the hands of a government agency to say what is science.

The problem is that this stuff goes in every direction. It’s COVID and healthcare today; before that it was claims of child actors acting out a massacre [Sandy Hook] that obviously happened, or even further back, denying the Holocaust.

Of course, there were the election lies. What do you do about Trump putting out election lies? Well, they pulled him off the platforms, which is editing, right? They edited him off the platform, whether they like to say it or not. I think they used the concept of deplatforming, but it is editing. “We’ve had enough of you and you’ve broken our rules and you’re gone.” It’s more like kicking someone out of a bar, I guess. “Get out of our bar. We don’t like you” or “You broke our rules.”

Change is going to be difficult. The U.S. government hasn’t done anything, but in Europe, they’re doing a lot of things. They’re in a big beef with Facebook over where servers are located. Facebook wants to put more servers in Europe, and Facebook has threatened to leave. Europe’s answer is, “Bye, don’t let the door hit you on the way out.”

In Europe, the struggle is often around privacy issues, which is, I think, where the U.S. should focus. A focus on privacy gets at the heart of their business model where it may effect change in other ways.

There are lots of different suggestions, lots of laws, lots of enforcement, but algorithms are one way to do it. Giving money to our enforcement agencies would be nice. I just interviewed a major government official who said, “We’re too small to fight these companies.” Honestly? The U.S. government’s too small to fight these companies? Think about that.

Sacopulos: Is that the inverse of too big to fail: too small to regulate?

Swisher: Yeah. Lina Khan, the head of the FTC, has been a critic of tech companies. Facebook and Amazon are trying to get her recused from antitrust investigations of their companies. But that’s why she got hired, because she’s a critic of tech companies. She has correctly noted that the number of mergers has just ballooned.

She’s got a staff that’s smaller than it was in the 1980s, I think. Money for expansion is stuck in Congress. They can’t pass basic laws; everything gets sucked up into these big bills. The money for enforcement for both the FTC and the Justice Department was in one of those bills.

Sacopulos: Maybe it’s just stating the obvious, but it seems a bit dangerous to have the enforcers publicly admit that they’re unable to properly enforce. Right? That’s almost a green light to go forward and do what you like.

Swisher: That’s what they’re doing. They just flooded the agency with all kinds of mergers. At some point, the FTC’s got to pick one and make an example of it, but what if the FTC loses? So you have to be super careful, and it makes for a very cautious agency because you don’t want to blow it by choosing the wrong example.

My issue is that all these companies are unaccountable. You may hate your politicians, whatever side you’re on. You may think politics stink. But at least politicians are elected, even if they are involved in corruption. Nobody elected Facebook, except by using it. That’s not an election; it’s a consumer choice.

Sacopulos: Emergency room physicians are certainly now on the front lines for the anti-vax rhetoric coming from unvaccinated patients, and pediatricians have been dealing with this for many years. The anti-vax movement is not new, and people somehow forget that we’ve had outbreaks of measles in Oregon and Orange County, California, all tied back to the anti-vax messaging. So it appears that social media has turbocharged all of this. Would you agree?

Swisher: Oh yes, 100%. They’ve gotten turbocharged by the internet and gained the ability to use these tools. These are malevolent players following a playbook, and they’re getting better at it. They pop up here, they pop up there. It’s hard to regulate them.

Facebook and others have tried hard to regulate them. It’s just that they waited too long. On one hand, it’s great for people to protest whatever they feel like protesting; on the other, it creates the ability to put out false information.

I don’t mind if people have an argument, “Here’s my take and this is what it is.” And I’m sure doctors hate it. You can only do so much with people, but there is such a thing as the public good and public health. We all agree not to go through a stop sign.

Sacopulos: We’ve seen some specific targeting that is frightening, verging on monstrous. Examples are the Orthodox Jewish community and the Somali community, which were targeted specifically with misinformation about vaccination. And so we’ve seen incredibly low vaccination rates and correspondingly higher infection rates. Any wonder why physicians have difficulty gaining trust around vaccination in communities like these?

Swisher: Issues around religion are very difficult to parse out. There’s so much false information, it’s hard to fight against it. You can show people numbers and statistics, but I showed numbers and statistics to a relative who said, “Well, that’s your opinion.” I replied, “No, these are the facts.”

What do you do with the person who says, “This is my belief?” You have to get the government involved, and these people have to be isolated, I guess. I don’t know. That’s the discussion we need to have. If someone is not going to join in something that’s a public health emergency, maybe that’s something our public officials have to start to talk about.

Of course, regulations are where the real problems are. You go to Florida, you have one set of rules. You go to New York, you have another set of rules. You go to San Francisco, you have another set of rules. So I’m not sure it’s ever going to be resolved. The diversity has been made more powerful by internet tools for people to be able to press their particular case.

Sacopulos: Some social media platforms, Twitter and Reddit among them, have started to do more to stem the flow of misinformation, particularly during the pandemic. Twitter is labeling manipulated media, and multiple platforms now have reporting options so things can be flagged. Isn’t this somewhat of a game of Whack-a-Mole?

Swisher: Yes, that’s exactly what it is. There’s no other way to put it. People get more and more sophisticated in how to manipulate these platforms. I would not want to have to deal with this, but social media platforms need to impose stronger editorial rules on this stuff, and sometimes they’ll get it wrong. They’re fearful of getting it wrong, more than they are fearful of removing wrong things from the platform. That happened a couple of years ago when Facebook was taking down breast cancer information.

You can make mistakes, then you correct them. One of the biggest problems is you don’t have a lot of choices; that’s an issue around power and consolidation. And that’s, again, a thing for our government officials to deal with.

Why are these companies allowed to get so big that there are no other options? When Parler got knocked off Amazon and Google and Apple for violating its rules, you’ve got to wonder why it was Parler and not Facebook, since Facebook was one of the biggest purveyors of misinformation around January 6th. They were the tools, the gathering point, and the purveyor of misinformation. Parler didn’t behave right either, but why them versus Facebook? Think about that.

I don’t agree with most of these people who think the platforms are heinous in so many ways, but at the same time, should you build more and more things so that people can have their platforms to disseminate their information? I think concentration’s a real problem.

Sacopulos: Right. And it seems like another alternative is to break up the key platforms, correct?

Swisher: Yes.

Sacopulos: Not unlike what we did with Ma Bell, where we broke it into different telephone companies.

Swisher: There are no choices. Trump is supposedly creating Trump [Truth] Social. We’ll see. But there are a lot of social networks: there’s Rumble and GETTR and Parler and MeWe, and you can pick whatever political side you want to be on, but there should be lots of choices. That’s the best way to deal with this stuff. When there’s one choice, it’s naturally going to attract all kinds of attention and become a lightning rod.

For example, the deplatforming of Donald Trump. Most people agree he constantly broke the rules of all the platforms he was on. Finally, they took him off at that critical moment, essentially meaning that they didn’t want to be handmaidens to sedition. What was interesting is that I agreed with the decision, but I was disturbed that two people made it. Two unelected people made the decision to deplatform the president of the United States. That’s something, if you think about it.

Sacopulos: In general, I think you’d agree with me that the social media communities do not police themselves very well.

Swisher: No, they don’t police anything very well. I use a metaphor that they own the city, and they collect all the rents, but they don’t provide police and fire protection, stop signs, garbage collection, and they don’t fill the potholes. It’s like the worst-run city in America. You can complain all you want about various cities in America, but those cities look fantastic in comparison to these platforms.

I think you wouldn’t put up with that for 14 seconds in real life. I wouldn’t.

Sacopulos: Give me some ideas on how we can involve physicians to make them part of the conversation or to get lawmakers to make changes here, to get some degree of accountability, because it is a matter of life and death.

Swisher: It is indeed. Start with the premise that you’re never going to achieve a perfect society like that. You’re always going to have people who go back to the old adages like, “Drink vinegar. That’ll cure cancer” or whatever it happens to be. My grandmother had 90 different solutions to all kinds of illnesses. For migraines, she had something with oil and a knife under the bed.

But I think one strategy is strengthening the public’s view of science as a trusted institution. One of the problems is that it’s been claimed that science can’t make mistakes. It makes mistakes, and that’s very obvious to a lot of people. So how do you then get people into the mode of thinking that this is a group that’s trying to help you and trying to figure it out, and sometimes they’re wrong, but they’re almost always right on things.

These platforms can’t be the only place for this information. The government has to do a better job of disseminating information far and wide. Doctors have to realize that. Someone in the government told me the most effective way to convince people to get a vaccination is by hearing it from the doctors themselves, who have a close relationship with their patients and take the time to talk to them.

Lobbying Congress to break up some of these companies and encourage innovation may result in more ways to disseminate information that isn’t just confined to one or two companies. Google has 97% of the search market. Are they a utility? Yeah, kind of. Facebook has a huge share. Then of course, there’s public pressure. Amazon recently stopped selling a chemical that people used for suicide after much media and public pressure.

That’s another way to do it. Tell your stories, tell the stories of what can happen if we do or don’t take action. And then talk a lot about the idea that there should be more choices for people to get their information. It shouldn’t be a single place. But if it is a single place, the people in charge have to do a better job of moderating what’s happening there.

Sacopulos: Do you think that people will want multiple channels? We segregate and pick one media channel, even though there may be many available, not unlike Sunday morning being the time of the week that we’re most segregated from one another. Right? Do multiple platforms and channels matter, or are people just self-segregating?

Swisher: That’s a problem. Multiple platforms and channels could matter. It’s hard because people do self-segregate, but they’ve done that since the beginning of time. I don’t think that’s a fresh new thing. I just don’t know what can be done about that; that’s human beings, right? We certainly don’t have to give them more tools to separate themselves further than they already are.

Sacopulos: You have your finger to the wind on this. Tell me, do you feel like we’re making any progress or are we still right in the early days?

Swisher: One thing that’s great about tech is the big leaders die eventually. AOL used to be the thing, but I don’t know what happened to AOL or to Yahoo. Look at Facebook’s recent earnings; there are problems there, lots of problems besides investing in the Metaverse.

Metaverse is a good example. Maybe we should have rules about that before we get there instead of letting them make up the rules as we go along. As we move into deeper AI, maybe the government should be more involved. Maybe we should think more about safety and apply more pressure.

The issue is that the companies don’t have to do anything; they are not compelled to do anything. If they’re not compelled, everyone’s like, “Oh, how could they do that?” I say, “Well, because we create a system where they can.”

People are arguing about how Elon Musk, or whoever, is paying taxes. Musk is taking advantage of the system that exists. What do you want him to do? People say, “Well, he should be a better person.” Maybe he should. Maybe he shouldn’t. I don’t know. These are people availing themselves of the system as it exists. If we want a better system, we need to create a better system.

At this point, I’d love to blame the tech companies, but ultimately it comes down to us and the government. We choose to do these things. We get the results we deserve if we don’t have the laws in place to guarantee the outcomes that we want.

Taxes are a good way to look at that. The government taxes people’s income, not their stock. Well, change that. I assume you get more from income than from stock. Because you’re a doctor with high income, you’re paying the load. Right? Well, change that so that everybody has to pay, no matter what the source of wealth might be.

Rich people shouldn’t be able to just buy stock, borrow from it, and never pay tax on it. Pass a law to change that. It’s the same thing with social media. They should moderate better, and if they don’t, they should be liable. Make it a law. That’s what democracy’s all about. Unfortunately, it’s achingly slow, and in this case, quite damaging. We’ve seen so clearly around public health how harmful lagging legislation can be.

Sacopulos: I’ll let that be the last word. Kara, thank you so much for your time.

Michael J. Sacopulos, JD

Founder and President, Medical Risk Institute; General Counsel for Medical Justice Services; and host of “SoundPractice,” a podcast that delivers practical information and fresh perspectives for physician leaders and those running healthcare systems; Terre Haute, Indiana; email: msacopulos@physicianleaders.org; website: www.medriskinstitute.com


Kara Swisher

Kara Swisher is an author, technology columnist, and podcast host. Swisher has been on the tech scene since the early 1990s. She has written for The Wall Street Journal and The Washington Post, and started the All Things Digital Conference with Walt Mossberg of The Wall Street Journal in 2003. Today, she is an editor-at-large for New York Media, a contributor to The New York Times, and a host of the podcasts Sway and Pivot.
