Abstract:
This engaging SoundPractice episode is a tutorial for physicians on the tech landscape, the lack of accountability for tech companies, and an overview of Section 230 of the Communications Decency Act. Host Mike Sacopulos and Kara Swisher also discuss how social media has turbocharged anti-vax messaging, putting physicians on the front lines with patients; how platforms such as Facebook, YouTube, Amazon, and Reddit proliferate disinformation; and what physicians and healthcare professionals can do to help stem the flow of false information.
This transcript of their discussion has been edited for clarity and length.
Mike Sacopulos:
Recently, incoming FDA Commissioner Robert Califf announced his concern that misinformation about science is increasingly prevalent. Social media, whether intentionally or unintentionally, magnifies, propagates, and disseminates misinformation quickly. Mark Twain is famously credited with saying, "A lie can travel halfway around the world before the truth puts on its shoes." If Twain had had Facebook, that lie would've made it even further.
In this episode, we speak with a national expert on the impact of social media. Kara Swisher is a thought leader and a journalist par excellence, prepared to tackle one of the most vexing issues of our age. An author, technology columnist, and podcast host, Kara Swisher has been on the tech scene since the early 1990s. She has written for The Wall Street Journal and The Washington Post. She started the All Things Digital conference with Walt Mossberg, also of The Wall Street Journal, back in 2003. Later, she started the website Recode, which Vox acquired in 2015. These days, Kara Swisher is an editor-at-large for New York Media, a contributor to The New York Times, and host of the podcasts Sway and Pivot. Kara, welcome to SoundPractice.
Kara Swisher:
Thanks very much.
Mike Sacopulos:
SoundPractice is a podcast to educate physicians on issues that have an impact on their patient care and their roles as physician leaders. Certainly, platforms such as Facebook, Twitter, and Google have substantial control over the information Americans see. And I know you've been vocal about the lack of regulation surrounding tech companies and the content they produce. Of course, what comes to mind is COVID-19 misinformation, right? We know the playbook: COVID isn't dangerous, the vaccine is dangerous, and you can't trust doctors or scientists. How did we end up here?
Kara Swisher:
Well, I think it's been part of a long stretch of the ability of these companies to act as platforms and not as media companies. They are media companies in many ways, and they get the protections of platforms, which come from a law passed back in 1996 called Section 230. The companies that are running these platforms have no responsibility to put out accurate information, yet they become de facto news organizations; it's how people get their information. Whether they like it or not, they're media companies. You've seen it just recently around Joe Rogan and Spotify.
In the case of Facebook, they don't pay most of the contributors, and they certainly do make editorial decisions. It's been growing for a long time. I just did an interview today, now posted, with Lenny Pozner, one of the parents of the kids who were murdered at Sandy Hook Elementary School. He called it the canary in the coal mine of what was happening. Someone who had been around and continues to make a lot of trouble online, Alex Jones, started to posit that this tragedy was a hoax and that the victims were child actors. Most of that happened online and was joined by a lot of people in what Mr. Pozner called a giant party to do detective work on something fake, including putting up fake memes.
Going from there, the platforms did nothing. Pozner has struggled to get them to take the false information down, which was huge. This is the life of one person: false information about himself, and actually accurate information about where he lived, so he got doxed, etc. And then the worst thing that could ever happen to this man happened. He was continually bullied and demonized and attacked. The tragedy just added onto itself, and these platforms did nothing to stop it. It should have been a warning signal to us of what was happening there. It took them forever to get Alex Jones off the platform, pretending it was a free speech issue, which it is not; it is a false information issue. It's about a lot of things, but the companies like to wrap themselves in the First Amendment.
So here we are today, same thing, except they're better at it. Different forces are doing it, this time it's anti-vax people. You can raise all kinds of issues around your doctors. I certainly have been a pain to my doctors, but not to the level that this has happened. Not in questioning basic facts and science. Again, you can argue about masks, the efficacy of masks. You can argue about mandates. That's certainly a worthy thing for a democracy to argue about. You can argue about the importance of public health over individual responsibility, over individual freedom. That's worthy of democracy to discuss that.
But what happens is they flood the zone with all kinds of misinformation or give equal time to patently false things, and then do nothing about it. Spotify is the perfect example of this. They're trying to pretend they're just a platform, but they have a media relationship with someone, Rogan, who has put forward some speakers he didn't push back on; he just allowed them to say whatever they said. He has a very powerful platform. The issue is with the platform itself and not necessarily with Joe Rogan; many people like Joe Rogan. The continuum of these platforms allowing this to happen while we do nothing about it is a really fascinating way that we allow the technology to take over and control us.
Mike Sacopulos:
Facebook, now Meta, has a market cap of $800 to $900 billion, although that's dropped a bit recently. Facebook and other tech companies have reported that they just can't stay on top of all of the disinformation. It seems they would have the resources, or be able to deploy solutions, to stay on top of it. Is the situation that they can't, or that they just don't want to?
Kara Swisher:
Well, it's complex obviously. But the fact of the matter is they've never been more lucrative. Really? You can't pay for this? They don't want to pay for it. It's costly to do what they need to do. At one point Mark Zuckerberg, one of the founders of Facebook, said, "You can't control this. It's very hard to do. You can't moderate it as easily." And I remember saying to him, "Why did you build a platform you can't moderate? Too bad, it's your responsibility, it's on you."
They would like to have it both ways. They'd like to be able to say, "It's really difficult, and we can't do anything about it, and you can't make us do anything about it. And this is the way it's built." That's just cowardly in so many ways. I don't think they don't want to; they've made all kinds of efforts to do so. Ultimately, it's the architecture of the situation, which allows no editorial control, except when they feel like it. They do editorial all the time. That's the irony. They're like, "We're not editors." I'm like, "Well, why did you take this out? Why did you take that out?”
Zuckerberg changed his mind on Holocaust denial. One day it was fine, one day it wasn't. They are at the mercy of this guy who just changes his mind but never wants to be accurate. Right? Accuracy is not the goal at all, for some reason. There's a trial right now between The New York Times and Sarah Palin. Obviously, you know what side I'm on, but the fact of the matter is they're at trial. The Times made a mistake, they corrected it, and they're still at trial. That's a good thing.
Under the laws that we have, there is no way for anybody, including someone like Lenny Pozner, to fight back against any of this. There's the inability to sue them, no liability, free and unfettered power, and no regulation. We're at the mercy of one or two people to tell us what our society should do. I think that's a prescription for disaster.
Mike Sacopulos:
You alluded to Section 230 of the Communications Decency Act earlier, and that is where we're headed right now. Can you tell our audience a little bit about what that section says and why it's problematic?
Kara Swisher:
Yeah. It's part of a law, and it's not inherently problematic; it's a good thing too. It was passed in 1996 as part of a larger Act, much of which was declared unconstitutional. But this particular part was meant to help nascent internet platforms moderate, so they could do the things necessary to keep the place clean, essentially. Because all this third-party information was put there by others, they weren't responsible for it, and therefore they couldn't be sued. That's as opposed to a media company, which paid for and delivered the content to you. Whether it's a book publisher or a newspaper or whatever, there is liability.
It's hard to sue a newspaper, but you can still do it, for example. Anyway, everybody has liability. If you have a bumpy sidewalk, you have liability. Everybody has liability. In this case, the companies were protected from liability. And so, it allowed them to be able to moderate and to grow without the threat of legal repercussions. So, it allowed these companies and this industry to grow.
The law protects not just internet companies, by the way, but mostly it's been to the benefit of internet companies, which can't be sued for things on their platforms that they're not "responsible for." It's a very short provision. It's had some parts carved out of it around child trafficking and sex trafficking, and people call changing it a slippery slope. That's people's favorite term when they don't like changing things.
So, it seems like they may try to carve out more, maybe on health information, and some other things where people are blatantly wrong. It's difficult to see what's going to happen there, but certainly removing Section 230 entirely is a problem too, because then it opens up a floodgate of legal action. It's easier to sue the bigger players, and the lesser, smaller companies could still get hurt by the lack of Section 230.
Mike Sacopulos:
One suggestion has been to say that algorithms are not covered under Section 230, correct?
Kara Swisher:
Yes!
Mike Sacopulos:
Maybe the content that is placed by a third party onto a platform would be covered, but how it is displayed or prioritized by the platform would not come under the Section 230 exemptions. How do you feel about that?
Kara Swisher:
It's interesting. Like any law, there can be real problems as you put laws in place. One of the ones that I have more of a problem with is the one where one agency of the government, I think it was Health and Human Services, gets to decide what's science. Well, the last administration loved the bleach. So, I don't know if we want that to happen, right? Because there's a different HHS Secretary each administration, and I don't want to put it in the hands of a government agency to say so. But there are probably ways to decide it. That's an interesting way; adding liability is an interesting way, in certain instances. You can try to figure out where it goes.
The problem is this stuff goes in every direction. If it's healthcare and COVID today, back then it was child actors supposedly acting out a massacre, which was obviously a real massacre, or the Holocaust. It can go in lots and lots of different directions... election lies. What do you do about Trump putting out election lies? Well, they pulled him off the platforms, which is editing, right? They edited him off the platform, whether they like to say it or not. I think they used the concept of deplatforming, but it is editing. "We've had enough of you, you've broken our rules, and you're gone." It's more like kicking someone out of a bar, I guess, because it's a private company: "Get out of our bar. We don't like you. Or you broke our rules, or you broke whatever, you broke a table."
It's going to be difficult. But the government hasn't done anything. In Europe, they're doing a lot of things. They're in a big beef with Facebook over where servers are located, around privacy. Often in Europe, it's around privacy issues, which is where I think a lot of the US efforts should focus, because that gets at the heart of their business model, and they may change in other ways if you aim at their privacy issues. But in Europe, they want to put the servers there, and Facebook has threatened to leave, and Europe's like, "Bye, don't let the door hit you on the way out," kind of thing.
We'll see where that goes, because, around the world, there's certainly an awareness that this is an important issue, but algorithms are one way to do it. There are lots of different suggestions, lots of laws, lots of enforcement. Giving money to our enforcement agencies would be nice. I just interviewed a major government official who said, "We're too small to fight these companies." Honestly, the government? The US government's too small to fight these companies? Think about that.
Mike Sacopulos:
Is that the inverse of too big to fail, too small to regulate?
Kara Swisher:
Yeah. That was the head of the FTC, Lina Khan. She's been a critic of tech companies; they are trying to get her recused, of course, but that's why she got hired, because she's a critic of tech companies. She was correctly noting that the number of mergers has just ballooned. She's got a staff that's smaller than it was in the 1980s, I think, something like that. Money is stuck in Congress. They can't pass basic laws; everything's sucked up into these big bills. The money for enforcement, for both the FTC and the Justice Department, was in one of them, I forget which one, and it's sort of stuck there. You can only do so much when these companies have more lawyers and more PR people and more everything.
Mike Sacopulos:
Maybe it's just stating the obvious, but it seems a bit dangerous to have the enforcers publicly admit that they're unable to enforce properly. Right? That's almost a green light to go forward and do what you like.
Kara Swisher:
That's what they're doing. They've just flooded the agency with all kinds of mergers. At some point, the agency's got to pick one and make an example of it, but what if they lose? Right? So you have to be super careful, and it makes for a very cautious agency, because you don't want to blow it on one bet. That's the problem; they can't make more than one or two of those kinds of bets. I don't know. The companies can also outlast these people; things could change in the midterm elections. They take advantage of all the good things about our government, like the fact that people have to get elected again.
My issue is that all these companies are unaccountable, and you may hate your politicians, whatever side you're on. You may think politics stink, but at least politicians are elected. That's what I say. Even if there are all kinds of corruption involved, even if they are for the most part elected, they're elected officials. Nobody elected Facebook, except by using it, right? Except that's not an election. That's something else. That's a choice. It's a consumer choice.
Mike Sacopulos:
Emergency room physicians are certainly now on the front lines for the anti-vax rhetoric coming from unvaccinated patients and pediatricians have been dealing with this for many years. The anti-vax movement certainly is not new. And people somehow forget that we've had outbreaks of measles in Oregon and Orange County, California, all tied back to the anti-vax messaging. So, it appears that social media has turbocharged all of this. Would you agree with that statement?
Kara Swisher:
Oh yes, 100%. That's the thing: they've gotten turbocharged by the Internet and the ability to use these tools. These are malevolent players, and they're following a playbook, and they're getting better at it. They pop up here, they pop up there. It's hard to regulate them. Look, Facebook and others have tried hard; it's just that they waited too long. On one hand, it's great for people to protest whatever they feel like protesting; on the other, it creates the ability to put out false information.
I don't mind if people have an argument: "Here's my take and this is what it is." And I'm sure doctors hate it. I'm sure they hate it, and you can only do so much with people. But there is such a thing as the public good and public health, and we all make agreements, like not going through a stop sign or not running red lights.
Mike Sacopulos:
We've seen some specific targeting that is frightening, bordering on monstrous. Examples are the Orthodox Jewish community and the Somali community, which were targeted specifically with misinformation about vaccination. And so, we've seen incredibly low vaccination rates and correspondingly higher infection rates. Is it any wonder why physicians have difficulty gaining trust around vaccination in these communities?
Kara Swisher:
Well, I don't know. Again, issues around religion are very difficult to parse out, right? I think these would be an issue no matter what. They have more and more information to bolster what are false claims, right? There's so much false information, it's hard to fight against it. I don't know what you would do, what you could show to people. You can show them numbers and statistics, but... I have a relative to whom I showed numbers and statistics: "Well, that's your opinion." I was like, "No, these are the facts."
What do you do with "This is my belief"? I think religious people probably are, "This is my belief," and then I'm not sure what you do. You have to get the government involved, and these people have to be isolated, I guess. I don't know. That's the discussion we need to have if you're not going to join in on something that's a public health emergency. That's something our public officials have to start to talk about. Of course, that's where the real problems are. You go to Florida, you have one set of rules. You go to New York, you have another set of rules. You go to San Francisco, you have another set of rules. So, I'm not sure it's ever going to be resolved. It's just been made more powerful by internet tools for people to be able to press their particular case.
Mike Sacopulos:
Some social media platforms, like Twitter and Reddit, have started to do more to stem the flow of misinformation, particularly during the pandemic. Twitter is labeling manipulated media, and multiple platforms now have reporting options so that things can be flagged. Isn't this somewhat of a game of Whack-a-mole?
Kara Swisher:
Yes. That's exactly what it is. There's no other way to put it. It's a game of Whack-a-mole, and people get more and more sophisticated in how they manipulate these platforms. I would not want to have to deal with this, but they need to impose stronger editorial rules on this stuff, and sometimes they'll get it wrong. They're more fearful of taking down the wrong thing than they are of leaving wrong things up on the platform. That happened a couple of years ago when Facebook was taking down breast cancer information.
You can make mistakes; then you correct them. One of the biggest problems is you don't have a lot of choices. That's an issue around power and consolidation, and that's, again, a thing for our government officials to deal with. Why are these companies allowed to get so big that there are no other options? When Parler got knocked off of Amazon and Google and Apple for violating their rules, you've got to wonder why it was Parler and not Facebook, since, as it turned out, Facebook was one of the biggest purveyors of misinformation around January 6th, right? They were the tools, the purveyor of misinformation, and the gathering point. Parler didn't behave right either, but why them versus Facebook? Think about that.
I don't agree with most of these people and think they're heinous in so many ways, but at the same time, it's an excellent question: should you build more and more things so that people can have their own platforms to disseminate their information? I think concentration's a real problem.
Mike Sacopulos:
Right. And it seems like another alternative is to break up the key platforms, right?
Kara Swisher:
Yes.
Mike Sacopulos:
Not unlike what we did with Ma Bell, where we broke it up into different telephone companies.
Kara Swisher:
There are no choices. Trump is supposedly creating Truth Social. We'll see; it doesn't exist yet as far as I can tell. But there are a lot of them: Rumble and GETTR and Parler and MeWe and stuff like that. You can be on whatever political side you want, but there should be lots of choices. That's probably the best way to deal with this stuff; when there's one choice, it's going to naturally attract all kinds of attention and become a lightning rod.
For example, the deplatforming of Donald Trump. Look, most people agree he broke the rules of all the platforms he was on, constantly. Finally, they took him off at that critical moment, and it took them that long to take him off, for that reason, meaning they didn't want to be handmaidens to sedition, essentially. What was interesting about that is that I agreed with the decision, but I was disturbed that two people made it. Two unelected people made the decision to deplatform the president of the United States. That's something, if you think about it. I don't know what to do about it, but that's something. Maybe if there were 20 places, it would be a little fairer.
Mike Sacopulos:
But in general, I think you'd agree with me that the social media communities do not police themselves very well.
Kara Swisher:
No, they don't police anything very well. I use a metaphor constantly: they own the city, they collect all the rents, and they don't provide police, fire, stop signs, garbage collection, any kind of safety. They don't fill the potholes. It's like the worst-run city in America. Right? You can declaim all you want about various cities in America, but they look fantastic in comparison to these cities. And then you're on your own: "Good luck. Have a good time. Sorry if you get killed." Do you know what I mean? That kind of thing. "Sorry if you get bad information. Sorry if you get killed. Oh, well, not my responsibility." You wouldn't put up with that for 14 seconds in real life. I wouldn't.
Mike Sacopulos:
So, give me some ideas on how we can involve physicians to make them part of the conversation, or get lawmakers to make changes here, to get some degree of accountability, because it is a matter of life and death.
Kara Swisher:
It is indeed. Look, start with the premise that you're never going to get a perfect society. You're always going to have people who think, going way back, "Drink the vinegar, that'll solve cancer," whatever it happens to be. My grandmother had 90 different solutions to all kinds of illnesses. Migraines, she had something with oil and a knife under the bed. So okay, fine, whatever. But one thing is strengthening science as a trusted institution. One of the problems is that it's been put out there like science can't make mistakes. It makes mistakes, and that's very obvious to a lot of people. So how do you get people into the mode that this is a group trying to help you and trying to figure it out, and sometimes we're wrong, but we're almost always right on things?
Another is the idea that these platforms can't be the only place for this information; the government has to do a better job of disseminating information far and wide. We also have to live in a full information environment, and doctors have to realize that. Someone in the government told me the most effective way to convince people to get a vaccination is the doctor themselves: a close relationship with their patients, and taking the time to talk to your patients. I think that is the best way to do it, from what I understand from most doctors.
Another is lobbying Congress to break up some of these companies or create more innovation, so there are many more ways to disseminate information that aren't subject to just one or two companies. Google has 97% of search, right? Are they a utility? Yeah, kind of. Facebook has a huge amount of that too. Then there's, of course, public pressure. Amazon recently stopped selling a chemical that people used for suicide, after a lot of public and media pressure. That's another way to do it. Tell your stories, tell the stories of what can happen if we don't do things like this. And then talk a lot about the idea that there should be more choices for people to get their information; it shouldn't be a single place. And if it is a single place, these people have to do a better job moderating what's happening there.
Mike Sacopulos:
Do you think that people will want multiple channels? We self-segregate and pick one media channel, even though there may be many available, not unlike Sunday morning being the time of the week when we're most segregated from one another. Right? Do multiple platforms and channels matter, or are people just self-segregating?
Kara Swisher:
That's a problem; they could. It's hard, because people do self-segregate, but they've done that since the beginning of time. I don't think that's a fresh new thing. People always have, as I do, in terrible ways and just normal ways. Right? People like the golf people.
That happens, but I just don't know what can be done about it; that's human beings. Right? We certainly don't have to give them more tools to separate themselves the way they have been.
Mike Sacopulos:
You have your finger to the wind on this. Tell me, do you feel like we're making any progress or are we still right in the early days?
Kara Swisher:
One thing that's great about tech is the big leaders die eventually. AOL used to be the thing, and now I don't know what happened to that, or to Yahoo. Look at Facebook's recent earnings: little problems there, lots of problems besides investing in the Metaverse. But here's a good example: we're going into the Metaverse. Maybe we should have rules about that before we get there, instead of letting them make up the rules as we go along. As we move into deeper AI, maybe have the government involved in all kinds of stuff. Maybe we think about safety more. We pressure them into thinking about safety.
The issue is they don't have to do anything; they are not compelled to do anything. And if you're not compelled... Everyone's like, "Oh, how could they do that?" I'm like, "Well, because we created a system where they can." It's sort of like when people were arguing recently about taxes, about Elon Musk or whoever paying taxes. I'm like, "He's availing himself of the system that exists. What do you want him to do?" "Well, he should be a better person." "You know what? You're not his mama. Maybe he should. Maybe he shouldn't. I don't know." These are people availing themselves of the system as it exists. If we want a better system, we need to create a better system.
At this point, I'd love to blame the tech companies, but ultimately it comes down to us and the government. We choose to do these things. I had an interesting interview with Elizabeth Warren. She was talking about these tax things, and I was like, "Well, change the law, then." "It's hard." "Sorry, that's your job. I don't know what to say." I don't think it's the right thing to do, and lots of people don't. But I don't know. We get the results we deserve if we don't have the laws in place to produce the outcomes that we want.
Taxes are a very good way to look at that. The system taxes people with income, not people with stock. Well, change that. I assume you get more income than stock, and so, because you're a doctor, you're paying the load. Right? Well, change that, so that everybody has to pay no matter what. Rich people shouldn't be able to just buy stock, borrow against it, and then never make income and never pay tax, as if that makes sense. But pass that law, then. It's the same thing with social media and everything else. It makes sense that they should moderate better, that they should be liable. Well, then pass that law. That's what democracy's all about. And unfortunately, it's achingly slow, and in this case quite damaging. We've seen so clearly around public health how damaging it can be.
Mike Sacopulos:
I'll let that be the last word. Thank you very much. My guest on this episode of SoundPractice has been Kara Swisher. Kara, thank you so much for your time.