An Interview With Tech Activist and Author Cory Doctorow

CR spoke with Doctorow about online interoperability, the ACCESS Act, why Facebook posts are like shoelaces, and more

Illustration: Jason Schneider

Imagine that you could take all the connections and information that you’ve built up on Facebook, and move them seamlessly to another social media platform that you prefer. You could even continue to interact with your Facebook friends from your new perch.

Right now, you can’t do that. And as science fiction author and technology activist Cory Doctorow explains below, that tends to lock you in. Switch to another platform and you have to build your network again from scratch, and the people that you want to connect with might not be on the new platform.

Mandating that kind of interoperability is one of several reforms that a group of Big Tech critics in Congress announced on Friday. Interoperability is at the core of a bill called the ACCESS Act, which would require a handful of large social media companies to allow smaller companies to integrate with their platforms.

It can be hard to see what this would mean for consumers—so we asked Doctorow, who often writes speculative fiction about the future of technology, to help us imagine a world where people could pack up and leave a social media giant in favor of a smaller spinoff.

Doctorow and I spoke about his short story, “Inside the Clock Tower: An Interoperability Story,” which appears on CR’s Digital Lab website. The main character is a comics creator who faces nonstop abuse on Twitter. The story sketches out a path for loosening big companies’ grip on social networking. You can listen to our conversation, or read the transcript below, which has been edited for clarity and length.


Kaveh Waddell: Cory, start us off in the right direction. What does it mean for technology to be interoperable, and why would it even matter to a consumer?

Cory Doctorow: Interoperability is pretty deep in everything we use. One of the reasons that it might be confusing is that it seems kind of obvious. The person who sells you your shoes doesn’t get to tell you what shoelaces you can use—any piece of string will work there. The company that sold you your shoes might be like, “Well, if you use that old gross string from the kitchen drawer, it could break and you could trip and break your nose.” And you can just say, “I’m a big boy. I’m going to stick my own string in my shoes.” That’s interoperability.


The reason we talk about interoperability now is that a lot of companies have taken steps to make sure that you can only use their shoelaces when you buy their shoes. You see that a lot with printer ink, apps from an app store, or games for a console. Any app that is built with Apple’s toolkit will run on an Apple phone—but unless the app is sold through the App Store, it refuses to run on the phone. It’s not that it’s not technically capable of running it; it’s more like “2001: A Space Odyssey,” and your phone is saying, “I can’t let you do that, Dave.”

KW: You wrote a fictional story about a comic book author named Nur who basically has to be on Twitter for her job, but she’s getting brutally harassed by a mob of terrible people online. So far, very realistic. What happens next in your alternate universe?

CD: Let me set this up a little by talking about the online economics of why services like Twitter and Facebook and Google have gotten so big and control so much of our lives.

Economists are really interested in this thing called network effects, which is when a service gets better the more people use it. If you remember your first fax machine: if no one else had one, you probably didn’t have much fun with it. But then there came a point where, if you didn’t have a fax machine, you were left out.

And that’s where Nur is: She has to use Twitter because she’s a comics creator, and Twitter has moderation policies that are aimed at a very large number of users. But she’s in a minority with a special use case, which is that she’s subjected to a specific kind of harassment. And she’s caught on the horns of a dilemma because the network effect means that if she doesn’t use Twitter, it’s kind of career suicide, but if she does use Twitter, she has to endure a lot of, as you say, horrific abuse.

Now the thing about network effects is they’re not the whole story. There’s another economic concept that doesn’t get nearly enough play, but it’s actually, if anything, even more important: It’s called switching costs. Switching costs are what you have to give up to leave a product. So Nur could leave Twitter, but if she did, she would have to leave behind her career, because not being part of the conversation means that she’s not visible to readers and to publishers and to people who might give her jobs.

There’s nothing intrinsic about Twitter that gives it that high switching cost. We’re all familiar with services that are operated by more than one company—what we call federated services.

Say you had Verizon, and Verizon didn’t do a good job of covering your block with mobile service. Well, you could switch to T-Mobile, and the switching cost is pretty low. Your friends don’t care if you’re on Verizon or on T-Mobile. They don’t log into the T-Mobile service to speak to you. And if you’re on Apple Mail and your friend’s on Google Mail and you’ve got another friend like me who is some weirdo who runs their own mail server, you don’t have to care about any of that. You just send people an email and it arrives. That’s called a federated service.
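Email makes a good concrete illustration of federation, because every provider, big or small, speaks the same open standard, SMTP. Below is a minimal sketch of “you just send an email and it arrives”; all host names and addresses are hypothetical.

```python
# Why email is federated: the sender hands a message to their own server,
# which relays it to the recipient's server over the shared SMTP standard.
# All host names and addresses below are hypothetical.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "me@selfhosted.example"   # someone running their own server
msg["To"] = "friend@bigmail.example"    # a friend on a big provider
msg["Subject"] = "Federation in action"
msg.set_content("Different operators, same protocol -- it just arrives.")

# Your server finds bigmail.example's server and relays the message;
# neither side needs the other's permission to interoperate.
with smtplib.SMTP("smtp.selfhosted.example") as server:
    server.send_message(msg)
```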

In the story, Twitter has been forced by law to federate. It’s been forced to open up its service so that other people can continue to use Twitter without having Twitter accounts—they can go somewhere else and exchange messages with Twitter. When you lower switching costs, it means that people’s network effects are no longer so important to them that they’re willing to endure bad things in order to hang in there in a bad situation.

My grandmother was a Soviet refugee who left behind everyone and everything when she left Leningrad after the war. It was a big cost to pay. Most of her family weren’t willing to pay it—she lost touch with them for 15 years. When I moved from Canada to the United States, to the United Kingdom, and back to the United States, my switching costs were moving vans and immigration lawyers, not losing touch with my family for 15 years. So it was a lot easier for me to make that choice, to go where there were jobs for me, where there was opportunity for me, and so on.

When we lower switching costs, when Mark Zuckerberg tears down that iron curtain around Facebook so that you can leave Facebook without leaving behind your Facebook friends, then a lot of people who don’t like the way Facebook handles their data, or the way that Facebook’s algorithm works, or the way that Facebook’s content moderation works, might leave Facebook and still stay in touch with their old Facebook pals.

KW: In the story, the Federal Trade Commission had to step in to make sure that this would be possible. And that telephone example might not be the reality that we live in today if it weren’t for some amount of regulation. The AT&T monopoly was broken up into the Baby Bells. How do you think the government needs to act to make the world of your story possible?

CD: Well, I think that it’s a three-legged stool. One leg of the stool is on its way—but the other two are a ways away.

The first leg of the stool is that we should just make the companies open up their services—and we’ve got experience doing this with law. In 2012, the voters of Massachusetts went to the ballot box and voted overwhelmingly in favor of a right-to-repair initiative that forced automakers to give independent mechanics the data they needed to access diagnostic information from cars, so that they could fix them. A broader bill is pending in the New York State legislature as we speak—so we’re pretty familiar with that kind of mandated interoperability.

Cory Doctorow

Photo: Jonathan Worth

But the Massachusetts story also tells you how this can go wrong, because what the bill said was that all of the data that flowed on the wired network in the car, what’s called the CAN bus, had to be available to independent mechanics. So as soon as the bill passed, the automakers retooled so that all the useful data went on the wireless network in the car, and they said, “Well, that’s not covered by the law.” So, eight years later, voters went back to the ballot box and passed another ballot initiative in 2020, again overwhelmingly, to close that loophole.

So in addition to a mandate, you need something to do if the big companies subvert the mandate, if they sabotage the mandate. And that’s where something called competitive compatibility, or “ComCom,” comes in. ComCom is a term we came up with at the Electronic Frontier Foundation to describe things like scraping, reverse engineering, using bots and all the stuff that people have done historically to make one thing plug into another thing, even when the people who make that other thing don’t want you to.

So if you’ve ever used Mint, say, from Intuit—when Mint started out, the way that it worked was you would tell it what your banking log-in and password were, and it would have this autopilot browser that would log in to your bank pretending to be you and scrape all your finance data, copy it and then put it into the Mint website, so you could look at all the data from all of your banks in one place. That’s ComCom.
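That autopilot-browser trick is plain old screen scraping. Here is a minimal sketch of the technique, not Mint’s actual code; the bank URL, form fields, and page selectors are all invented for illustration.

```python
# Screen scraping in the Mint style: log in the way a user's browser
# would, fetch the accounts page, and pull the balances out of the HTML.
# The URL, form fields, and CSS selectors are hypothetical.
import requests
from bs4 import BeautifulSoup

BANK = "https://bank.example.com"

def fetch_balances(username: str, password: str) -> dict[str, str]:
    session = requests.Session()  # keeps the login cookie across requests
    session.post(f"{BANK}/login",
                 data={"user": username, "pass": password})
    html = session.get(f"{BANK}/accounts").text
    soup = BeautifulSoup(html, "html.parser")
    # Map each account's name to its displayed balance.
    return {
        row.select_one(".account-name").get_text(strip=True):
            row.select_one(".balance").get_text(strip=True)
        for row in soup.select("tr.account-row")
    }
```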

Now, imagine if ComCom had been available in the auto repair situation in Massachusetts. As soon as the automakers moved all the data to the wireless network, you could have just reverse engineered how the wireless network worked.

The reason they can’t do that is because as these companies have attained dominance, they have changed the laws so that reverse engineering, violating terms of service, and doing all kinds of stuff that used to be just completely normal has become an actual felony—the kind of thing you go to prison for.

So as we go after these big monopolists, we need to start thinking about how we stop them from subverting their mandates. Maybe we could pass a law that just restores competitive compatibility. We could create an interoperator’s defense, where if you get sued for patent or copyright or circumvention or violating terms of service, you just say, “Look, judge, I can show you that I did this for a legitimate purpose. I can show you that I did it to make an interoperable legitimate product.” And maybe that would let you off the hook.

And then the third leg of the stool is a privacy law, because with interoperability, you start to open up new data flows from these big data-hungry silos like Facebook and Google, or even your cars.

When the 2020 campaign for the right to repair bill of Massachusetts was running, automakers ran these incredible scare ads where they were like, “Your car gathers so much data about you that if independent mechanics can see it, they will murder you.” And there were literal videos of women being followed home and murdered using the data from their cars.

No one ever said, “Hey, if the data in the car is so sensitive that it would literally lead to your death, maybe your car shouldn’t gather so much data on you!” This shows you that the right place to decide how much data your devices should gather on you is not the corporate boardroom. It’s a democratic legislature.

So we shouldn’t rely on Facebook to decide who can and can’t plug into Facebook’s service—we should just have rules. And those rules should be enforced by the law, and not by the whim of a corporate boardroom, because Facebook protects your privacy until they don’t. All that stuff Cambridge Analytica did was stuff that Facebook let them do. It was stuff that they were allowed to do under Facebook’s rules.

And so that’s our three-legged stool. Mandated interoperability, competitive compatibility, and a privacy law: Put those three things together, and you have a way to tame Big Tech.

KW: In the absence of rules like this right now, are any companies making any meaningful moves toward opening up products and services?

CD: Well, I guess it all depends on what you mean by meaningful. There’s a lot of happy talk about this stuff—some of it is stuff that I welcome and take seriously; some of it is stuff that I take seriously but don’t welcome at all.

An example of the former would be Twitter’s Project Bluesky. Twitter envisions an app store for moderation, where you choose the house rules that suit you best and they plug into Twitter in some way, blocking the stuff the house rules you’ve chosen block and letting you see the stuff they let through.

So if your definition of harassment is a lot more hair-trigger than Twitter’s, you can find a much more anti-harassment environment. On the other hand, if you like the rough and tumble, maybe you go to a place where the definition of harassment is a lot narrower, and the platform or the protocol somehow mediates between them.
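One way to picture that app store for moderation is as interchangeable filters over the same feed. The sketch below is our own illustration, not anything Twitter has published; the rule sets are invented.

```python
# Pluggable "house rules": the platform serves the same posts to everyone,
# and each user picks the filter that decides what they see.
# The rule sets below are invented for illustration.
from typing import Callable, List

HouseRules = Callable[[str], bool]  # takes a post, returns True to show it

def strict(post: str) -> bool:
    blocklist = {"insult", "slur"}  # placeholder for a real harassment list
    return not any(word in post.lower() for word in blocklist)

def rough_and_tumble(post: str) -> bool:
    return True                     # anything goes

def timeline(posts: List[str], rules: HouseRules) -> List[str]:
    return [p for p in posts if rules(p)]

# Same feed, two experiences:
feed = ["great comic!", "you absolute insult"]
print(timeline(feed, strict))            # ['great comic!']
print(timeline(feed, rough_and_tumble))  # both posts
```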

That’s in very early stages, but they published some papers and they hired some people. And if anyone’s going to do it, it’s probably them—because even though they’re huge, they’re an order of magnitude smaller than Facebook. They’re big, but they’re not galaxy scale. So they need to find a way to differentiate themselves from Facebook, to do something that Facebook can’t even think about doing, and I think that’s where they’ve landed. And so there is some progress, but it’s a mixed bag.

KW: There’s a bill that’s just been reintroduced that would address some of the questions that we’ve been talking about: It’s called the ACCESS Act. Would it nudge things in the direction of the world in your story about Nur, and does it bring all three legs to the stool?

CD: It tries to bring two of them. As it stands, the ACCESS Act says to a certain group of companies—a handful of large communications platforms worth $600 billion or more—that they have to convene a committee that includes their competitors, privacy advocates, academics, people from the National Institute of Standards and Technology and other independent actors, as well as their own engineers. And they have to design a standard way for interoperability to take place with their competitors, and then their competitors get to access it.

It says that if you jumped ship from Facebook to Baby Facebook—a new Facebook—you could take your data with you. And when there’s data that implicates someone else, like if someone else is tagged in your photos or someone else replies to your message, they have to give consent, too. There are bad ways of doing that, like an endless stream of dialog boxes you have to click: “Yes, I agree. Yes, I agree. Yes, I agree.” And there are good ways of doing it, where you can use Facebook as a consent management platform—they already get a lot of consent from you to do a lot of stuff, and they could get some more as well. So the privacy stuff is good, and it includes some definitions of what the new platforms can’t do with your data; certain kinds of commercialization are off the table.

But it’s not perfect, because it doesn’t have what’s called a private right of action. That just means that if the rules are broken, you don’t get to decide whether to go after the company for breaking them; you have to convince the attorney general to do it. A lot of people don’t like private rights of action—oftentimes, people who think of themselves as politically conservative don’t like them, because they worry about frivolous lawsuits against big business. But if you want justice to be widely available, you can’t make it available only to the people the attorney general thinks matter.

In other respects, it’s exceptionally well thought through. It has really big penalties for failing to comply. There’s an old saying that a fine is a price: if you can make $1 billion by cheating and then only get a $50 million fine, why wouldn’t you? But it’s hard to imagine a company shrugging off the fines in this case—even a really big company. So that’s pretty cool.

But it doesn’t have a privacy law of its own—and it doesn’t rest on another privacy law. It tries to describe what privacy is on its own, and it has this thing where it says, “You can’t do bad things with data.” And then further on it says, “Within 180 days of the passage of this law, we will decide what data means.” Which, you know, is a hard problem—but at the same time, that’s a heck of a can to kick. I favor the bill and I favor its passage, but I also think that when it passes, we’re going to have to keep a pretty close eye on how they define data. We wouldn’t have to if we just did the sensible thing and had a privacy law, which we really should.

KW: Given that the companies have effective, often aggressive lobby shops around this, what kind of chances do you see for the ACCESS Act?

CD: Well, it’s this rare moment where there’s some bipartisan support. I mean, I think in both cases, both from the Democrats and the Republicans, some of the support is very principled, but a lot of it is pretty instrumental. I think there’s a lot of people calling for antitrust action on Big Tech on the Republican side who, if they just changed their moderation policies, would be like, “Nah, it’s okay.” And I think there are a lot of Dems who, if the companies would just pay their taxes, would probably be like, “Yeah, fine.”

But I also think that there’s a core on both sides who really, really, really understand the underlying issues, who understand that technological self-determination is too important to allow this kind of parochial market dominance. That we really need to approach this with a very robust, old-fashioned way of thinking about monopoly.

We need to really move back to that harmful-dominance standard that started with the trust-busting era, where we concerned ourselves not just with prices, but with all of the ways that allowing a lot of power to be gathered into a small number of unaccountable hands distorts our democracy. Not just our markets, but our very democracy.

KW: Well, Cory, thanks so much for talking all of this through. I really appreciate your time.

CD: Oh, it was my pleasure as well. It was lovely to chat with you.



Kaveh Waddell

I'm an investigative journalist at CR's Digital Lab, covering algorithmic bias, misinformation, and technology-enabled abuses of power. In the past, I've reported for Axios and The Atlantic, and as a freelancer in Beirut. Outside work, I enjoy biking and hiking in and around San Francisco, where I live, and doing the crossword while cheating as little as possible. Find me on Twitter at @kavehwaddell.