An Interview With Tech Activist and Author Cory Doctorow
CR spoke with Doctorow about online interoperability, the ACCESS Act, why Facebook posts are like shoelaces, and more
Imagine that you could take all the connections and information that you’ve built up on Facebook, and move them seamlessly to another social media platform that you prefer. You could even continue to interact with your Facebook friends from your new perch.
Right now, you can’t do that. And as science fiction author and technology activist Cory Doctorow explains below, that tends to lock you in. Switch to another platform and you have to build your network again from scratch, and the people that you want to connect with might not be on the new platform.
Mandating that kind of interoperability is one of several reforms that a group of Big Tech critics in Congress announced on Friday. Interoperability is at the core of a bill called the ACCESS Act (PDF), which would require a handful of large social media companies to allow smaller companies to integrate with their platforms.
It can be hard to see what this would mean for consumers—so we asked Doctorow, who often writes speculative fiction about the future of technology, to help us imagine a world where people could pack up and leave a social media giant in favor of a smaller spinoff.
Doctorow and I spoke about his short story, “Inside the Clock Tower: An Interoperability Story,” which appears on CR’s Digital Lab website. The main character is a comics creator who faces nonstop abuse on Twitter. The story sketches out a path for loosening big companies’ grip on social networking. You can listen to our conversation, or read the transcript below, which has been edited for clarity and length.
Kaveh Waddell: Cory, start us off in the right direction. What does it mean for technology to be interoperable, and why would it even matter to a consumer?
Cory Doctorow: Interoperability is pretty deep in everything we use. One of the reasons that it might be confusing is that it seems kind of obvious. The person who sells you your shoes doesn’t get to tell you what shoelaces you can use—any piece of string will work there. The company that sold you your shoes might be like, “Well, if you use that old gross string from the kitchen drawer, it could break and you could trip and break your nose.” And you can just say, “I’m a big boy. I’m going to stick my own string in my shoes.” That’s interoperability.
CD: Well, I think that it’s a three-legged stool. One leg is within reach—but the other two are a ways away.
The first leg of the stool is that we should just make the companies open up their services—and we’ve got experience doing this with law. In 2012, the voters of Massachusetts went to the ballot box and voted overwhelmingly in favor of a right to repair bill that forced automakers to give independent mechanics access to the diagnostic data from cars, so that they could fix them. A broader bill is pending in the New York State legislature as we speak—so we’re pretty familiar with that kind of mandated interoperability.
But the Massachusetts story also tells you how this can go wrong, because what the bill said was that all of the data that flowed on the wired network in the car, what’s called the CAN bus, had to be available to independent mechanics. So as soon as the bill passed, the automakers retooled so that all the useful data went on the wireless network in the car. And they said, “Well, it’s not covered by the law.” Now, eight years later, in 2020, voters went back to the ballot box and overwhelmingly approved another ballot initiative to close that loophole.
So in addition to a mandate, you need something to do if the big companies subvert the mandate, if they sabotage the mandate. And that’s where something called competitive compatibility, or “ComCom,” comes in. ComCom is a term we came up with at the Electronic Frontier Foundation to describe things like scraping, reverse engineering, using bots and all the stuff that people have done historically to make one thing plug into another thing, even when the people who make that other thing don’t want you to.
So if you’ve ever used Mint, say, from Intuit—when Mint started out, the way that it worked was you would tell it what your banking log-in and password were, and it would have this autopilot browser that would log in to your bank pretending to be you and scrape all your finance data, copy it and then put it into the Mint website, so you could look at all the data from all of your banks in one place. That’s ComCom.
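The screen-scraping pattern Doctorow describes—fetching a page meant for human eyes and extracting structured data from it—can be sketched in a few lines. This is a minimal illustration, not Mint’s actual code: the HTML below stands in for a hypothetical bank’s transactions page, and a real aggregator would fetch it over HTTPS using the user’s credentials.

```python
# Sketch of the screen-scraping pattern: parse HTML built for humans
# and pull out structured data. SAMPLE_PAGE is a hypothetical bank
# transactions page; the class names and layout are invented.
from html.parser import HTMLParser

SAMPLE_PAGE = """
<table id="transactions">
  <tr><td class="date">2021-05-01</td><td class="amount">-42.50</td></tr>
  <tr><td class="date">2021-05-03</td><td class="amount">1200.00</td></tr>
</table>
"""

class TransactionScraper(HTMLParser):
    """Collects (date, amount) pairs from the transactions table."""
    def __init__(self):
        super().__init__()
        self.current_class = None  # class of the <td> we're inside, if any
        self.row = {}              # fields gathered for the current <tr>
        self.transactions = []     # finished (date, amount) tuples

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.current_class = dict(attrs).get("class")

    def handle_data(self, data):
        # Only keep text that sits inside a <td> we care about.
        if self.current_class in ("date", "amount"):
            self.row[self.current_class] = data.strip()

    def handle_endtag(self, tag):
        if tag == "td":
            self.current_class = None
        elif tag == "tr" and self.row:
            self.transactions.append(
                (self.row["date"], float(self.row["amount"])))
            self.row = {}

scraper = TransactionScraper()
scraper.feed(SAMPLE_PAGE)
print(scraper.transactions)
```

The fragility Doctorow alludes to is visible here: the scraper depends entirely on the page’s markup, so if the bank renames a CSS class or restructures the table, the scraper breaks—which is exactly the cat-and-mouse dynamic ComCom tools have always lived with.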
Now, imagine if ComCom had been available in the auto repair situation in Massachusetts. As soon as the automakers moved all the data to the wireless network, you could just reverse engineer how that network worked.
The reason they can’t do that is because as these companies have attained dominance, they have changed the laws so that reverse engineering, violating terms of service, and doing all kinds of stuff that used to be just completely normal has become an actual felony—the kind of thing you go to prison for.
So as we go after these big monopolists, we need to start thinking about how we stop them from subverting their mandates. Maybe we could pass a law that just restores competitive compatibility. We could create an interoperator’s defense, where if you get sued for patent or copyright or circumvention or violating terms of service, you just say, “Look, judge, I can show you that I did this for a legitimate purpose. I can show you that I did it to make an interoperable legitimate product.” And maybe that would let you off the hook.
And then the third leg of the stool is a privacy law, because with interoperability, you start to open up new data flows from these big data-hungry silos like Facebook and Google, or even your cars.
When the 2020 campaign for the Massachusetts right to repair bill was running, automakers ran these incredible scare ads that said, “Your car gathers so much data about you that if independent mechanics can see it, they will murder you.” There were literal videos of women being followed home and murdered with the data from their cars.
No one ever said, “Hey, if the data in the car is so sensitive that it would literally lead to your death, maybe your car shouldn’t gather so much data on you!” This shows you that the right place to decide how much data your devices should gather on you is not the corporate boardroom. It’s a democratic legislature.
So we shouldn’t rely on Facebook to decide who can and can’t plug into Facebook’s service—we should just have rules. And those rules should be enforced by the law, and not by the whim of a corporate boardroom, because Facebook protects your privacy until they don’t. All that stuff Cambridge Analytica did was stuff that Facebook let them do. It was stuff that they were allowed to do under Facebook’s rules.
And so that’s our three-legged stool. Mandated interoperability, competitive compatibility, and a privacy law: Put those three things together, and you have a way to tame Big Tech.
KW: In the absence of rules like this right now, are any companies making any meaningful moves toward opening up products and services?
CD: Well, I guess it all depends on what you mean by meaningful. There’s a lot of happy talk about this stuff—some of it is stuff that I welcome and take seriously; some of it is stuff that I take seriously but don’t welcome at all.
An example of the former would be Twitter’s Project Bluesky. Twitter envisions an app store for moderation, where you choose the house rules that suit you best and they plug into Twitter in some way, blocking the stuff that the house rules you’ve chosen block, and letting you see the stuff they let through.
So if your definition of harassment is a lot more hair-trigger than Twitter’s, you can find a much more anti-harassment environment. On the other hand, if you like the rough and tumble, maybe you go to a place where the definition of harassment is a lot narrower, and somehow the platform or the protocol mediates between them.
That’s in very early stages, but they published some papers and they hired some people. And if anyone’s going to do it, it’s probably them—because even though they’re huge, they’re an order of magnitude smaller than Facebook. They’re big, but they’re not galaxy scale. So they need to find a way to differentiate themselves from Facebook to do something that Facebook can’t even think about doing, and I think that’s where they’ve landed. And so there is some progress, but it’s a mixed bag.
KW: There’s a bill that’s just been reintroduced that would address some of the questions that we’ve been talking about: It’s called the ACCESS Act. Would it nudge things in the direction of the world in your story about Nur, and does it bring all three legs to the stool?
CD: It tries to bring two of them. As it stands, the ACCESS Act says to a certain group of companies—a handful of large communications platforms, companies worth $600 billion—that they have to have a committee that includes their competitors, that includes privacy advocates and academics, people from the National Institute of Standards and Technology and other independent actors, as well as their own engineers. And they have to design a standard way for interoperability to take place with their competitors and then their competitors get to access it.
It says that if you jumped ship from Facebook to Baby Facebook—a new Facebook—you can take your data with you. And when there’s data that implicates someone else, like if someone else is tagged in your photos or if someone else replies to your message, they have to give consent, too. There are bad ways of doing that, like an endless stream of dialog boxes you have to click: “Yes, I agree. Yes, I agree. Yes, I agree.” And then there are good ways of doing it, where you can use Facebook as a consent management platform. They already get a lot of consent from you to do a lot of stuff; they could get some more as well. So the privacy stuff is good, and it includes some definitions of what the new platforms can’t do with your data: certain kinds of commercialization are off the table.
But it’s not perfect, because it doesn’t have what’s called a private right of action. That just means that if the rules are broken, you don’t get to decide whether or not to go after the company for breaking the rules; you have to convince the attorney general to do it. There are a lot of people who don’t like private rights of action—oftentimes, people who think of themselves as politically conservative don’t like them, because they worry about frivolous lawsuits against big business. But if you want justice to be widely available, you can’t make it available only to people that the attorney general thinks matter.
In other respects, it’s exceptionally well-thought-through. It has really big penalties for failing to comply. There’s this old saying that a fine is a price—so if you can make $1 billion by cheating and then only get a $50 million fine, why wouldn’t you? But it’s hard to imagine a company shrugging off the fines in this case, even a really big company. So that’s pretty cool.
But it doesn’t have a privacy law of its own—and it doesn’t rest on another privacy law. It tries to describe what privacy is on its own, with a provision that essentially says, “You can’t do bad things with data.” And then further on it says, “Within 180 days of the passage of this law, we will decide what data means.” Which, you know, is a hard problem—but at the same time, that’s a heck of a can to kick. I favor the bill and I favor its passage, but I also think that when it passes, we’re going to have to keep a pretty close eye on how they define data. We wouldn’t have to, if we just did the sensible thing and had a privacy law, which we really should.
KW: Given that the companies have effective, often aggressive lobby shops around this, what kind of chances do you see for the ACCESS Act?
CD: Well, it’s this rare moment where there’s some bipartisan support. I mean, I think in both cases, both from the Democrats and the Republicans, some of the support is very principled, but a lot of it is pretty instrumental. I think there are a lot of people calling for antitrust action on Big Tech on the Republican side who, if the companies just changed their moderation policies, would be like, “Nah, it’s okay.” And I think there are a lot of Dems who, if the companies would just pay their taxes, would probably be like, “Yeah, fine.”
But I also think that there’s a core on both sides who really, really, really understand the underlying issues, who understand that technological self-determination is too important to allow this kind of parochial market dominance. That we really need to approach this with a very robust, old-fashioned way of thinking about monopoly.
We need to really move back to that harmful-dominant standard that started with the trust-busting era, where we concerned ourselves not just with prices, but with all of the ways that allowing a lot of power to be gathered into a small number of unaccountable hands distorts our democracy. Not just our markets, but our very democracy.
KW: Well, Cory, thanks so much for talking all of this through. I really appreciate your time.
CD: Oh, it was my pleasure as well. It was lovely to chat with you.