In our last piece we provided a background on enshittification, reviewed some proposed recommendations, found them lacking, and decided to plot our own course to a solution based on the principle of promoting interoperability.
In this piece we’ll outline the goals and tenets of our approach, dive into the individual use cases for interoperability, and begin to sketch the outlines of our solution.
Goals
Note: throughout the following sections I use the term “Clients” to refer to 3rd parties who build software and products on top of a “Platform’s” data.
Drive Innovation and Quality: Incentivize platform incumbents to continue to improve the quality of the products they offer and not rest on their laurels or built-in advantages. Our changes should give incumbents a real stake in continuing to pursue innovation and compete to optimize their products for the benefit of consumers, creators, and suppliers.
Improve the Competitive Landscape: Open a space for upstart clients to coexist and thrive alongside entrenched platform incumbents at scale. Provide concrete mechanisms that create a more diverse ecosystem with more participating service and software providers.
Improve User Experience: Consumers of these platforms should reap the benefits of improved interoperability, whether that comes in the form of innovation within a platform, cost reduction, or aggregation across platforms.
Reduce Platform Dominance in Two-sided Markets: Shearing the golden ram rather than slaughtering it, we don’t want to destroy these platforms or eliminate the benefits they provide, but we do want to make sure they are not destroying their producer ecosystems: sellers, advertisers, news organizations, software developers, content creators, and others. All these groups must remain incentivized and remunerated to ensure a healthy ecosystem.
Support Eventual Interoperability: We don’t want to eliminate endogenous platform innovation, but we do want to support eventual interoperability that allows integrations, aggregations, and workflows that were not previously possible.
Provide Insights Through Anonymized Aggregation: There are legitimate reasons in commerce, civil society, and academia to analyze aggregated, anonymized data. This data should be available to different stakeholders at an appropriate level of aggregation for their use cases, with proper techniques used to protect individual users’ privacy.
Be Pragmatic, Foster Open Societies: As always, our goal is to find solutions that are palatable and balance the concerns of existing vested interests and emergent concerns while fulfilling our ultimate goal of fostering open societies.
Use cases
Let us quickly separate out our two major use cases as they are very different beasts.
Individual Access: I, as an individual user of a service, want to control how and where I access platform data and what systems and services I allow to do so on my behalf.
Aggregator Access: As a corporate or non-profit entity, I want to provide a service or product built on aggregate data from a platform.
These two use cases, while both important to our ultimate goal, follow very different paths and have very different requirements, and we do not want to muddy the waters by tackling them together.
Core Tenet
Underlying both types of use cases is a fundamental tenet. Call it an API Mandate for Platforms. For those unfamiliar, Jeff Bezos, while head of Amazon, famously issued what came to be known as the “Bezos API Mandate”.
1. All teams will henceforth expose their data and functionality through service interfaces.
2. Teams must communicate with each other through these interfaces.
3. There will be no other form of interprocess communication allowed: no direct linking, no direct reads of another team's data store, no shared-memory model, no back-doors whatsoever. The only communication allowed is via service interface calls over the network.
4. It doesn't matter what technology they use. HTTP, Corba, Pubsub, custom protocols -- doesn't matter. Bezos doesn't care.
5. All service interfaces, without exception, must be designed from the ground up to be externalizable. That is to say, the team must plan and design to be able to expose the interface to developers in the outside world. No exceptions.
6. Anyone who doesn't do this will be fired.
7. Thank you; have a nice day!
Ha, ha! You 150-odd ex-Amazon folks here will of course realize immediately that #7 was a little joke I threw in, because Bezos most definitely does not give a shit about your day.
Jeff Bezos, paraphrased by Steve Yegge, in an absolutely fantastic rant
The end goal of this work will be to define our API Mandate for Platforms. But first, let’s dive into our use cases, starting with Individual Access.
Use Case: Individual Access
Interoperability
A quick historical aside. I came of age, and am showing it, in a rare moment of social interoperability. Trillian allowed me to aggregate all of my Instant Messaging (IM) and email experiences into a single application. It was a fantastic experience, even in that inchoate social landscape, to have conversations centralized across multiple platforms and tools and to use them all in a single application, even if it was sometimes necessary to switch back to a dedicated application for a particular task. E.g. certain interactions could only be done in the AOL Instant Messenger (AIM) interface because the functionality was not supported in the common Trillian interface.
This is an experience of interoperability worth striving to regain and one of the core functionalities we want to provide. Whether we are talking about Facebook, Instagram, Twitter, TikTok, Reddit, etc., I should be able to log in with these platforms (via OAuth) to establish my identity, or alternatively generate an API token, and then perform all the activities that the platform supports for users natively: read, post, like, direct message (DM), retrieve friends and followers, and so on.
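To make that concrete, here is a minimal sketch in TypeScript of what such a mandated, full-functionality interface might look like. Every name here (SocialPlatform, Credentials, Post, and so on) is hypothetical and invented for illustration; this is not any platform’s actual API, only the shape of the surface area the mandate would require.

```typescript
// Hypothetical sketch of the surface area a platform would be required to expose.
// None of these types correspond to a real platform API; they are illustrative only.

interface Credentials {
  // Either an OAuth access token obtained by logging in, or a user-generated API token.
  accessToken: string;
}

interface Post {
  id: string;
  author: string;
  body: string;
  mediaUrls: string[];
  createdAt: Date;
}

interface SocialPlatform {
  // Everything a user can do natively in the UI must be reachable here.
  readFeed(creds: Credentials, cursor?: string): Promise<Post[]>;
  publish(creds: Credentials, body: string, mediaUrls?: string[]): Promise<Post>;
  like(creds: Credentials, postId: string): Promise<void>;
  directMessage(creds: Credentials, recipientId: string, body: string): Promise<void>;
  listFriends(creds: Credentials): Promise<string[]>;
  listFollowers(creds: Credentials): Promise<string[]>;
  // Data portability: export everything the user has contributed, in a common format.
  exportContributions(creds: Credentials): Promise<Post[]>;
}
```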
Pretty straightforward, but with this alone we’ve already unlocked a lot of value for users and opportunities for new business. Let’s highlight the user benefits.
Feed Aggregations: I can view content from multiple feeds combined in a single container. E.g. I can doomscroll an omni-feed that incorporates all of my favorite time sucks in one place.
Publishing Automation: I can bulk publish to whatever platforms I desire. E.g. I want to publish the same material to Twitter, Bluesky, Mastodon, and Facebook, adapting it to each platform (a rough sketch of this and the feed aggregation above follows this list). Note: I know some tools already exist to do this, but they generally have serious limitations and rely on workarounds and a degree of platform goodwill rather than being enshrined as core functionality.
Interaction Automation: I can interact with content across multiple platforms. I really want to support my mother’s online presence, but I don’t want to have to Like her on Twitter, Heart her on Instagram, Laugh on Facebook, and Upvote on Reddit. Or maybe I just want to like whatever she posts, content unseen.
Contribution Retrieval and Data Portability: I can retrieve my contributions to the platform. Who I friended, content I posted, images I uploaded, Reels I made. Whatever I put into the platform, I can get out and convert to common interchange formats.
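To make the first two benefits concrete, here is a rough sketch of how a client might build an omni-feed and a bulk publisher on top of per-platform adapters implementing the hypothetical SocialPlatform interface sketched above. Again, all names are invented for illustration, not taken from any real tool.

```typescript
// Hypothetical client code built on adapters that implement the SocialPlatform,
// Credentials, and Post types from the earlier sketch. Illustrative only.

type Account = { platform: SocialPlatform; creds: Credentials };

// Feed Aggregation: merge every account's feed into one reverse-chronological stream.
async function omniFeed(accounts: Account[]): Promise<Post[]> {
  const feeds = await Promise.all(
    accounts.map(({ platform, creds }) => platform.readFeed(creds))
  );
  return feeds.flat().sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());
}

// Publishing Automation: push the same material everywhere, adapted per platform.
async function bulkPublish(
  accounts: Account[],
  body: string,
  adapt: (body: string, platform: SocialPlatform) => string = (b) => b
): Promise<Post[]> {
  return Promise.all(
    accounts.map(({ platform, creds }) => platform.publish(creds, adapt(body, platform)))
  );
}
```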
There are probably other novel and unique benefits, ones I have neglected to mention or that have not yet been developed, that requiring platforms to provide APIs would unlock.
We’ve already talked a bit about the ActivityPub protocol and how common protocols represent a floor rather than a ceiling. But it’s important to highlight how the interoperability described here is fundamentally different.
Under a common-protocol model, each social provider adheres to a shared baseline. The version of interoperability we present here (in line with the Bezos API Mandate) goes further: platforms must expose an API for all of their user-facing functionality, and clients can then stitch these disparate offerings together into the best model for a given consumer group or use case. The end result is more flexibility and functionality for all parties. Each platform can forge ahead developing its own functionality, and clients can work out the best methods of unifying and aggregating that data through the exposed APIs, which again must cover all native functionality available to users via the platform UI.
Model Substitution
Our next individual access proposition ups the ante considerably. I call this “model substitution.” In the past decade, we’ve witnessed the move from “Sorted chronologically” to “Sorted algorithmically” across social platforms. I propose we move to a future of “Sorted by the algorithm of my choosing”.
Every feed algorithm can be conceived of as a black box that takes inputs (location, likes, follows, followers) and returns outputs (posts, ads). There is no requirement or ask here that platforms reveal their secret sauce; how Facebook chooses to order your feed to optimize attention and ad spend is its choice. But consumers should have the choice to opt for a different model and plug the inputs available to the platform into that model.
A client can design, build, run, and deploy its own model that takes the same inputs, plus any additional inputs it has available, and provides outputs in the same format as the platform’s default feed algorithm.
The platform would bear none of the cost of running or training the client model, and the client might need to pay a modest fee for access to the data. But the platform must allow consumers to use the client model in place of its own if they choose to do so.
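A minimal sketch of what that black-box contract could look like follows, reusing the hypothetical Post type from the earlier sketch. The input and output shapes are fixed by the platform; who implements rank() is the consumer’s choice. All names are invented for illustration.

```typescript
// Hypothetical contract for a substitutable feed model. The platform fixes the
// input and output shapes; the ranking logic inside rank() is swappable.
// Reuses the Post type from the earlier sketch; all names are illustrative.

interface FeedInputs {
  location?: string;
  likes: string[];        // ids of posts the user has liked
  follows: string[];      // accounts the user follows
  followers: string[];    // accounts following the user
  candidatePosts: Post[]; // the pool the platform would normally rank
}

interface FeedModel {
  // Returns posts (and ads, if any) in the order they should be shown.
  rank(inputs: FeedInputs): Promise<Post[]>;
}

// The platform's proprietary model and a client-supplied one satisfy the same
// contract, so the platform can plug in whichever the user has selected.
class ChronologicalModel implements FeedModel {
  async rank(inputs: FeedInputs): Promise<Post[]> {
    return [...inputs.candidatePosts].sort(
      (a, b) => b.createdAt.getTime() - a.createdAt.getTime()
    );
  }
}
```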
There are lots of interesting, emergent use cases that can be derived from this proposition. Let’s start with user choice.
I can choose to provide more or different information to change my experience. I can provide multiple platform identities to allow novel aggregations of content. E.g. I could dedupe results across Threads, Twitter, and Mastodon so I only view them once; I could aggregate my feeds into a super feed of all of them; I could provide personal information to allow for more targeted ad experiences.
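As one small example, cross-platform de-duplication could be as simple as keying posts on their normalized text. A toy sketch, reusing the hypothetical Post type from earlier:

```typescript
// Toy cross-platform de-duplication: keep only the first occurrence of posts
// whose normalized body text matches (e.g. the same announcement cross-posted
// to Threads, Twitter, and Mastodon). Reuses the Post type sketched earlier.
function dedupe(posts: Post[]): Post[] {
  const seen = new Set<string>();
  return posts.filter((post) => {
    const key = post.body.toLowerCase().replace(/\s+/g, " ").trim();
    if (seen.has(key)) {
      return false;
    }
    seen.add(key);
    return true;
  });
}
```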
This just scratches the surface of the possible. And I’d be remiss if I didn’t mention AI here, which could unlock the opportunity to provide custom user instructions to an agent that aggregates and sorts your feeds based on criteria you define. There is an attainable future here where you, the user, explain what you want to see and how you want to see it, and let an agent go through all of your feeds and present them based on the preferences you provide.
There’s also an interesting UI layer that comes from aggregation. If I subscribe to some other content that is linked, say an LA Times article posted on Twitter, I could devise a richer experience for viewing that article, a benefit to me as a consumer if not to the platform, which doesn’t want me to leave. I could also integrate my favorite apps, for instance storing posts or metrics in Notion, or build dashboards to view my posts’ performance and interactions across platforms.
Next Time
In this piece we focused on the benefits to consumers of our proposed model. In Part 3, we will flip the script and work through how these changes will impact platforms and what revenue models will compete with, but not extinguish, the status quo.