Your algorithm is performative
Despite the promises, your algorithm doesn’t know you. But don’t think about it too hard, because it will break your brain.
I just saw this post on Bluesky, and Rob’s thread on the consequences of a shareable feed is very good; you should read it.
being able to share your algorithmic feed with others seems to extend the logic that "the algorithm knows the real you," but it really makes refining your algorithm something you can more effectively perform for others www.businessinsider.com/instagram-re...
— Rob Horning (@robhorning.bsky.social) 2025-04-23T14:25:08.472Z
But it also made me think about something I have been feeling for a while. The algorithm is already performative. Or to be more precise: our interactions with it are. At least if you’re aware of its existence while using it. And it’s breaking my brain.
To keep things simple, imagine an app like TikTok but without the predatory tactics or possible foreign influence, and with interests that align perfectly with yours.
Imagine the algorithm as a screaming Roomba
Every so often I scroll through my feed and see something that the algorithm usually doesn’t show me but that I find really interesting. I will always make sure to finish the video, maybe even leave a like. Great. That isn’t noteworthy; that’s how it’s supposed to work.
But I’ll do this even when I don’t want to see the video right now. Maybe it’s an upbeat explainer when I’m tired and just want cute dog videos. The intended behavior is for me to swipe. But unfortunately I’m unable to ignore the existence of the algorithm, and I’m afraid it’ll never show me this kind of explainer again. But I want it to! Just not when I’m tired.
The same goes in the other direction. Sometimes the algorithm shows me something I’m not generally interested in, but for whatever reason this one is an exception. A recent example was a video about a gun. I don’t care about guns. But this particular one fascinated me because it appeared in a video game I like. It took ages for the algorithm to realize that this was a fluke and that I wasn’t engaging with any other gun content.
Which leads me to often dismiss videos that I actually want to see. I can’t have them ruin my algorithm. That’s an insane thing to have to think about. The algorithm is supposed to be something that works for me. But it’s not a sentient being that knows me; it’s like my Roomba, which forces me to make sure there aren’t socks on the floor before it starts cleaning, or it will eat them and won’t stop screaming until I come to its rescue.
The algorithm just makes you feel like it knows you
I get that we ask impossible things of the algorithm and that it’s a miracle it works as well as it does. It can’t know my mood, and how is it supposed to deal with the conflicting desires inside a human mind? »Do you want to watch this thirst trap or not?« is just the equivalent of a bakery asking you if you want this cake. How much you indulge is kind of up to you.
Thirst traps are actually the perfect example, because a: it’s funny as fuck when people complain about only being shown scantily clad women without realizing the self-own, and b: while these men deserve to be laughed at, it’s also a fact that the algorithm does share some of the blame. To make it all work, the platforms pigeonhole people, and the categories seem to be very broad. Unfortunately I’m now forced to talk about my own algorithm and how it struggles to figure me out. Wish me luck.
The basics of the algorithm are pretty self-explanatory. You like X. People who like X often like Y. So you’ll probably like Y as well. Easy. But humans contain multitudes, and if you engage with more than one type of content, it’s gonna get weird, even if it’s pretty obvious what’s happening. I watch lots of videos about leftist theory and social justice. It won’t be a surprise that they are often made by queer people. But I also watch nerd content that (while also being something many queer people like) is often from straight men. The algorithm also seems to be pretty sure that I’m attracted to women.
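In case the X-and-Y logic sounds abstract, here’s a minimal sketch of that co-occurrence idea. The data and category names are made up for illustration, and real systems obviously use far richer signals and models:

```python
from collections import defaultdict
from itertools import combinations

# Toy watch histories; the users and category names are hypothetical.
histories = {
    "user_a": {"leftist_theory", "social_justice", "queer_creators"},
    "user_b": {"leftist_theory", "nerd_content"},
    "user_c": {"nerd_content", "video_games", "gun_content"},
    "user_d": {"video_games", "nerd_content", "queer_creators"},
}

# Count how often two categories are liked by the same person:
# "people who like X often like Y".
co_likes = defaultdict(int)
for liked in histories.values():
    for x, y in combinations(sorted(liked), 2):
        co_likes[(x, y)] += 1
        co_likes[(y, x)] += 1

def recommend(my_likes, top_n=3):
    """Score unseen categories by how often they co-occur with my likes."""
    scores = defaultdict(int)
    for x in my_likes:
        for (a, b), count in co_likes.items():
            if a == x and b not in my_likes:
                scores[b] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend({"leftist_theory", "nerd_content"}))
# e.g. ['queer_creators', 'video_games', 'social_justice']
```

Even in this toy version you can see the pigeonholing: a single stray co-occurrence (one person who likes both video games and gun content) is enough to pull a whole category into someone else’s recommendations.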
The result is the most entertaining kind of confusion when it comes to content about sexuality. And I don’t mean thirst traps. Even after years on the app it can’t figure me out. It will try content meant for lesbians, sleazy incel videos, and anything in between, because the kinds of videos I watch seem to make it impossible for the algorithm to figure out this specific aspect of me.
Let me help
To be clear: it would be trivial to fix this confusion by engaging with the actually relevant content. I just have way more fun not doing that. But it also points to an unfulfilled promise. Because yes, algorithms can be very powerful and show you many relevant things. But they could be a lot better if they allowed us to help. And I don’t mean people being conscious of how everything probably works and then sending vague inputs that may or may not mean what they think they do. (Does liking really do something? If so, how much? Do I need to watch to the end, or just most of it?)
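To make that vagueness concrete: no platform publishes how it weights these signals, so a sketch like the following, with entirely invented weights, is the best anyone on the outside can do:

```python
# Entirely made-up weights -- no platform publishes these numbers.
SIGNAL_WEIGHTS = {
    "watched_to_end": 3.0,
    "watched_most": 2.0,     # does 80% count as "most"? nobody tells us
    "liked": 1.5,            # does liking really do something? if so, how much?
    "shared": 4.0,
    "swiped_away_fast": -2.0,
}

def engagement_score(events):
    """Sum the weights of whatever signals one view produced."""
    return sum(SIGNAL_WEIGHTS.get(e, 0.0) for e in events)

# Dutifully finishing a video you didn't want to see right now
# still reads as strong interest:
print(engagement_score(["watched_to_end", "liked"]))  # 4.5
print(engagement_score(["swiped_away_fast"]))         # -2.0
```

Whether the real numbers look anything like this is anyone’s guess, which is exactly the problem.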
A truly useful solution would mean handing over some of the dials. It doesn’t even have to be detailed. Even letting us choose between engaging content, comfy content, and a feed of new experiences would help. Instead we get FOMO and the fear of messing up our recommendations.
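Such a dial wouldn’t even be hard to expose. As a rough sketch, with invented mode names and ratios, it could be as simple as letting the user pick how much of the feed draws on known taste versus new experiences:

```python
import random

# Invented mode names and ratios: the share of the feed drawn from
# what the model already knows you like, versus something new.
FEED_MODES = {
    "engaging": 0.9,
    "comfy": 1.0,      # only safe bets, e.g. cute dog videos when tired
    "discover": 0.4,   # mostly new experiences
}

def next_video(mode, known_favorites, fresh_candidates):
    """Pick from known taste with the mode's probability, else explore."""
    if random.random() < FEED_MODES[mode]:
        return random.choice(known_favorites)
    return random.choice(fresh_candidates)

print(next_video("discover",
                 known_favorites=["dog_video", "nerd_explainer"],
                 fresh_candidates=["upbeat_explainer", "new_topic"]))
```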
Unfortunately, the reason we don’t get those dials can’t be explained with our hypothetical app and its perfectly aligned incentives, because that isn’t how reality works. The algorithm’s incentive is to keep us inside the app by any means necessary. And that requires more than just showing us what we want right now. So the only way to play is to perform for the algorithm. Which makes no sense. But neither does living in a system that leads to our destruction.