Simon David Dressler
Under supervision – algorithmic control and self-censorship

Softcover 14,00 €
In social media, it is no longer a traditional censor who decides what becomes visible. Instead, algorithmic systems determine visibility. Their logic remains a black box to users, and political content is often downgraded; as a result, creators adapt on their own initiative. This “censorship” happens in the mind: words are replaced, topics are rephrased, tones are softened. It is a self-restriction in response to opaque rules and the ever-present possibility of sanctions by the platform. This dynamic, which Simon David Dressler describes precisely, is exemplary of the digital age: algorithmic control that produces self-censorship.
Sample reading
Simon David Dressler: There’s also a curious phenomenon called “algospeak,” where certain words are replaced with others because influencers believe this helps them get around algorithmic blocks.
This happens in subtitles too. For every video, you can have subtitles generated automatically and then edit them.
And if, for example, you say something containing the word sex, anything sexual, it has become common not to write S-E-X in the subtitles but S-E-G-G-S, because people believe this will bypass the automatic content detection.
When I make videos about Gaza, for example, I’ve gotten used to saying “dodgeball” instead of “genocide,” at least in the subtitles. Not because I know it actually works, but because I suspect it won’t trigger the automatic content detection for critical or sensitive topics. Which is probably silly, since these algorithms and AI-powered content detection systems are so advanced that you probably can’t outsmart them this way.
Further interviews with Simon David Dressler: