In praise of algorithms and echo chambers | David S. Oderberg

Algorithms have come in for a lot of stick in the last decade or so. They have become a permanent part of the digital landscape, shaping how we interact in and with the online world. Algorithms used to belong to the arcane realms of mathematics and logic: now they curate our online lives, serving up the cat videos and scare pranks we know and love, and keeping us from all the things we swipe past in a nanosecond. Well, that’s their ostensible purpose, but we know that they are also used to deliver propaganda and sly marketing, not to mention a lot of revolting material most sensible people try to avoid.

The power of algorithms is said to produce “echo chambers”, where people have their tastes and opinions reinforced, leading to yet more of the same material being served up by the algo. This allegedly leads to a stultifying lack of exposure to new and challenging ideas. We are, so the criticism goes, ending up in “silos” — stuck in our own self-created circles of reinforced prejudice, not growing intellectually or emotionally, increasingly closed to the famous “marketplace of ideas” that is often claimed to be the precondition of societal progress.

Hence the seeming paradox: how is it that the exponentially increased connectivity the internet has brought us over the last thirty years has led to an increasingly reduced breadth of connection and to greater digital “ghettoisation”? How could such an apparently perverse outcome happen, thereby defeating the very promise of the internet itself? My suggestion is that there is no paradox. The outcome is the opposite of perverse: it is a pure expression of our natural agency and affinity, one that would have manifested itself at any point in human history had there been an internet. This natural expression just happens to be at its most visible right now. Allow me to explain.

The algorithms ubiquitously embedded in our digital lives are not the cause of echo chambers and silos but the mere occasion of them. A cause produces something that was never there in the first place. An occasion simply makes possible, facilitates, or encourages what was already there but latent or dormant, and difficult to actualise. An internet algorithm makes possible what we were already inclined to do before but could not due to limitations on connectivity — find our “in-group”, team up with those who share our tastes, our opinions, sense of humour, political allegiances, and so on. It doesn’t cause echo chambers any more than the printing press created gossip. It’s like finding your political party or sports club: you join because you share the outlook or the interest. 

Of course, you might only be exploring: maybe badminton is for you but you’ve never tried it. Fine: algorithms do not prevent you from exploring the internet. (Government and tech-monopoly clampdowns and censorship are a different problem.) But if you like what you are being served online, you will keep being served the same and similar. Your outlook and tendencies were already there before you were served anything: that is how the algorithms work. Once you lose interest, or if you never had it, the algo switches, and instead of cat videos you’ll get dog videos, or extreme bungee jumping, or whatever. I for one love cat videos and am served them endlessly, which is just fine by me. If I lose interest — as I do from time to time in one thing or another — the algo discovers this quickly (e.g. via watch time) and switches the menu. The whole idea is for you not to be bored, boredom being the least desirable outcome for online providers such as TikTok and YouTube: bored people switch off altogether.
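To see how mundane the mechanics are, here is a toy sketch in Python of the feedback loop just described: watch time is the signal, interest estimates decay so that boredom registers, and a pinch of random exploration lets the menu switch. Every name and number here is invented for illustration; no real platform’s ranking code is this simple.

```python
import random

# Toy model of the engagement loop described above. Hypothetical throughout:
# real platforms use far richer signals, but watch time as a proxy for
# interest is the core idea.

DECAY = 0.9          # old evidence fades, so new tastes can surface
LEARNING_RATE = 0.5  # how much a single video shifts the estimate

class ToyFeed:
    def __init__(self, topics):
        # Start neutral: no prior evidence about any topic.
        self.scores = {t: 1.0 for t in topics}

    def next_topic(self):
        # Serve mostly what scored well, with occasional exploration
        # so a lapsed interest can be rediscovered.
        if random.random() < 0.1:
            return random.choice(list(self.scores))
        return max(self.scores, key=self.scores.get)

    def record(self, topic, watch_fraction):
        # watch_fraction: share of the video actually watched (0.0 to 1.0).
        # Swiping past in a nanosecond drives the score down; watching to
        # the end drives it up. All topics slowly drift back toward neutral.
        for t in self.scores:
            self.scores[t] = DECAY * self.scores[t] + (1 - DECAY) * 1.0
        self.scores[topic] += LEARNING_RATE * (watch_fraction - 0.5)

feed = ToyFeed(["cats", "dogs", "bungee"])
feed.record("cats", 1.0)   # watched a cat video to the end
feed.record("dogs", 0.05)  # swiped past a dog video
print(feed.next_topic())   # usually "cats" -- until boredom sets in
```

The point of the sketch is the asymmetry the argument relies on: the algorithm infers a disposition that was already there; it does not install one.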

I am not denying that your views and tastes are shaped and reinforced by the echo chamber in which you end up unless you deliberately break out of it. But it’s just an echo — not the original sound. Instead of “echo chambers”, why don’t we call them “online affinity groups”? Sounds a lot more cuddly, right? And it is in line with the classical liberal notion, deriving from thinkers such as John Stuart Mill (as does the “marketplace of ideas” itself), that people, as essentially social creatures, seek out the company of the like-minded. We have done so throughout history; algorithms simply accelerate the discovery. Instead of the menacing term “digital ghetto”, why not speak of an “online polis”, flavoured by Aristotelian thinking? Forums, chatrooms, and channels … these are little (well, often millions-strong!) organisations of people, with rules and etiquette (or rather “netiquette”). They make a space for sharing passions and preferences — and for keeping out the riff-raff, i.e. those who lack the same opinions and enthusiasms.

Again, filtering what we are exposed to is hardly a human novelty. The author and activist Eli Pariser coined the term “filter bubble” as a virtual synonym for “echo chamber”, criticising the intellectual isolation such bubbles supposedly produce. Yet one might as well object to my friendship circle on the ground that it “isolates” me from the wide world of interesting people out there, or for that matter my bookshelves: I mean, why those books and not others? Well, I chose them, and it is not an exaggeration to say that I have, in some mysterious neurological way, created my own internal algorithm that tends to lead me to the non-fiction section on Amazon, or to the piano scores in my local Oxfam. I filter out all the rest because I choose to do so. Yet my internal algorithm has switched over the years, as I have developed new tastes and lost old ones. Funny how that works.

I happen to like a certain make of motor car. (I won’t say which, but just to emphasise: I have no conflict of interest in writing this article!) I’ve driven it for decades and plan to continue. As long as it’s not overdone, I welcome being served emails from the manufacturer and dealers about that specific model. I know there are others, and I sometimes receive messages about them, which end up in spam after the briefest of skims. It’s not as though I am ignorant of the other makes of car available. Nor am I unaware of many things online in which I have no interest. If that changes, so will my online activity, and you can wager your house that the algos will follow in lockstep.

So what are the critics getting wrong when they speak of “radicalisation” by algos and echo chambers, of a “false consensus” arising from misperceived public support for views reinforced in a silo, or indeed of “misinformation”? I propose that they are conflating different issues. There are laws to prohibit and punish conspiracies to commit crime: the post-9/11 era has given law enforcement more than enough legislative tools to deal with that. It is a problem distinct from the existence of digital affinity groups — a question not of whether they should exist, but of whether and how law enforcement should ever infiltrate them for the common good. Again, it is true that your affinity group might give you the impression that your shared opinions are more popular than they are. So what? Whose business is it if someone else has an inflated sense of how widely their views are shared? If you want to counter such delusions, you are free to contribute to a larger affinity group.

More importantly, the primary job of an algorithm is — or perhaps should be — to play to one’s own opinions, wants, tastes, and inclinations. But it can also be made to play to someone else’s opinions, wants, and so on. Whether it be tech companies, marketers, governments, intelligence agencies, multinationals, or scammers, algorithms can be used, with or without your knowledge, to feed you things you never asked for, even implicitly.

It’s a bit like going to your favourite restaurant, poring happily over the menu, only for the waiter, or the diner at the next table, or the owner, to insert some item you never asked for because they think you should have it. Or for your next-door neighbour to let themselves into your living room and take over the remote control, just as you were happily scrolling through the evening’s entertainment. “Now here’s a movie you’ll really like.” Maybe, maybe not. But they should still get lost.

When algos are diverted from their original and proper purpose, the boundary between free association and compelled listening is blurred. Is what I am being served the product of my own beliefs and desires, or of someone else’s? YouTube, for one, has a policy of labelling government-issued information, so at least you know it is not an echo to which you are listening. Which is not, of course, to say that there is no role for government, or for any other organisation or individual, in using the digital world to promote an agenda. The tech companies, however, have an obligation to keep on top of their algorithms and to make clear, wherever possible, when the user is being served something they never ordered.
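What that labelling obligation might amount to can be put in a few lines. The sketch below is purely illustrative (the field names and categories are my own invention, mirroring no real platform’s API): the rule is simply that any item whose provenance is not the user’s own behaviour is flagged before it is served.

```python
from dataclasses import dataclass

# Hypothetical sketch of the labelling obligation argued for above. Any item
# not derived from the user's own signals carries its provenance, and
# anything inserted from outside is flagged before it reaches the screen.

@dataclass
class FeedItem:
    title: str
    source: str  # e.g. "user_history", "advertiser", "government"

def present(item: FeedItem) -> str:
    if item.source == "user_history":
        # An echo of the user's own choices: no label needed.
        return item.title
    # Not an echo: say so plainly, so the user knows who is speaking.
    return f"[Inserted by {item.source}] {item.title}"

print(present(FeedItem("More cat videos", "user_history")))
print(present(FeedItem("Public information film", "government")))
```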

This is particularly pressing now that we learn, at the time of writing, that the UK broadcasting regulator Ofcom is urging platforms such as YouTube to push the “discoverability” of “public service broadcasting”, which in the UK is virtually identical to BBC content. It even suggests that legislation might be necessary. In other words, it explicitly calls on YouTube and other platforms such as TikTok to modify their algorithms so as to penetrate the “echo chambers”, or rather “voluntary affinity groups”, of users. Short of a literal national emergency, I cannot see any good case for such interference in voluntary associations and personal spaces. There are plenty of ways for governments to get their messages across without such meddling, and if they meddle nonetheless, the platforms must ensure that all such messaging is explicitly labelled. Again, I am not talking about legitimate law enforcement, such as tracking criminals; I am talking about inserting content into spaces where it was not asked for, spaces inhabited by law-abiding netizens.

Despite the risks, then, I really quite like my algos and affinity groups. They know (roughly) what interests me and they serve me the desired content with decent accuracy and regularity. They know when I’m bored, when my tastes have changed, and even whether I just might like something a little different. In these ways, algos are a bit like your mum’s cooking. If she’s good at it, what’s the problem?
