Web of Lies, Web of Hate

How Much of the Internet Is Fake? Turns Out, a Lot of It, Actually (New York)
The Internet Became Less Free in 2018. Can We Fight Back? (Wired)
Lawsuit Alleges Facebook Duped Advertisers, Publishers With Inflated Video Stats (Variety)

My YouTube suggestions have gradually shifted these past few months to being nothing but longform music mixes, mostly ambient and downtempo, but with the occasional synthwave set thrown in. This is probably due to my playing said mixes in my office while I’m working; I’m still getting used to having my own office (which I never had when I was doing the adjunct thing, camping in whatever quiet corner of the faculty lounge I could make for myself). As much as I love picking and choosing individual songs and albums to play so I can really enjoy the music I’m listening to, that’s not so easy to do in my office. I can’t play music all that loudly when I’m there to start with, and I’m usually super-busy on top of that, so it’s become easier for me to just load a YouTube mix onto my tablet and leave it playing at a low volume while I take care of my business. When I get back to my apartment after work, that’s when I can indulge in great music coming from great speakers, like I’m doing now while I write this blog.

After I got my new television set up a couple of weeks ago, I wanted to test the built-in YouTube app to see how it worked, and also to gauge the quality of the television’s speakers (good but not great). I just wanted to do something quickly, though, so instead of signing into my Google account through that YouTube app right away, I did a quick search in the app and played one of my favourite videos: Marvin Gaye singing “What’s Going On” and “What’s Happening Brother” live in concert, with clips of African-American life in inner-city Chicago in the early 1970s spliced in. On top of the extraordinary music, that video is such a fascinating and moving time capsule of the era just before I was born that it still hooks me every time I watch it. (Needless to say, I highly recommend it to all of you.)

I normally have autoplay turned off when I’m on YouTube, but since I hadn’t configured the app on my television, it waited only a few seconds before launching right into the next video it thought I should watch after that Marvin Gaye video. I’m not going to dignify the video’s creator by mentioning their name, but suffice it to say that it was one of those ultra-right-wing hatemongers whose speech might have been considered fringe even by many conservatives’ standards a few years ago, but which has now been mainstreamed in this accursed political era. The video showed the creator giving a speech and, in the creator’s own words, “owning” an audience member who pointed out the creator’s disgusting prejudices. (The “own,” as is so often the case in these videos, was the speaker basically saying, “You can’t stop me from being an asshole, so I ‘win,’ neener neener,” mixed in with the raucousness of the speaker’s supporters screaming slurs and other obscenities.)

Right-wing firebrands creating their own YouTube channels is hardly a new phenomenon, but the “[speaker] owns [slur]” video format has become increasingly popular in recent months. The format’s appeal isn’t hard to figure out, especially for conservatives who have been conditioned to crave the hateful anti-intellectualism that masquerades as “discourse” on the far right. These videos are the embodiment of the phrase “hate is a powerful drug”: they’re deliberately engineered to give the creators’ followers a huge dopamine hit, probably on par with the most potent opiates on the street right now.

Those videos raise too many questions to be addressed in a short blog like this, but in light of the way I was first exposed to the particular video mentioned above, one question has to be answered: What do the themes of that video have to do with Marvin Gaye singing about the futility of war, such that YouTube would think I’d want to watch the former after seeing the latter? The answer, of course, is dick-all, and that just shows how fundamentally broken YouTube’s recommendation algorithms are, and how badly YouTube has failed to fix this glaring problem.

As James Bridle has documented over the past couple of years, the way that YouTube’s algorithms are being gamed in the world of children’s videos is nothing short of ghastly, and the problems being created by these abominations extend far beyond children handed a tablet running YouTube on autoplay by their almost-completely-absent parents. The same problems can be, and quite clearly are being, replicated in videos for adults, viewers who can do far more damage to others after watching a stream of these videos than a four-year-old can, especially when those right-wing videos have about as much connection to reality as the absurdist sketches that Bridle has brought to so many people’s attention.

I frequently have to watch right-wing videos of all stripes as part of my professional research, and I always make sure to do said watching in an incognito tab in Chrome, so as to avoid polluting my Google account’s search history with that dreck. Even when I’m watching rational right-wing discourse, though, the videos YouTube recommends for me always delve into the fantastical, from mindless conspiracy theories to fiery expressions of hate. At least there’s a little sense to that: YouTube noticed I was watching a video by conservatives, and thought that I might like to watch other videos by conservatives. (You’d think that Google, YouTube’s owner and the Grand Poobah of Internet companies, would be able to tell the difference between videos with and without at least a scintilla of redeeming quality, but since Google still can’t give me a consistent answer as to whether or not they think my apartment actually exists, maybe I’m asking for too much.)

There is no possible logical connection between a Marvin Gaye music video and a video of a right-wing hatemonger perverting the very definition of discourse. The only rational explanation for my getting that recommendation a couple of weeks ago is that far-right conservatives, or at least this particular one, have so successfully gamed YouTube’s algorithms that their videos are now being recommended to literally everyone, even to people whose YouTube history makes it clear that they aren’t interested in that sort of thing (like my main account). I haven’t seen any evidence yet that these videos are being recommended to school-age children, who are even more susceptible to being indoctrinated by these hatemongers, but it wouldn’t surprise me if I soon saw evidence of that, and I sincerely doubt that I’d be the only person to react that way.

It’s no secret that right-wingers are trying to take advantage of platforms that are easy to manipulate through the brute force of bots, “raids” and other such techniques; even if the mechanics of the social media platforms that conservatives favour weren’t enough evidence of that, some of them have even made their own YouTube videos describing their strategies and techniques. They’re very open about what they’re doing, which makes YouTube’s insufficient response to this problem all the more galling. Especially as news stories about Internet fakery become legion, every Internet company should have a deep interest in rooting out all the tools, and the people behind them, that are being used to effect these manipulations.

If these companies are only concerned about making money, though, then maybe these aren’t really problems to them. After all, if the only thing they’re interested in is advertising revenue, then whether or not their ads are being watched by actual people is kind of beside the point. There’s a school of thought that holds the advertisers themselves should be able to exert pressure on these platforms to weed out the bots and other similar machinations, but if the problems of the past few years haven’t spurred advertisers to apply sufficient pressure already, then it’s difficult to see a point at which that pressure will ever materialize.

Maybe the users of these platforms need to band together and force the platforms to change. Maybe the federal government needs to step in and force the companies behind these platforms to expend a reasonable amount of effort to eradicate these problems. Maybe we just need to chalk all this up as yet another glaring example of how American capitalism is failing the American people. Whatever happens, it can’t happen soon enough. We know what’s going on. Now we need to do something about it.
