I watched some YouTube videos about white poverty in Northern England. Now I'm being suggested videos about the "white genocide"...


A few weeks ago I started watching some YouTube documentaries about poverty in white working-class communities in Northern England. I found them interesting, so I kept watching. Then I started seeing suggestions that linked this poverty to anti-establishment sentiment and the Brexit vote. Interesting. I watched more. After a while I started being suggested documentaries about the rise of the far right across Europe. Great, I find this kind of stuff interesting, so I watched some of those too.

Then something weird happened... I'm interested in documentaries, but suddenly I was being recommended videos promoting far-right ideas. Some about mass immigration and how it's tearing society apart. Some about a globalist conspiracy to exploit the white working class. Some promoting hard-right politicians like Le Pen and Putin, and calling their opponents globalist traitors.

This is a real problem. I've read before about how YouTube's algorithms tend to promote increasingly extreme content over time, but I had no idea it could happen so quickly and so unexpectedly. I'm a fully grown adult with firmly established political beliefs, so I'm not much at risk. But I can easily see how this kind of thing could lead to the radicalisation of young, impressionable viewers.

I'm not suggesting that any of this is deliberate on YouTube's part, but I'm interested to know if anyone is aware of how they're dealing with it. I guess the algorithms just promote what they think the viewer wants based on what other people have watched, which is fair enough, but does YouTube need to take on more responsibility in curating content, or does that go against the ethos of user-generated content?

via /r/technology https://ift.tt/2B23x9t
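For anyone curious what "promoting what other people have watched" can look like mechanically, here is a minimal sketch of item-based collaborative filtering, the classic "people who watched X also watched Y" approach that recommenders are commonly described as building on. To be clear, YouTube's actual system is proprietary and far more complex than this; every name, data point, and the cosine-similarity metric below is an illustrative assumption, not their implementation.

```python
# A toy illustration (not YouTube's actual system) of item-based
# collaborative filtering: "people who watched X also watched Y".
# All video names and watch data here are made up.
import numpy as np

# Rows = users, columns = videos; 1 means the user watched the video.
videos = ["poverty_doc", "brexit_doc", "far_right_doc", "extremist_vid"]
watches = np.array([
    [1, 1, 0, 0],   # user 0 watched the poverty and Brexit docs
    [1, 1, 1, 0],   # user 1 drifted one step further
    [0, 1, 1, 1],   # user 2 drifted further still
    [0, 0, 1, 1],   # user 3 watches mostly far-right content
])

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two columns of the watch matrix."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def recommend(watched_idx: int, k: int = 2) -> list[str]:
    """Return the k videos whose audiences overlap most with watched_idx."""
    sims = [
        (cosine_similarity(watches[:, watched_idx], watches[:, j]), videos[j])
        for j in range(len(videos))
        if j != watched_idx
    ]
    return [name for _, name in sorted(sims, reverse=True)[:k]]

# Watching the Brexit doc already surfaces the far-right doc, because
# overlapping audiences (users 1 and 2) link the two videos together.
print(recommend(videos.index("brexit_doc")))
```

The point of the toy example: each recommendation step only moves one "audience overlap" away from what you just watched, but chaining a few such steps can walk a viewer from mainstream documentaries towards extremist content without anyone having designed that path deliberately.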
