I was reminded by various things in the news recently (I'm still trying to parse that last link; seriously, Google is responding to what by doing what?) of the time last autumn when, for job-related reasons, I briefly needed to reference some news stories about Child Sexual Abuse (CSA) on Youtube for a training presentation. At the same time I also needed to access some video promotions for CSA perpetrator programmes, some internet safety and grooming public information films by CEOP (UK Police), and some new (and old) films from the NSPCC. All job-related, fully factual content.
I noticed the problem when I then embedded an (unrelated) video in a blog post and watched the content through. Among the usual run of cat videos, 80s electronica and garden walk-throughs there was a single really odd-looking recommendation. I clicked it to check it (as one does) and discovered a truly, deeply horrific fake news paedophilia/celebrity conspiracy "news" video. I made an abortive move towards reporting the content, and then froze, remembering the not guilty verdict in the Girls Aloud torture porn trial. There was a good chance this bilge counted as free speech, fiction, comedy, entertainment or all of the above.
So I clicked away again and instead went to disable the related videos on the embed link, reminded once more that with the variability of content on Youtube nowadays you really kind of need to do this on personal content as well as professional, or you'll end up exposing people to who knows what. But the worst was yet to come.
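(For anyone wanting to do the same on their own embeds: one way to do this, and this is a sketch from memory rather than a current recipe, was to add the rel parameter to the embed URL, something like https://www.youtube.com/embed/VIDEO_ID?rel=0, where VIDEO_ID stands in for the actual video and rel=0 tells the player not to surface related videos when playback ends. Youtube has since changed how that parameter behaves, so check the current embed options before relying on it.)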
I got back to Youtube and discovered that my "up next" videos had, to put it mildly, changed. My recommendations were now very one-note indeed, all variations on the same sort of video. Some politicians, some celebrities, but all of it was CSA-related, all clearly fake, all news/exposé/revealed shock-jock-style ranting, and all uniformly the sort of horrible fake news crap that I was disappointed to learn existed, and certainly didn't want served spontaneously into my browser.
I started to google for fixes. I found the methodology quite quickly, but the process itself was slow.
The fix, fact-fans, is to go and adjust your recommended videos manually, i.e. for every single video, click "Not interested". This will remove the video and, eventually, all videos like it. This takes a strong stomach, and a lot of clicking, but in the end I fully resealed my bubble against the fake-news sewage tide. Which is great for my blood pressure, but this terrible, horrible, afactual... no, counterfactual content still exists, hovering, ready to jump in the moment someone shows a change in their viewing preferences. And while I did it as part of a research run, in a well-balanced and rational state of mind, how might it have been different if I'd triggered something similar inadvertently in a moment of doubt, confusion, unhappiness or vulnerability? What might the long-term effects be if I lacked the wherewithal to hack back the tide of misinformation?
After all, as I discovered, it doesn't have to be a big change at all. Just a few clicks, and it comes rushing in: a foul tide of information pollution, as damaging to mental health as the real thing is to physical health.