… everything falls apart.
Specifically, it appears another “Adpocalypse” is about to hit YouTube. Last time, it was ads appearing on objectionable videos, such as White Power or Daesh recruitment videos (neither of which was supposed to be monetizable in the first place). A bunch of advertisers pulled their ad buys from YouTube, which meant creators on the platform, large and small, saw their income from YouTube dry up to damn near nothing. But after much trial and error (heavy on the errors), YouTube eventually won back those advertisers by adapting its algorithms and AI to demonetize such objectionable content.
Well, this time we have child porn to blame for the advertisers fleeing in droves. Seriously.
I have to say, I can’t blame the advertisers for running away this time. I mean, who wants their brand associated with such filth? But my question is this: if YouTube could train its AI and whatnot to find objectionable content like Daesh recruitment videos, why couldn’t it train it to find child porn, which I would argue is a bit worse in the grand scheme of things? Seriously, why, YouTube?
So I expect the ads on my videos (available on Gaming Stuff) will be in short supply for the next few months, just when they were starting to produce results. Now then, “results” in this case means I was starting to make a bit of pocket change every day, as opposed to a penny or two a week. I don’t get a lot of views, you see, but enough to bring in a bit of cash here and there. Now that will dry up to nothing, I expect, probably until the Spring or later, if last Summer’s debacle is any indication.
Okay, so I don’t exactly go searching for porn on YouTube, let alone child porn, because one would expect such things (especially the latter) to be… oh, I don’t know… nonexistent there? Because you would think the first thing YouTube would try to block from its platform would be illegal content, especially something as horrific as child porn, wouldn’t you? And you would think that YouTube would take its monetization program seriously enough to block any videos involving illegal content from showing ads, even if it’s just pictures uploaded in the comments, right? Again, especially content that includes children being sexually abused?
But no. YouTube instead has spent the Summer and much of the Autumn demonetizing the videos of right-wing commentators (many of whom deserved it), firearm aficionados (many of whom did not), and gaming channels (none of which did). Meanwhile, there was apparently even worse content to be found on the platform, which should never have made it past the filters, let alone managed to get monetized!
I can hear the YT Fanboys now, though: “There are hundreds of hours of content uploaded every minute! How can YouTube be expected to find and stop every objectionable video or comment?”
Well, to that I say this: YouTube’s algorithms can find copyrighted content in an uploaded video and take that video down in minutes. Its pack of Trusted Flaggers can find objectionable content on right-wing commentators’ channels within hours. And its “machine learning” AI can demonetize videos as soon as they are uploaded, before the uploader has even had a chance to turn on monetization, let alone make said video public. So why, with all these tools available to them, have they not been able to find content that is generally relegated to the Dark Web, and should never appear on such a popular site as YouTube?
I don’t get it, but the blowback from this will be massive. I’m just glad I have a Vid.Me channel going, in case YouTube doesn’t survive this latest outrage. Because like it or not, this is an outrage. There is no excuse for this filth showing up on any respectable website, let alone YouTube. No excuse at all.