Repulsive Content on YouTube? We Smell a RAT
If you’ve ever placed a Post-It note over your laptop’s webcam “just in case,” this one’s for you: Digital Citizens Alliance has found hackers not just hijacking users’ computer cameras, but posting tutorials showing how anyone can accomplish this unnerving feat.
What makes this even more galling is that the videos aren’t appearing on some deep, dark corner of the web, where online bad guys have to dig to find them. No, instead you – and everyone else – can find such hacking videos on YouTube, the world’s biggest video sharing platform.
The report highlights a hacking technique called the Remote Access Trojan – or RAT – which allows an intruder to take control over many aspects of a user’s computer, from the speakers and microphone to the webcam, letting them peer into the world of anyone who hasn’t covered up their camera. Unsurprisingly, that’s most of us, but that trust is clearly misplaced, thanks in no small part to YouTube’s permissive upload parameters.
When it comes down to it, we shouldn’t be surprised to see this kind of content on YouTube, a Google-owned company. Both the video site and Google’s search results have been home to illegal and unpleasant content for years, with Google preferring a flag-and-remove system to any kind of responsible policing of the sites and content it serves up.
This year alone, YouTube has been criticized for a series of questionable content uploads, including:
- Allowing ISIS torture videos to be uploaded to the platform, and permitting ad revenue to flow to the channels’ creators,
- Targeting children with ads for gambling services and junk food,
- Providing a home for the kind of racism and misogyny that has seen social site Reddit get in all kinds of trouble.
With a track record like this, and with years of failing to remove objectionable and illegal content from its results, are we really surprised to see Google properties profiting from this kind of unpleasantness?
One Step Forward, Two Steps Back
Following the widespread media coverage of this problem, YouTube has unsurprisingly removed the offending content. That’s the step forward, if it can be called such.
Unfortunately, this is shutting the digital stable door after the hacker has bolted.
The phenomenon is now widely covered, which means it is widely known. Even if YouTube and Google manage to filter their results to remove such tutorials – something their past responses suggest is unlikely – budding hackers will now be more inclined to dig up these guides elsewhere.
The RAT is out of the bag, and YouTube had the biggest hand in opening it.
The original Digital Citizens Alliance report is entitled Selling Slaving, which highlights another objectionable part of this practice: the intermediaries that host this content, including YouTube, often allow advertising income to flow to the creators. As middlemen, they also take the lion’s share of that ad revenue, which is clearly dirty money.
As the earlier examples demonstrate, this is no isolated incident. Rather, it is a byproduct of a poorly managed content system that allows almost anything to be uploaded, viewed, and even to accrue ad revenue, before eventually being flagged for removal and taken down. As the frustratingly ineffective DMCA process shows, this barely puts a bump in the way of pirates and hackers, let alone a barrier.
As leaders in this space – and primary gateways to any kind of online content imaginable – those behind policy-making at Google and YouTube need to finally show some leadership in managing what makes it onto their platforms.
If they fail to do so after so many high-profile oversights in little more than six months, it will only lend credence to the allegations from creative rights advocates that Google is all too willing to turn a blind eye to illegal content when there’s a buck or two to be made.