The web is a difficult beast to tame, and monitoring its content is near enough impossible. YouTube, for example, receives almost ten years’ worth of uploads every day.
So regardless of how much money parent organisation Google has, there’s never going to be enough to fund censorship of that amount of footage. This means the video sharing site relies on users to flag inappropriate clips and alert the correct authorities, who then remove any offending items. Hence the vast number of film and TV excerpts that aren’t viewable due to copyright infringement.
Unfortunately, though, the system doesn’t always work, as there’s no guarantee ‘right minded’, morally upstanding folk will come across anything untoward, which means there’s no guarantee the censors will be alerted. Furthermore, there has been plenty of criticism over the time it takes for objectionable material to be taken offline by the powers behind the whole Broadcast Yourself thing. The Smoking Gun blog has reported on this before (most recently here), and today more dirt has hit the proverbial fan over what can, and shouldn’t, be viewed on the internet’s favourite video platform.
According to The Guardian today, neo-Nazi groups have been benefiting from YouTube’s revenue-sharing system, which is far from good news for public perception of the brand, obviously. Companies such as Virgin Media, O2 and BT are signed up to this service, which places their ads before non-copyrighted clips on the website. This means that if you made a short film, commercials from a major corporation could appear each time someone clicks to view, ahead of the actual video, earning a little cash for the account holder hosting the content.
Needless to say it’s a tiny figure per view, but with repeat viewings over months there can be a small but noticeable return on camera-led efforts. Which is exactly what the likes of Combat 18 and Blood & Honour, two far-right organisations, realised when they began taking advantage of this model. In short, by posting original videos and signing up to host sponsored content, they have been receiving payment, albeit indirectly, from those household names referenced above (along with a few more besides).
A rather large blunder has occurred then, or, more accurately, a gaping loophole in the system has been exploited. Of course, we understand the difficulties in policing YouTube, or any other address dominated by user-generated material. But still, this situation will not bode well when it comes to convincing advertising buyers the site is a reliable, safe platform to be trusted with brand names in the future, even though everyone knows the companies that ran adverts had no say in where they were placed.
Another blow for the perceived business value of social media in the wake of Facebook’s share debacle, then? Well, possibly. But, more importantly, at a time when there’s already growing concern over what appears in cyberspace (irrespective of whether it comes down within five minutes or five days), this represents more bad tidings for anyone arguing in defence of online freedoms.