Whilst MPs might be complaining that the "dark side of the Internet" needs to be policed to stop children viewing offensive and disturbing material, the idea that a site like YouTube should actively filter its content through manual review is utterly absurd and completely impractical.
YouTube has millions of users, who upload over ten hours of video every minute of every day. To review all of that content before publication you would have to employ thousands upon thousands of staff just to watch it, and the site would cease to be viable.
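The scale of that claim can be checked with back-of-the-envelope arithmetic. The ten-hours-per-minute figure is from the article; the shift length and real-time viewing speed are my own assumptions for the sketch.

```python
# Back-of-the-envelope staffing estimate for pre-publication review.
# Figure from the article: 10 hours of video uploaded per minute.
# Assumptions (mine, not the article's): reviewers watch at 1x speed,
# work 8-hour shifts, and coverage is needed 24 hours a day.

upload_hours_per_minute = 10
minutes_uploaded_per_minute = upload_hours_per_minute * 60  # 600

# To keep up in real time, this many people must be watching at once.
simultaneous_reviewers = minutes_uploaded_per_minute  # 600

shifts_per_day = 24 // 8  # 3
minimum_staff = simultaneous_reviewers * shifts_per_day  # 1,800

print(f"{simultaneous_reviewers} reviewers watching at any moment")
print(f"at least {minimum_staff} staff across {shifts_per_day} shifts")
```

And that 1,800 is a hard floor: it ignores breaks, borderline judgement calls, appeals, and the fact that upload volume was growing, so the real headcount would be far higher.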
Given the way revenue is generated across YouTube, and for that matter its parent company Google, the notion that you could replace the current system, where people report inappropriate content and it is then removed, with active pre-screening is just silly.
The report from MPs cited a video of gang rape to make its point. Apparently this video was viewed by 600 people before it was removed from the site. That is 600 people out of some 20 million unique users per month, or a mere 0.003% of YouTube's monthly userbase.
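The percentage is easy to verify from the two figures the article gives (600 viewers, roughly 20 million unique monthly users):

```python
# Checking the article's percentage: 600 viewers out of roughly
# 20 million unique monthly users (both figures from the article).

viewers = 600
monthly_users = 20_000_000

share = viewers / monthly_users * 100
print(f"{share:.3f}% of the monthly userbase")  # prints "0.003% of the monthly userbase"
```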
Think about this for a second: someone who wants to watch a video of gang rape is going to find that video if they want to. It is also worth noting that the MPs are not calling for Usenet to be policed in this way, and the amount of truly disgusting content there is immense.
On a global network like the Internet, the only practical way to police this sort of thing is to find such content and delete it. The rate at which new content appears would grind to a halt if everything had to be reviewed first. No doubt, though, the calls for such measures will continue.
No one is denying that there is much out there that is twisted, dark, bizarre and sickening. The problem, though, is not something that can be solved by destroying companies, forcing them to employ numbers of staff that would be economic suicide. User-generated content is policed by users.