YouTube has been accused of being either incompetent or irresponsible in its handling of a video promoting a British far-right organisation.
The clip features a speech given at a demonstration held by the banned neo-Nazi group National Action.
The chair of the Home Affairs Committee, Yvette Cooper MP, said YouTube had repeatedly promised to block it, only for it to reappear on the platform.
YouTube’s owner, Google, said it was tackling the problem.
“We do not want National Action content on YouTube, and while we recognise our systems haven’t worked 100% in this instance, we’re getting faster at removing violent extremist content by investing in machine learning technology and by hiring more people,” said a spokesman.
“We apologise for this failing and are committed to playing our part and being part of the solution.”
YouTube, Facebook and Twitter appeared before the Home Affairs Select Committee in December, when all three were accused of failing to censor National Action propaganda among other illegal hate speech material.
In a letter to the company, Ms Cooper wrote that she had flagged a video filmed in 2016 at a National Action demonstration in Darlington “at least seven times” with YouTube over the past year.
She said she had shown the footage to YouTube’s chief executive Susan Wojcicki herself, as well as alerting European public affairs chief Peter Barron and general counsel Kent Walker to the problem.
Despite individual instances of the clip being blocked as a result, Ms Cooper said that, at the time of writing, she was still able to find copies of it on four separate channels.
Moreover, she wrote, the “up next” section of three of the clips contained another white supremacist video, which was automatically played to the viewer.
“YouTube’s continued failure to deal with the same illegal extremist video is a complete disgrace – and shows the shocking lack of effort they have put into the most basic of their social and legal responsibilities,” Ms Cooper later said.
“If this was a copyright issue they would take it down immediately and automatically, and would invest in the technology to sort it out.
“We have raised this repeatedly at the most senior level so their executives cannot pretend not to know how serious this is.
“If they are too arrogant to act on illegal material when they are warned repeatedly, it’s time to bring in a system of strong fines as the committee recommended last year.”
The latest four examples presented by Ms Cooper have now been blocked.
The BBC was able to find a separate video with similar content that had been online since January, which has since been removed as well.
“I managed to see it just before it was taken down and while we don’t think it was 100% identical, we’re pretty sure it was the same voiceover with different images,” said George Perry, a press officer to the select committee.
The letter comes a week after the European Commission recommended that YouTube and other social networks be required to remove terrorist content within an hour of it being posted to their sites.
YouTube has previously said its systems can detect nearly 70% of violent extremist content within eight hours of upload, and nearly 50% within two hours.