The move sees the Google-owned site taking a clear stance on the gun control debate in the wake of the Parkland shooting, in which expelled student Nikolas Cruz, 19, returned to his former high school in Florida on Valentine's Day and shot dead 17 people, injuring 17 more.
That atrocity inspired surviving pupils of Marjory Stoneman Douglas High School to appeal for tighter gun ownership laws, an issue that has divided America, with many continuing to insist on their Second Amendment right to bear arms despite the high frequency with which mass shootings occur in the US.
In addition, YouTube will remove all videos selling weapon accessories, including the bump stocks for semi-automatic rifles used by Stephen Paddock in the Las Vegas massacre on 1 October last year. Paddock killed 58 people and injured a further 851 after firing into a crowd from the balcony of his hotel suite during a country music festival.
"We routinely make updates and adjustments to our enforcement guidelines across all of our policies," a YouTube spokeswoman said.
"While we’ve long prohibited the sale of firearms, we recently notified creators of updates we will be making around content promoting the sale or manufacture of firearms and their accessories."
Weapons vloggers attract large followings on YouTube, with a search for "how to build a firearm" generating 25m results, according to Bloomberg.
One vlogger has already had his account suspended this week, while another prominent channel has decamped to adult video site PornHub in protest at the revised conditions.
"We suspect it will be interpreted to block much more content than the stated goal of firearms and certain accessory sales," the National Shooting Sports Foundation said in response to the new restrictions.
"We see the real potential for the blocking of educational content that serves instructional, skill-building and even safety purposes.
"Much like Facebook, YouTube now acts as a virtual public square. The exercise of what amounts to censorship, then, can legitimately be viewed as the stifling of commercial free speech."
In the past YouTube has been reluctant to remove videos but has been forced to take a more active role in censorship of late, taking down clips endorsing terrorist causes and hate speech and banning neo-Nazi groups like Atomwaffen.
The company hired 10,000 new moderators in December to crack down on offensive clips and "fake news" being uploaded.
That team has already faced criticism, however, after it was alleged earlier this month that a number of pro-gun accounts had been removed erroneously, leading to accusations the site was attempting to purge itself of right-wing commentators on the quiet.
YouTube has meanwhile faced renewed political pressure to clamp down on inflammatory content in the UK this month, with Parliament's Home Affairs Committee labelling the site "a platform for extremism" over its failure to take down an offensive video by neo-Nazi organisation National Action.
The site recently announced plans to team up with third-party information resources like Wikipedia to debunk conspiracy theories and has stressed it is working to ensure a safe environment for users around the world.
The Independent