"This means that, because of cross-check, content identified as breaking Meta's rules is left up on Facebook and Instagram when it is most viral and could cause harm," the board said.
Meta also failed to determine whether the process had resulted in more accurate decisions regarding content removal, the board said.
Cross-check is flawed in "key areas," including user equality and transparency, the board concluded, recommending 32 changes to the system.
Content identified as violating Meta's rules with "high severity" in a first assessment "should be removed or hidden while further review is taking place," the board said.
"Such content should not be allowed to remain on the platform accruing views simply because the person who posted it is a business partner or celebrity."
The Oversight Board said it first learned of cross-check in 2021, while reviewing, and ultimately upholding, Facebook's decision to suspend former US President Donald Trump.
In a statement Tuesday, Meta President of Global Affairs Nick Clegg said the company has agreed to review the board's recommendations and respond within 90 days.
He said that over the past year, Meta has made improvements to the process, including widening eligibility for cross-check reviews and adding more controls over how users are added to the system.
"We built the cross-check system to prevent potential over-enforcement... and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe," such as journalistic reporting from conflict zones, Clegg said.