In its Q1 2024 Adversarial Threat Report, Meta Platforms Inc., the parent company of Facebook, Instagram, and WhatsApp, revealed on Wednesday that it had deleted 63,000 accounts linked to the infamous “Yahoo Boys” scam group.
The accounts, removed over the previous two weeks, were used to share blackmail scripts and run financial sextortion schemes. According to Meta, a smaller network of about 2,500 accounts, linked to roughly 20 individuals, used fictitious identities to target mostly adult men in the US.
The company said it had identified and disabled these accounts using a combination of advanced technical signals and comprehensive investigations, work that has also strengthened its automated detection systems.
“Financial sextortion is a borderless crime, fueled in recent years by the increased activity of Yahoo Boys, loosely organised cybercriminals operating largely out of Nigeria that specialize in different types of scams,” the social media company stated.
The company added, “We’ve removed around 63,000 accounts in Nigeria attempting to target people with financial sextortion scams, including a coordinated network of around 2,500 accounts.”
“We’ve also removed a set of Facebook accounts, Pages, and groups run by Yahoo Boys—banned under our Dangerous Organizations and Individuals policy—that were attempting to organize, recruit and train new scammers.”
Meta said its investigation found that most of the scammers’ attempts were unsuccessful, although some of the fraudsters did target minors; those cases were reported to the National Center for Missing and Exploited Children (NCMEC).
To help stop these scams on other platforms, Meta said it also shared information with other technology companies through the Tech Coalition’s Lantern program.
Additionally, Facebook’s parent company reported that it had taken down a total of 7,200 assets from Nigeria, including 1,300 Facebook profiles, 200 Pages, and 5,700 groups that were offering scam-related resources.
It clarified that these resources included instructions and scripts for running scams, as well as links to photo collections used to populate fake accounts.
Since the disruption, Meta’s systems have been actively blocking these groups’ attempts to return, and the company says its detection capabilities continue to improve.
The social media giant said its actions go beyond deleting accounts: it has also been collaborating closely with law enforcement agencies, responding to legal requests, alerting authorities to imminent threats, and supporting investigations and prosecutions.
“We also fund and support NCMEC and the International Justice Mission to run Project Boost, a program that trains law enforcement agencies around the world in processing and acting on NCMEC reports.
“We’ve conducted several training sessions so far, including in Nigeria and Côte d’Ivoire, with our most recent session taking place just last month,” the social media company revealed.
To better protect its users, particularly teenagers, Meta said it has tightened its messaging policies for users under 16 (under 18 in some countries) and posts safety alerts to promote responsible behavior online.
Separately, Nigeria’s Federal Competition and Consumer Protection Commission (FCCPC) fined Meta $220 million this week for multiple violations of data protection legislation connected to WhatsApp.
The inquiry, which began in May 2021, found that Meta’s privacy practices violated consumers’ rights through discriminatory treatment and unauthorized data sharing.
Meta intends to appeal, saying it disagrees with the ruling’s conclusions and the fine imposed. The FCCPC says its mandate is to ensure that Nigerian users are treated fairly and that local laws are followed.