Philippine government targets Facebook for online baby selling

By Newsroom
May 22, 2024 10:24 AM

The Philippine government is urging Facebook to address the illegal selling of babies and children through its platform.

This call comes amid ongoing efforts by authorities to dismantle syndicates engaged in this illicit trade. Social Welfare Secretary Rex Gatchalian criticized Facebook for its “lack of responsibility” in handling reports of human trafficking violations.

He noted that Facebook has not responded to a letter sent by the National Authority for Child Care (NACC) in 2023, which requested the removal of pages involved in the illegal activity.

Gatchalian stated that the unregulated content on Facebook allows mothers to “sell” their children, which he described as a form of child exploitation and human trafficking.

He urged the public to report instances of human trafficking to help authorities police the platform.

Philippines determined to battle baby and child trafficking

In collaboration with the Philippine National Police (PNP) Women and Children Protection Center, the NACC has been actively monitoring 20 to 40 Facebook pages suspected of engaging in baby and child trafficking. These private pages, with thousands of followers, are used to sell newborns and children under the guise of adoption.

The urgency of the situation was underscored by a recent entrapment operation in Cavite, where two individuals were arrested for attempting to sell an 8-day-old baby. The suspects have been charged under the Anti-Trafficking in Persons Act, which carries severe penalties, including life imprisonment and substantial fines.

Undersecretary Janella Estrada of the NACC emphasized that legal adoption is free of charge once the necessary documentation and screening are completed. She also called for a massive information campaign to raise awareness of legal adoption processes and the dangers of illegal child selling.

Facebook’s content moderation issues

Facebook's content moderation problems are not new. The platform has long faced criticism for unreliable automated moderation. In Canada, for example, Facebook's algorithms failed to remove advertisements for illegal substances until external reporting prompted action.

Facebook’s parent company, Meta, has stated that its policies prohibit ads promoting the sale of pharmaceutical and non-medical drugs and that it removes such content when detected.

However, the volume of content necessitates the use of machine learning algorithms, which sometimes fail to escalate reports for human review.

Gatchalian and Estrada both stressed the need for continued engagement with Facebook to regulate the use of its platform for such illegal activities.

They also called on the public to assist by reporting any suspicious activities related to human trafficking.

Last Updated: May 31, 2024 4:03 PM