Meta, the owner of the social platforms Facebook and Instagram, has been accused of running ads that directed users to online marketplaces selling drugs and other controlled substances. The accusations come as Meta faces a federal investigation in the US.
According to a Wall Street Journal (WSJ) investigation published in July, Meta continues to make money from ads that violate its own policies prohibiting the sale of illegal drugs. The WSJ found that hundreds of ads promoting illegal substances such as cocaine and opioids continue to appear on Facebook and Instagram, showing images of prescription bottles, pills, and blocks of cocaine, or carrying calls to order. Since March, US federal authorities have been investigating Meta over its role in the sale of illegal drugs.
The Tech Transparency Project (TTP), a nonprofit that investigates online platforms, reviewed Meta's ad library between March and June and found more than 450 illegal drug ads on Facebook and Instagram. Katie Paul, director of TTP, said users could buy and sell dangerous drugs, or fall victim to scams, directly on Facebook without going through the dark web. Mikayla Brown is one of the parents who believes Meta should be held responsible for her child's overdose death.
Her son, Elijah Ott, a 15-year-old California student, died in September 2023. An autopsy found large amounts of fentanyl in his system, which was determined to be the cause of death. Brown also found messages on her son's phone linked to an Instagram account that sold illegal drugs. In some cases, ads on Facebook and Instagram linked to private group chats on Meta's encrypted messaging service WhatsApp, through which users could easily buy illegal substances. US lawmakers have discussed the need to hold tech companies responsible for what third parties post on their platforms.
The Justice Department has expanded the reach of federal drug laws to hold internet platforms accountable when companies using them violate the law. At a Senate hearing in January, some parents said Meta and other social media companies should be held responsible for the deaths of their children. Meta said it uses artificial intelligence (AI) tools to moderate ads on Facebook and Instagram, but those tools have not been able to block drug ads, and the ads often redirect users to other platforms where purchases can be made.
A company spokesperson said Meta is working with law enforcement to combat this type of activity. The company's content moderation teams, however, have been stretched thin by staff cuts in recent years. Meta expressed condolences to those who have suffered the tragic consequences of drugs and acknowledged the need to work together to prevent the sale of illegal substances.
KHANH MINH
Source: https://www.sggp.org.vn/mang-xa-hoi-bi-cao-buoc-quang-cao-chat-cam-post752172.html