Snap-owned Snapchat and its role in the fentanyl crisis were the focus of a House roundtable hosted by the Energy and Commerce Committee on Wednesday, one that could set the stage for new proposals to protect kids on the internet or to limit the liability protections for online platforms.
The roundtable featured the mother of a child who died after taking a drug containing fentanyl that was allegedly purchased over Snapchat and that the child apparently believed was a prescription painkiller. It also featured two lawyers who litigate such cases against tech companies, as well as a sheriff from Washington state who has investigated fentanyl deaths.
Witnesses testified at the hearing on Wednesday that Snap’s popular photo and texting app, known for its disappearing messages, was uniquely designed in a way that attracted drug transactions.
“Big Tech has many problems,” said Carrie Goldberg, a lawyer who works on cases seeking to hold tech platforms accountable for harms that often occur offline. “But the lethal fentanyl sales is not a general Big Tech problem. It’s a Snap-specific problem. Snap’s product is designed specifically to attract both children and illicit adult activity.”
Goldberg expressed concern about Snapchat’s disappearing messages, anonymity and real-time mapping feature, which users have to turn on before their friends can see their location.
Bloomberg reported Wednesday that the Federal Bureau of Investigation and Department of Justice are also investigating Snap’s role in fentanyl sales. DOJ declined to comment and the FBI did not immediately respond.
Lawmakers are concerned about other platforms, such as Facebook Messenger, too. “This is not just Snapchat,” Rep. Gus Bilirakis, R-Fla., said. “It’s all these social medias.” Bilirakis pointed to two examples of people buying fentanyl-laced drugs over Facebook Messenger.
A Facebook spokesperson wasn’t immediately available for comment.
The Energy and Commerce Committee, now led by Rep. Cathy McMorris Rodgers, R-Wash., votes on legislation across topics including privacy, consumer protection, content moderation and health.
McMorris Rodgers has indicated that under her stewardship, the panel will seek to significantly narrow liability protections for tech platforms, a change that advocates at the roundtable suggested should apply at least in the case of wrongful death lawsuits.
A document from last year laying out Republican priorities for the committee suggests they should “Scrap 230,” the law that shields platforms from liability for their users’ posts, and start over to create what they see as a less politically biased standard. McMorris Rodgers has also expressed interest in exploring tech impacts on kids’ health, including mental health, in the past.
A Snap spokesperson said the company is “committed to doing our part to fight the national fentanyl poisoning crisis, which includes using cutting-edge technology to help us proactively find and shut down drug dealers’ accounts.”
The company blocks search results for drug-related terms and redirects users to expert resources about fentanyl risks, the spokesperson added. The company has said it has improved its parental supervision features and the machine learning it uses to proactively catch illicit sales, and has made it harder for adults to connect with teens unless they have multiple friends in common. It also said the share of drug-related reports from users that specifically concern drug sales fell from about 23% in September 2021 to about 3% in December 2022.
“We continually expand our support for law enforcement investigations, helping them bring dealers to justice, and we work closely with experts to share patterns of dealers’ activities across platforms to more quickly identify and stop illegal behavior,” the spokesperson said. “We will continue to do everything we can to tackle this epidemic, including by working with other tech companies, public health agencies, law enforcement, families and nonprofits.”
Laura Marquez-Garrett, an attorney with the Social Media Victims Law Center, disputed some of Snap’s claims, saying that, contrary to the company’s statements, many of the kids who have died from fentanyl overdoses were not actively looking for drugs, and that the company has not sufficiently preserved data for law enforcement to use in such investigations.
Goldberg called Section 230 the “main hurdle” in holding tech companies liable for harm to their users. The law gives platforms no incentive to build safety features, she said, and it prevents many cases against tech platforms from reaching the discovery stage, which could otherwise reveal internal information.
Spokane County Sheriff John Nowels said his office invests heavily in tech expertise to help investigate fentanyl transactions, including on other encrypted services. Dealers often have profiles on other platforms too, he added, but they point customers to their Snapchat accounts from there, and their presence on those other platforms is “short-lived” once they realize the platforms are cooperating with law enforcement.
Nowels said the lack of laws governing how tech services must retain information and hand it over to law enforcement, along with end-to-end encryption that hides message content from everyone except the users in a conversation, makes it harder for investigators to trace the source of illegal drug deals. But legislation weakening encryption for law enforcement investigations would likely be at odds with the committee’s other goal of increasing digital privacy protections.