Parents, law enforcement push for stricter AI child pornography laws in Texas

FILE - In this Dec. 12, 2016, file photo illustration, a person types on a laptop in Florida. (Wilfredo Lee / AP)

One October morning last year, Anna McAdams’ 14-year-old daughter woke up to a flurry of text messages from her friends. Someone was sharing nude pictures of her around Aledo High School, they said.

McAdams said a 15-year-old male classmate targeted her daughter and eight of her friends, grabbing photos from their Instagram pages, editing them onto naked bodies and distributing the images on Snapchat. She said he made them using an app designed specifically to create nude images.

“All she can think about is, ‘what if I go to college and the school pulls these nude photos of me up? Or what if I go for a job and they see these?’” she said. “If you were to see these images, you would think they were child pornography.”

The Texas Senate Committee on Criminal Justice heard from McAdams, law enforcement, technology experts and child welfare advocates Thursday on how to best tackle the growing use of generative artificial intelligence and deepfake technology in creating child pornography.

It’s one of several interim charges Lt. Gov. Dan Patrick assigned to senators to tackle before next year’s legislative session.

The Legislature passed at least three bills last session addressing the issue. A key law was House Bill 2700, which expanded the definition of child pornography to include visual material that uses an actual child’s image to create pornography, including content created with artificial intelligence.

Law enforcement officials told senators that despite the progress made with HB 2700 and other bills, the rapidly shifting nature of AI leaves gaps that may require tighter regulation. For instance, it can be hard to determine whether a real image of a person under 18 was fed into an AI generator to create pornographic material, said Steven Stone, a technical captain with the Texas Department of Public Safety.

“If I take a picture of someone in this room, use an AI generator to regress them into an earlier age, and then subsequently feed those images into my AI generator and produce child pornography images from those, we've now created child pornography images from images that are not of real people,” he said.

The Senate Committee on Criminal Justice met on June 6, 2024, to discuss generative artificial intelligence and "deepfakes" in the production of child pornography. (Texas Senate / Screenshot)

Right now, it’s unclear whether these crimes share common elements that new laws could broadly address.

Enacting new laws would also mean walking a constitutional tightrope. In the 2002 case Ashcroft v. Free Speech Coalition, the U.S. Supreme Court ruled 6-3 that it is unconstitutional for laws to ban all computer-generated content that appears to be, but is not actually, child pornography, so long as that content is not legally obscene.

But some witnesses, including Tarrant County Assistant District Attorney Lori Varnell, expressed skepticism of that ruling and argued it is outdated.

“That case created this idea that child pornography should only be illegal because the child in the image is being harmed, and because it creates a market to harm children and take pictures of it, it should be illegal,” Varnell said. “I'm here to suggest that that is not a sufficient protection for our children.”

Varnell recommended content-based regulation and suggested raising possession of child pornography from a third-degree to a second-degree felony for more egregious depictions of sexual abuse.

The biggest concern, she said, is the way even fake child pornography may lead to real-life abuse.

“Watching child pornography does not prevent child touching,” Varnell said.

Other experts outlined ways companies that create or use AI applications can self-regulate. Those include testing AI models for loopholes that allow sexually abusive material to slip under the radar, or using code that can easily flag child pornography so it can be taken down and reported.

Legislation pending in Wisconsin and California would punish child pornography crimes only if the depictions are derived from an image of a real child. Those provisions could help ensure any new laws pass constitutional muster, said Carl Szabo, vice president and general counsel of NetChoice, an internet free speech group.

"If it is completely fake, that unfortunately gets us into First Amendment issues," he said. "And I don't want to see bad actors get off the hook."

To cover a wide range of content, Szabo also suggested defining a deepfake under Texas law as anything that misrepresents reality or causes someone to believe it is a real image.

He and others were asked to draft proposed legislation that Committee Vice Chair Phil King said lawmakers could take up as soon as the next legislative session.

“The way we kind of work off here is we've got to get things down from 50,000 feet, down to what we can do that's actually actionable,” King said.

Got a tip? Email Toluwani Osibamowo at tosibamowo@kera.org. You can follow Toluwani on X @tosibamowo.
