25 Investigates: Advocates warn legal loophole enables AI-fueled exploitation of children in Mass

BOSTON — Massachusetts is one of just five states that does not criminalize the creation or possession of AI-generated or computer-generated child sexual abuse materials.

Technology is outpacing Massachusetts state laws. As artificial intelligence becomes more sophisticated, child advocates warn the online exploitation of children is accelerating, and they say lawmakers must act quickly to close a dangerous legal loophole.

“People can easily take images of children and make them appear sexually explicit,” said Lindsay Hawthorne of the child advocacy group Enough Abuse.

“And I think a lot of people just don’t realize that this technology exists.”

“If someone is making an image like this, do they face any criminal consequences in Massachusetts?” Boston 25 News anchor Kerry Kavanaugh asked Hawthorne.

“For creating or possessing them, they don’t,” Hawthorne said.

There are penalties for sharing sexually explicit deepfakes of children and adults under the state’s so-called ‘revenge porn’ law.

But Massachusetts currently has no law that specifically criminalizes the creation or possession of child sexual abuse materials generated using artificial intelligence.

“That is shocking to me as a parent, as a person,” Kavanaugh said.

Hawthorne said Massachusetts is now far behind most of the country.

“And forty-five other states have already passed bills to close this loophole and edit their child sexual abuse material or child pornography statutes,” Hawthorne said. “So, Massachusetts needs to get on board.”

The National Center for Missing and Exploited Children says the risks are complex and widespread, especially with the rapid rise of AI-generated content.

In 2024 alone, NCMEC reported a 1,325% increase in CyberTipline reports involving generative AI technology.

Lawmakers say there is a solution already on the table.

State Sen. Michael Moore of Worcester has filed legislation that would expand the state’s definition of child sexual abuse material to include content created using digital tools and artificial intelligence.

“My bill would clarify that child sexual abuse material that is created in whole or in part by digital methods, including the use of artificial intelligence, would fall under the child sex abuse act,” Moore said.

The bill would expand the definition of abusive material to include any “photograph, film, video, picture or computer-generated image or picture depicting sexual conduct.”

Moore said the harm caused by AI-generated material is real and personal.

“AI has to get the image from somewhere,” Moore said. “What if it’s your child whose image is being depicted?”

Moore’s bill currently sits in the Senate Committee on Ways and Means.

A possible wrinkle is the President’s executive order prohibiting states from enacting any form of regulation dealing with artificial intelligence for a 10-year period. Moore says it is possible child sexual abuse laws could be exempt from that order. We’ll track it for you.

“These problems are so prevalent and so real, and they have easy solutions,” Hawthorne said. “So I hope that our legislature will act to pass these bills.”

This is a developing story. Check back for updates as more information becomes available.
