Artificial intelligence’s ability to mimic voices, actions and appearances is growing stronger and more convincing by the day. But the rate at which the machines are learning, and relearning, has far outpaced any legal process to contain them.
With recent, real-world examples to consider, including fake robocalls of President Joe Biden discouraging voters from casting ballots and a phony MP3 of a Baltimore County principal, state and federal legislators are working to update the law and make AI “deepfakes” an element of a range of crimes.
On Monday, in a 409-2 vote, the U.S. House of Representatives passed the Take It Down Act, one of the country’s first substantial efforts to criminalize harmful AI content. Under the law, publishing or threatening to publish intimate photos of a person without their consent, including deepfakes, will become illegal. And if a victim notifies a platform of a violation or forgery, it must be removed.
The day that bill moved to the president’s desk, however, prosecutors in Baltimore County lamented their inability to more harshly punish Dazhon Darien, an educator who used an AI-generated recording to target the reputation of an administrator at Pikesville High School.
Deputy State’s Attorney John Cox said it was “ironically fortunate” that Darien worked at a school. Otherwise, the misdemeanor he was convicted of, one generally reserved for students causing disturbances in class, couldn’t have been charged.
“I would think that almost everybody would think if it’s not a crime, it ought to be a crime,” Cox said of the deepfake audio.

Facing termination for poor performance, Darien used an online subscription service last year to manipulate and project Principal Eric Eiswert’s voice into a racist, anti-Semitic conversation. Prosecutors said the false file drew data from a secret recording Darien took in the principal’s office. Once uploaded, Eiswert’s voice became a digital puppet, ready to say anything typed into a box.
The recording quickly went viral, and by the end of the school year, Eiswert was forced out of his position. Though he later became the principal at a nearby middle school, Eiswert sued the Baltimore County Board of Education and Darien.
But the deepfake itself provided no path for the state in criminal court.
“I think the breadth of possibilities in the actions of an individual ought to make available much more harsh punishment … because, I mean, something like this can have incredibly adverse effects on the community and individuals,” Cox said.
Several bills introduced in the General Assembly this year looked to insert AI forgeries into the law, updating the code “in places where we think they’re being used,” said Democratic state Sen. Katie Fry Hester. One involving revenge porn succeeded. But another, inspired partly by what happened at Pikesville High, never made it out of committee.
House Bill 1425 sought to make identity fraud a felony charge applicable in cases like Darien’s, criminalizing the use of AI to “impersonate, falsely depict, or claim to represent another person” to defraud or harm that person or others. The proposal would also have allowed a victim to pursue an injunction in civil court to quickly halt a deepfake’s spread instead of waiting for an outcome in criminal court.
“The most important step is often to stop the distribution,” said Maryland Deputy State Prosecutor Sarah David, “and that’s what this would do.”
Speaking at a March 11 hearing before the House Judiciary Committee, Cox told the panel that the current law focuses too heavily on financial schemes and fails to protect those whose identities are stolen for the purpose of inflicting “serious emotional distress.”
“It is our position that we need to get ahead of that, especially with the development of AI,” he said.
Beyond what happened to the Baltimore County principal, advocates for the bill argued it would be useful in combating “grandparent schemes,” in which someone pretends to be a grandchild caught in an emergency to get money from a relative, and political fraud, the latter of which has prompted debate and action in legislatures across the country.
According to data compiled by Public Citizen, which says AI presents “a myriad of serious threats” to democracy, more than 20 states have enacted some form of prohibition on the use of deepfakes in elections.
This year in Maryland, Senate Bill 361 looked to amend the state’s election code to bar people from using AI to impersonate a candidate. Hester, its sponsor, said a “very good draft” was written, but lawmakers “just ran out of time” to send it to Gov. Wes Moore.
“People should have a chance to vote on a candidate based on the truth,” she said, “not spend their time deciphering, you know, is this a deepfake or is this real.”
Whether state legislators will be able to give the election bill another try may depend on legal challenges to similar laws across the country.
In late April, Elon Musk’s social media platform X became the latest party to sue Minnesota over its version of the ban, arguing it was a free speech violation that would “inevitably result” in “wide swaths” of political censorship.
The platform’s specific complaint, that Minnesota had made its law too vague, mirrored criticism from Maryland’s Public Defender’s Office of the proposed AI-related changes to the state’s criminal statutes.
“While the ease with which deepfakes can be generated by virtually anyone with access to a computer is truly alarming, this bill is too broad,” Andrew Northrup, a senior attorney in the office’s forensics division, said of the identity fraud bill. “It may have a chilling effect on free speech at a time when the risk of free speech being curtailed is greater than ever.”
Advocates for House Bill 1425 said they modeled the bill after another statute in New Jersey to ensure that it would “criminalize the intent” of the deepfake, not its creation.
“People can mock one another. They can make memes,” David said. “The internet is full of fake images, but this holds people accountable for actually intending to make people think that it’s real.”
Hester, who represents Howard and Montgomery counties, has spent the past two years trying to modernize Maryland’s code to account for AI. She sponsored the revenge porn bill approved by the governor and said she plans to refile the failed bills in 2026.
“This stuff is hurting real people,” she said.
Have a news tip? Contact Luke Parker at lparker@baltsun.com, 410-725-6214, on X as @lparkernews, or on Signal as @parkerluke.34.