Elon Musk’s social media site X is suing Minnesota over a state law that criminalizes the use of AI-generated deepfakes to influence an election — and legal experts say the case raises constitutional red flags about the statute.
X Corp. argues in its lawsuit against Minnesota Attorney General Keith Ellison that the 2023 deepfake law violates its and other social media platforms’ free speech rights under the First Amendment.
The lawsuit, filed last month in Minnesota federal court, also says that the state law “directly contravenes” a 1996 federal law known as Section 230, which protects tech giants like X from civil liability related to the content posted by their users.
First Amendment and tech policy experts told Business Insider that the Minnesota law has major constitutional issues, with some predicting that the statute will ultimately be overturned in court.
The lawsuit makes a ‘strong’ case
“I’m not generally in the business of agreeing with Elon Musk, but when the argument is a good one, the argument is a good one, and I think the argument in this lawsuit is quite strong,” said Alan Rozenshtein, a University of Minnesota law professor.
The deepfake law, Rozenshtein said, is “very likely to get struck down on both constitutional and statutory grounds.”
Rozenshtein and other legal experts pointed out that political speech is the most protected form of speech and that lying is generally protected under the First Amendment.
“The government is not free to punish speech solely because it is false for the simple reason that a critical purpose of the First Amendment is to prevent the government from picking winners and losers, or truth and falsehoods, when it comes to speech,” said Colorado attorney J. Kirk McGill of the firm Hall Estill.
A deepfake video, McGill said, is “at its core, simply a lie” that falsely attributes words or actions to someone.
David Loy, the legal director of the First Amendment Coalition nonprofit, added that it's "not the business of the government to use the force of law to punish speech on the ground of what the government thinks is true or false."
Loy said the Minnesota deepfake law has "significant First Amendment problems" and is similar to a 2024 California law that his organization opposed in the legislature and that a federal judge ultimately halted.
The Minnesota law makes it a crime for a person to knowingly disseminate a deepfake or enter into a contract or other agreement to disseminate a deepfake “made with the intent to injure a candidate or influence the result of an election” within 90 days before an election.
X’s lawsuit — which is seeking to block the law — says that under the statute, social media platforms that keep up content “presenting a close call” run the risk of criminal liability, “but there is no penalty for erring on the side of too much censorship.”
“This system will inevitably result in the censorship of wide swaths of valuable political speech and commentary,” X argues in the lawsuit.
The lawsuit highlighted how, in March 2023, an X user posted AI images depicting police arresting President Donald Trump.
“Thus, a social media company, like X Corp., could be accused of violating the statute — and potentially be subjected to criminal liability — for merely having these pictures displayed on its platform within the time periods set forth” under the law, the lawsuit says.
Eugene Volokh, a First Amendment scholar, said it’s most likely that X’s lawsuit will lead to the law being blocked for social media companies on the grounds of Section 230.
X would have a “solid defense” under Section 230 to any prosecution under the statute, said Volokh, a professor of law at the University of California, Los Angeles.
“A decision whether the statute would violate the First Amendment would therefore, I expect, have to await cases where people sue over their own rights to post such material,” Volokh said.
A Republican Minnesota legislator and a content creator have already challenged the deepfake law, but a judge denied their request for a preliminary injunction, and they’ve appealed the decision.
“While the law’s reference to banning ‘deep fakes’ might sound benign, in reality it would criminalize innocuous, election-related speech, including humor, and make social media platforms criminally liable for not censoring such speech,” X said in a recent statement. “Instead of defending democracy, this law would erode it.”
A Minnesota attorney general spokesperson told BI the office is "reviewing the lawsuit and will respond in the appropriate time and manner."
Minnesota Democratic state Sen. Erin Maye Quade, who authored the law, called the lawsuit “misguided” as she took a shot at Musk.
"Elon Musk funneled hundreds of millions of dollars into the 2024 presidential election and tried to buy a Wisconsin Supreme Court seat," Maye Quade said in a statement.
"Of course he is upset that Minnesota law prevents him from spreading deepfakes that are meant to harm candidates and influence elections," she said. "Minnesota's law is clear and precise, while this lawsuit is petty, misguided and a waste of the Attorney General Office's time and resources."