Stock photo by Sansoen Saengsakaorat
The Effect AI Has on Admissibility of Evidence
by Kalilah Stein, April 25, 2025, 4:53 p.m.

Developing forms of generative AI, such as voice clones and deepfakes, call into question the admissibility of recordings in court. To gauge how much AI undermines the reliability of evidence, it is important to understand how these forms of AI work and what can be done to counter them.

On August 5, 1974, President Nixon released what would become known as the Smoking Gun tape by order of the Supreme Court. The name reflects its role as a pivotal piece of evidence in the investigation of the Watergate break-in and subsequent cover-up.

The Rodney King case in 1991 introduced recorded video evidence that came to play a central role in public perception of the trial.

Fairly recently, jurors were played audio of Donald Trump and Michael Cohen, recorded from Cohen’s phone, allegedly discussing a hush money payment.

Audio recordings hold significant evidentiary value because, in theory, they are more reliable than human memory. Elie Honig, a former prosecutor, observes that “tapes are the prosecutor’s best friend,” since defense lawyers typically resort to negotiating a guilty plea if they know their client is on tape.

Recorded evidence has long been held in the highest regard, but AI-powered voice cloning and video deepfakes are becoming increasingly sophisticated, making it difficult to determine the authenticity, and therefore the admissibility, of once-concrete evidence in court.

As the many forms of AI work to solve problems, we must grapple with the fact that they also create new problems in their respective fields.

Scot Mactaggart, Chief Innovation Officer at Dagastino Electronic Services, shared his insights. His passion, he explained, lies not in working with AI for AI’s own sake, but in discovering new ways to solve problems. Along with the problems AI can solve, Mactaggart acknowledges the new issues it creates. Asked whether forms of generative AI such as voice cloning and deepfakes really are threats to the reliability of legal evidence, he affirmed, “It definitely is. There’s no question.”

“Yesterday we couldn’t assume that a video could be doctored,” he said, “today we have to be worried that a video could be doctored.”

United States v. McMillan led to standards for the admissibility of recordings, covering the capability of the recording device, preservation of the recording, identification of the speakers, chain of custody, and authenticity. When the 1977 case inspired directives on what authenticity should entail, the courts had no way of knowing the future deceptive capabilities of AI.

Generative AI has become prevalent on social media and is now used to create some commercials, so you have very likely seen AI-generated video or heard an AI-generated voice yourself, whether you knew it or not. As these fakes integrate further into society, deception becomes a greater concern.

This growing issue is already affecting courts. In Huang v. Tesla, a lawsuit against Elon Musk’s multi-billion-dollar company, a video statement by Musk was used against him. His legal team responded by arguing that “[Musk], like many public figures, is the subject of many ‘deepfake’ videos and audio recordings that purport to show him saying and doing things he never actually said or did.”

Musk is indeed a target for deepfakes. Doctored videos of him are commonly used by scammers and have contributed to billions of dollars in fraud losses.

Another instance challenges a voice recording. David Notowitz, founder of the National Center for Audio and Video Forensics, recounts a client who claimed a 911 call featuring his voice was fake. Notowitz attempted to create a similar fake call using AI and a sample of his client’s voice, and he accomplished the task quickly with readily available tools.

It does not require fame to be the subject of a deepfake or voice clone. Online videos, speeches, conference calls, phone conversations, and social media posts can all be used to gather the data needed to train a system to clone a voice. Anyone can fall victim.

So how can this daunting new level of deception be countered? First, by recognizing that AI is only a new mode of change, like other developments society has grown accustomed to. Then, by learning what AI really is and how it works.

Instinct is a condition of life: all creatures possess it. While the instincts of many living beings are confined to survival, human desire refuses to confine us to such simplicity. By virtue of our humanity, we yearn for knowledge, discovery, and fulfillment. Human passion compels us to seek more.

Though eras like the Enlightenment and the Industrial Revolution have been credited as transitional stages of societal improvement, we are, and always have been, in a constant, irrevocable state of progression and advancement.

Human intelligence can now create intelligence of its own, giving inanimate objects the faculty to think and learn. And AI is not going away; it will only continue to advance and develop, driven by human habit.

“Right now, people talk about being an AI company. There was a time after the iPhone App Store launch where people talked about being a mobile company. But no software company says they’re a mobile company now because it’d be unthinkable to not have a mobile app. And it’ll be unthinkable not to have intelligence integrated into every product and service. It’ll just be an expected, obvious thing,” says Sam Altman, co-founder and CEO of OpenAI.

AI is going to affect all aspects of life. So, as AI adapts, so too must everyone else. To do this, staying informed is key.

Notowitz urges attorneys to stay on top of AI news and be aware of all the possible ways it can be used in a case. “We live in fast-changing times,” he says, “when the legal field rarely keeps up with technology. It’s our job to know about the advancements in AI and other technology.”

Artificial intelligence is an umbrella term encompassing many different branches. Our own brains likewise have many branches of function: the brain draws conclusions from the information it is fed and produces our voices, for instance. AI works the same way. Predictive AI draws conclusions from past events to forecast future ones, while generative AI creates new content, such as voices.

Again, like our own brains, generative AI produces sound and video using neural networks. Neural networks are trained on data sets; when cloning a voice, samples of that person’s voice make up the data set. The networks learn to identify the unique features of a person’s voice, such as pitch, tone, accent, and inflection, so that text input yields an accurate output voice.
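Pitch, the most basic of those features, can be pulled out of raw audio with only a few lines of code. The sketch below is a simplified illustration, not any vendor’s actual pipeline: it uses autocorrelation to recover the fundamental frequency of a synthetic tone standing in for a voice sample.

```python
import numpy as np

def estimate_pitch(signal, sample_rate):
    """Estimate the fundamental frequency (pitch) via autocorrelation."""
    signal = signal - signal.mean()
    corr = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    diffs = np.diff(corr)
    first_rise = np.where(diffs > 0)[0][0]            # skip the zero-lag peak
    best_lag = first_rise + np.argmax(corr[first_rise:])  # strongest repeat
    return sample_rate / best_lag

sr = 8000                              # 8 kHz sample rate
t = np.arange(sr) / sr                 # one second of audio
tone = np.sin(2 * np.pi * 220 * t)     # a 220 Hz tone as a "voice" stand-in
print(estimate_pitch(tone, sr))        # close to 220
```

A real voice-cloning system learns hundreds of such features jointly rather than computing them one at a time, but the underlying idea is the same: the audio is reduced to measurable quantities a model can imitate.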

Despite AI’s similarities to the human brain, it cannot express complex human emotion or spontaneity. Human intelligence does not follow set rules and algorithms; it is shaped by human experience. It is important to remember that AI itself is not the opposition: AI has no capacity to fight on its own. Arms should instead be taken up against those who use AI for deception.

While AI assists in forensic advancements that aid the gathering of evidence, it also creates cause for concern regarding its effect on legal evidence. The ramifications depend on how humans choose to use the technology. “If we don’t use it for the right things, then I’m going to end up hating AI,” Mactaggart said, voicing his conflicted feelings.

To combat deceptive AI, blockchain-based techniques can be used to make recorded data tamper-evident, track the origins of audio and video files, and apply digital watermarks.
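The tamper-evidence piece of that idea rests on cryptographic hashing: a recording’s digest is stored when the file is created, and any later edit changes the digest. The sketch below shows the principle only; the file contents and names are hypothetical, and a real system would store the digest on a distributed ledger rather than in a variable.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest: any change to the recording changes this value."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical recording contents, for illustration only
original = b"bodycam_clip_0425 audio bytes"
ledger_entry = fingerprint(original)          # stored at recording time

presented = b"bodycam_clip_0425 audio bytes"           # the real file
doctored = b"bodycam_clip_0425 audio bytes (edited)"   # a tampered copy

print(fingerprint(presented) == ledger_entry)  # True
print(fingerprint(doctored) == ledger_entry)   # False
```

The hash proves a file is unchanged since the digest was recorded; it says nothing about whether the original capture was genuine, which is why watermarking and provenance tracking are used alongside it.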

“It’s an arms race. The bad guys come out with ‘I can use this technology to pretend to be somebody else,’ so now the good guys have to come out and say, ‘We’re gonna put watermarks on video to prove that it came off a camera,’” Mactaggart explained. “The bad guys get better tools and then the good guys get better tools and then the bad guys get better tools,” and the cycle continues indefinitely.

Mactaggart described three factors used to prove identity in the world of security: something you have, something you are, and something you know.

Your voice is something you are. Its sound is determined by bodily acoustics, and Mactaggart compares voices to fingerprints: unique to each individual. Experts can catch fraudsters by checking whether a voice matches the sound expected from the bodily acoustics implied by the height and weight on the ID of the person the speaker claims to be.

Neural networks consist of multiple layers of nodes, each with its own weight and threshold value. During training, the weights are adjusted so that the network transforms sound-wave features correctly as the voice data travels through the layers.

For complex tasks, such as voice generation, diverse node functionality is crucial for accuracy. If the nodes produce similar outputs, the model fails to capture the diverse range of sounds, inflections, and pauses present in natural human speech. This loss of variability and richness is called over-smoothing. The lack of texture in an over-smoothed voice makes its inauthenticity detectable.

Dagastino Electronic Services (DES) has worked with AI to combat spear phishing, a method used by fraudsters. Its technology learns the diction of individual people so it can detect when someone is not who they claim to be. The same idea translates to detecting voice clones and deepfakes. As a general rule of thumb, you are not going to find “smoking gun” proof of falsity; you must piece a collection of clues together.
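The diction idea can be sketched in miniature: build a word-frequency profile of a person’s known writing, then compare new messages against it. This is an assumption-laden illustration of the general technique, not DES’s product; the sample texts and similarity measure (cosine similarity over word frequencies) are invented for the example.

```python
from collections import Counter
import math

def diction_profile(text):
    """Normalized word-frequency vector for a person's writing."""
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}

def similarity(p, q):
    """Cosine similarity between two diction profiles (0 to 1)."""
    dot = sum(p[w] * q[w] for w in set(p) & set(q))
    norm = (math.sqrt(sum(v * v for v in p.values()))
            * math.sqrt(sum(v * v for v in q.values())))
    return dot / norm

known = diction_profile("per my last note kindly advise on the invoice kindly confirm")
sample = diction_profile("kindly advise on the attached invoice per my note")
impostor = diction_profile("hey send the money now urgent do it fast now")

print(similarity(known, sample) > similarity(known, impostor))  # True
```

A low similarity score is exactly the kind of single clue the rule of thumb warns about: suggestive on its own, convincing only alongside other evidence.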

The tedious work that goes into determining the authenticity of video and audio recordings raises concerns about convenience. “This is why you’re going to see some push and pull, like a tug of war between convenience and security,” Mactaggart warned. But, again, we must adapt.

“And you know, if you wake up every day and you’re committed to being a bad guy, now there’s gonna be somebody on the other side who’s committed to stopping you forever. It doesn’t change with technology.”

In short, video and audio recordings have long been valuable pieces of legal evidence, but recent AI advancements have called their reliability into question. To contend with this, everyone must stay informed and adapt. The battle is not against AI advancements themselves, but against the people who use them to deceive. Blockchain technology and other methods can help distinguish authentic recordings from doctored ones, combating deepfakes and voice cloning. The legal system is not the only area of life AI affects; every aspect of life is ever-changing, and all anyone can do is acclimate.

By Kalilah Stein in grade 11