DSU research team takes deep dive into DeepSeek
April 30, 2025
This piece is sponsored by Dakota State University.
A research team at Dakota State University has taken a deep dive into the artificial intelligence company DeepSeek.
DeepSeek is a large language model that is reportedly faster than similar apps. It drew wide attention after its initial release, but that attention soon shifted to questions about data privacy.
“When it first came out, there was concern because it’s a product of a Chinese company,” said Will Campbell, digital forensics analyst with DSU’s digital forensics lab. Several countries have placed restrictions on the app, including Australia, Canada, the Netherlands, South Korea and Italy, as have the U.S. Navy and NASA, he added.
Campbell, originally from Aberdeen, earned his bachelor’s and master’s degrees from DSU. Dr. Arica Kulm, the director of digital forensics, earned her master’s and doctoral degrees from DSU. Undergraduate research assistant Reina Girouard is a junior cyber operations major from Wyoming.
Lab director Dr. Arica Kulm said a research project was the best way to determine the legitimacy of concern with the app, so she tasked Campbell and undergraduate research assistant Reina Girouard with the project.
Exploring new technology like this is one of the roles of the digital forensics lab.
“This project demonstrates our commitment to thought leadership in cybersecurity — examining cybercrime, open-source intelligence and emerging technology to investigate complex threats,” said Dr. Ashley Podhradsky, vice president for research & economic development. “Under Dr. Arica Kulm’s leadership, the initiative continues to elevate our role as a trusted voice in the digital landscape.”
President José-Marie Griffiths said the project came about at her request.
“One of the roles we can play is to conduct a thorough cybersecurity investigation of questionable products. Most importantly, we can do so in a controlled lab environment,” she said, adding that similar investigations have been conducted by DSU’s MadLabs in the past couple of years.
The primary objective of the DeepSeek project was to explore the data acquisition capabilities of the app’s services, understand their potential impact on user data and examine the broader implications of these effects. The team looked at what the application does, who it communicates with, and its privacy implications and policies.
“We did find legitimate concerns and similarities to applications like TikTok,” Campbell said. “We found DeepSeek had security issues and data breaches, which included user information and chat conversations, so we questioned its security.”
The full results are outlined in a 27-page blog post on the digital forensics lab webpage.
They found that a data breach did occur. Girouard pointed out that while the information was not exposed publicly, other research articles documented clear vulnerabilities, so the fact that a breach could happen was a major concern.
Other data privacy concerns included DeepSeek’s collection of information, such as location, from the devices that access it. Because DeepSeek is a Chinese company, Campbell said, that information is subject to Chinese law, which raises potential data privacy issues.
They also found examples of model bias. For instance, the model either shut down questions about Taiwan or gave answers aligned with Chinese government positions. Interestingly, this took place on the web application; when the model was downloaded and run offline, the responses were closer to what one would expect from a chatbot, Girouard said.
“Running locally, it didn’t censor the answer but was still biased to the Chinese perspective,” she said.
Responses to questions about historical events, such as the 1989 Tiananmen Square protests and massacre, were surprising. Online, the app would begin a response, then erase it and finally display a sentence saying it couldn’t talk about that incident.
“Watching that in real time was kind of creepy,” Campbell said. Running locally, the output was still shaped in some way by the model’s censorship.
Attempts to replicate results were challenging. “It wasn’t consistent,” Girouard said. “We would try a prompt again to see if we could re-create it to document it, but we couldn’t, which was really interesting.”
The team’s final advice is simple.
“Stay away from DeepSeek,” Campbell said. For those who do use this or any similar apps, “for data privacy and security, it’s definitely worth running a model like that locally,” though he cautioned that users should still expect some bias and data privacy and security concerns, even when running locally.
The blog is available on the digital forensics lab website, along with others.