Disclosing AI use can backfire, research shows

Across 13 experiments involving more than 5,000 participants, the results were consistent: if you're honest about using AI, people tend to trust you less.
Transparency usually builds trust at work, in class and at home. But when it comes to using generative artificial intelligence, the opposite may be true. According to new research from the University of Arizona Eller College of Management, being honest about using AI can actually make people trust you less.
Martin Reimann, associate professor of marketing, and Oliver Schilke, professor of management and organizations, conducted 13 experiments involving more than 5,000 participants, and the results were consistent: Revealing AI use led to a drop in trust.
"In each experiment, we found that, when someone disclosed using AI, trust declined significantly," Schilke said. "And this held across different types of tasks and evaluator groups."
Reimann and Schilke examined several instances where people might use AI, from an instructor using the technology for grading purposes to a job applicant acknowledging that they used AI to write a cover letter. In each case, those who disclosed AI use were trusted less. For example:
- Trust from students dropped 16% when they learned a professor used AI for grading.
- Investors trusted firms 18% less when ads disclosed AI use.
- Clients placed 20% less trust in graphic designers after AI disclosure.
Reimann said they collected data on how familiar respondents were with AI and how often they used the technology.
"Even with people who were very familiar with AI and used it frequently, the erosion of trust was there," Reimann said. "You still observe the effect."
An outlier in transparency and trust
Reimann and Schilke, who center much of their research on trust, note that conventional wisdom suggests people should be transparent to build credibility. But their new research suggests what you are transparent about also matters.
"If you're transparent about something that reflects negatively on you, the trust benefit you get might be overshadowed by the penalty for what you revealed," Schilke said. "There's a trade-off."
Because some may view the use of AI as a shortcut, the researchers wanted to see if different ways of disclosing AI use could lessen the effect. They experimented with using gentler language, such as saying AI was only used for proofreading or that a human reviewed the AI's output. In all cases, trust levels still declined.
Disclosing your own AI use erodes trust, but there is a worse scenario: getting "caught" using it.
"Trust drops even further if somebody else exposes you after using an AI detector or finding out about it some other way," Schilke said. "If a third party goes in and shares that you used AI, that's the worst possible outcome as far as trust is concerned."
Implications in the workplace
As generative AI becomes more common, companies must decide how to navigate its use and disclosure, especially in fields like education, health care and finance, where trust is essential. Reimann and Schilke add that trust erosion often reaches beyond the individual level and can damage a team's cohesion or a brand's credibility.
"Organizations need to decide whether to make disclosure policies mandatory or voluntary, and prepare employees for the trust implications either way," Reimann said.
With AI use growing, the researchers recommend cultivating a workplace environment where AI use is seen as normal and legitimate.
What comes next
Artificial intelligence is evolving quickly. As the technology becomes more widespread and better understood, the cost of disclosing its use may fade over time. However, other trust issues may emerge, Reimann said. With some advanced AI platforms charging up to $200 per month, an access gap may develop between users who can afford that cost and those who have to settle for a cheaper or free version, he explained.
"I personally find this technology amazing," Reimann said. "But it's not just about what AI can do, it's about how it impacts human relationships."