WASHINGTON (November 24, 2025) – New research from The George Washington University’s Trustworthy AI Institute for Law and Society (TRAILS) reveals a striking disconnect between what leading AI firms say about “responsible AI” and what they actually do.
The study, “Do AI Chatbot Firms Practice What They Preach?” by Michael Moreno, a research associate, and Susan Ariel Aaronson, a research professor at GW who co-leads TRAILS, examines four major chatbots: ChatGPT (OpenAI), Gemini (Google), DeepSeek (a Chinese AI company), and Grok (xAI).
Using a mixed-methods approach, the researchers analyzed company websites, technical documentation, and direct chatbot interactions to see if these firms truly implement the responsible AI principles they publicly promote.
Key findings from their research include the following:
- Only Google and OpenAI mention responsible AI or similar language on their websites; xAI and DeepSeek make no reference to responsible AI.
- None of the four companies address on their websites how responsible AI shapes their approach to chatbot design, development and deployment, nor do they identify who is accountable for harms caused by their chatbots.
- Across technical documentation for three companies (Google, OpenAI and DeepSeek), responsible AI concepts account for just 0.004% of total word usage.
- In testing, the chatbots gave vague or evasive answers to questions about fairness, inclusivity, and democratic values, suggesting these principles are not embedded in their training.
The authors conclude that if policymakers and the public want to see responsible behavior, they will have to regulate or incentivize it. The study, published in the most recent Proceedings of the Association for the Advancement of Artificial Intelligence, comes shortly after the House Subcommittee on Oversight and Investigations held a November 18 hearing on the safety of AI chatbots.
This study was funded in part by TRAILS, an AI research institute supported by the National Science Foundation and the National Institute of Standards and Technology.
To speak with the researchers about these findings, please contact Skyler Sales at skylers [at] gwu [dot] edu.
-GW-