The Chicago Sun-Times and the Philadelphia Inquirer recently published stories containing unattributable quotes from nonexistent experts and imaginary book titles generated by AI.
This content included a “Summer reading list for 2025” section that recommended made-up books such as “Tidewater Dreams” by Isabel Allende, as well as imaginary titles attributed to authors Brit Bennett, Taylor Jenkins Reid, Min Jin Lee and Rebecca Makkai.
This mishap in reporting comes at a time when the media industry is reckoning with the boom in AI. Large language models and AI chatbots are known to occasionally provide incorrect or misleading information. Newspapers and freelance journalists who use AI tools without fact-checking may inadvertently publish misinformation.
The George Washington University has experts available who can discuss how to prevent this from happening in the future. If you would like to schedule an interview, please contact Claire Sabin at claire.sabin@gwu.edu.
Neil Johnson, a professor of physics at the George Washington University, developed a mathematical formula to pinpoint the moment at which a “Jekyll-and-Hyde tipping point” occurs in AI. At the tipping point, the AI’s attention has been stretched too thin and it starts pushing out misinformation and other negative content. Johnson says the model may eventually pave the way toward solutions that help keep AI trustworthy and prevent this tipping point.
David Karpf, associate professor of media and public affairs, focuses his work on strategic communication practices of political associations in the U.S., with a particular interest in Internet-related strategies. Two of his published books discuss how digital media is transforming the work of political advocacy and activist organizations. Karpf is an expert on AI, internet politics, and political communication.
-GW-