Evasive C-suite execs can now add AI to the list of those scrutinizing their earnings calls.
A study from researchers at Germany’s University of Münster and Washington University in St. Louis found that analysts can use large language models to infer when executives aren’t being forthcoming on earnings calls, reported London-based investment strategist Joachim Klement.
During the study, the researchers fed earnings call transcripts into OpenAI’s GPT-4 Turbo to gauge whether an executive’s comments were “usual” or “unusual.” When the AI identifies an unusual earnings call transcript, it lays out its reasoning, Klement explained.
In one example, the AI analyzed the transcript and flagged that management was possibly avoiding specifics, despite being pressed by analysts, according to Klement. The AI flagged “unusualness” most often when speakers offered lengthy responses and long discussions about non-financial topics, some of the most common ways execs deflect negative news.
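The study does not publish its exact prompt or code, but the workflow Klement describes maps to a short script. The sketch below is a hypothetical reconstruction using OpenAI’s Python SDK; the prompt wording, model choice, and the classify_transcript helper are illustrative assumptions, not the researchers’ actual setup.

```python
# Hypothetical sketch of the workflow described above: send an earnings call
# transcript to GPT-4 Turbo and ask whether management's communication is
# "usual" or "unusual," along with the reasoning behind the label.
# The prompt text and helper function are illustrative, not the study's code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = (
    "You are an equity analyst. Read the following earnings call transcript "
    "and judge whether management's communication is 'usual' or 'unusual' "
    "(for example, evasive answers or long digressions on non-financial topics). "
    "State the label first, then explain your reasoning.\n\n"
)

def classify_transcript(transcript: str) -> str:
    """Return the model's 'usual'/'unusual' label and its reasoning."""
    response = client.chat.completions.create(
        model="gpt-4-turbo",
        messages=[{"role": "user", "content": PROMPT + transcript}],
        temperature=0,  # deterministic output keeps labels comparable across firms
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    with open("earnings_call.txt") as f:
        print(classify_transcript(f.read()))
```

Because each call is scored independently, the same loop could in principle be run over transcripts from many firms at once, which is the scaling advantage Klement points to later in the piece.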
Since the launch of ChatGPT in 2022, the finance industry has increasingly incorporated AI into more of its day-to-day operations. Earlier this year, JPMorgan Chase unveiled an AI tool to help interpret the ambiguous “Fedspeak” spouted by Chairman Jerome Powell. Still, whether AI can make a significant difference in stock picking remains unproven.
Ultimately, the researchers believe AI could help analysts predict the market by getting a better handle on earnings calls.
“The stock market reacts negatively to unusual financial communication, with an elevated trading activity,” the researchers wrote in the study’s abstract. “This response is exacerbated when more dimensions of unusual communication are identified for a firm.”
Klement points out that with the introduction of OpenAI’s GPT-4o in May, analysts could also upload the earnings call audio to the LLM or deploy it during the livestream.
While analysts often pick up on unusual comments themselves, the technology will allow them to analyze multiple earnings calls at once across industries.
“Personally, I love it, but as a corporate executive I probably wouldn’t because ChatGPT makes it much harder to get away with deception and distraction,” Klement wrote.