NIXSOLUTIONS: Anthropic Trained Claude 3.7 at Far Lower Costs Than Competitors

Anthropic’s latest AI model, Claude 3.7 Sonnet, required only “tens of millions of dollars” to train and used less than 10²⁶ FLOPs of computing power (equivalent to 10¹⁴ teraFLOPs). This was reported by Ethan Mollick, a professor at the Wharton School of the University of Pennsylvania, who cited information from Anthropic’s public relations department. “Anthropic contacted me and said that Sonnet 3.7 should not be considered a 10²⁶ FLOPs model, and it only cost a few tens of millions of dollars,” Mollick stated. He also noted that future models would be significantly larger.
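
To put those figures in context, the back-of-envelope sketch below (in Python) shows how a total compute budget in FLOPs maps to an approximate dollar cost. The per-accelerator throughput, utilization, and hourly price are illustrative assumptions, not numbers from Anthropic or Mollick; the point is only that a training run well below the 10²⁶ FLOPs mark can plausibly land in the tens of millions of dollars.

# Back-of-envelope estimate: dollar cost implied by a training-compute budget.
# All hardware and pricing figures are illustrative assumptions, not values
# reported by Anthropic or in the article.

def estimated_training_cost(total_flops: float,
                            accelerator_flops_per_second: float = 1.0e15,  # assumed ~1 PFLOP/s per chip at low precision
                            utilization: float = 0.4,                      # assumed fraction of peak actually sustained
                            dollars_per_chip_hour: float = 2.50) -> float: # assumed rental price per accelerator-hour
    """Return an approximate dollar cost of running `total_flops` of training compute."""
    effective_flops_per_hour = accelerator_flops_per_second * utilization * 3600
    chip_hours = total_flops / effective_flops_per_hour
    return chip_hours * dollars_per_chip_hour

# Compare a run at the 10**26 FLOPs scale with one an order of magnitude smaller.
for budget in (1e26, 1e25):
    print(f"{budget:.0e} FLOPs -> ~${estimated_training_cost(budget):,.0f}")

Under these assumptions, 10²⁶ FLOPs works out to roughly $170 million, while 10²⁵ FLOPs comes in around $17 million, which is consistent with Mollick’s description of Claude 3.7 Sonnet as sitting well below the 10²⁶ FLOPs class.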


Declining AI Training Costs

If the training cost of Claude 3.7 Sonnet was indeed only in the tens of millions of dollars, it suggests a downward trend in AI development expenses. Its predecessor, Claude 3.5 Sonnet, reportedly cost a similar amount to train, according to Anthropic CEO Dario Amodei. In contrast, OpenAI spent $100 million to develop GPT-4, while training Google’s Gemini Ultra cost an estimated $200 million. Even so, Amodei does not foresee a lasting reduction in AI training costs, predicting that future models will require billions of dollars, a figure that excludes security testing and fundamental research expenses.

Despite the comparatively low cost of Claude 3.7 Sonnet, the growing demand for AI with advanced reasoning capabilities means future models will require more computing resources, notes NIXSOLUTIONS. As AI evolves, the financial and computational demands will continue to rise. We’ll keep you updated as these advancements unfold.