Rapid Read • 8 min read

Study Reveals Open-Source AI Models May Be More Costly Due to Higher Computing Needs

WHAT'S THE STORY?

What's Happening?

A recent study by Nous Research highlights the hidden costs that can come with open-source AI models. While these models may look cheaper up front, they can require significantly more computing resources than closed-source models to complete the same work. The researchers tested a range of AI models, including closed models from Google and OpenAI and open-source models such as DeepSeek's and Mistral's Magistral, on simple knowledge questions, math problems, and logic puzzles, measuring computing effort by counting the tokens each model used to finish a task. The open models used 1.5 to 4 times more tokens than the closed models, and up to 10 times more on simple knowledge questions, which can mean a higher cost per query even when the per-token price is lower.
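The cost arithmetic is easy to reproduce. Here is a minimal sketch, assuming hypothetical prices and token counts (illustrative numbers, not figures from the study), showing how a fourfold token overhead can erase a fourfold per-token discount:

```python
def cost_per_query(tokens_used: int, price_per_million_tokens: float) -> float:
    """Cost of one query: tokens generated times the per-token price."""
    return tokens_used * price_per_million_tokens / 1_000_000

# Assume a closed model answers in 1,000 output tokens at $10 per million
# tokens, while an open model needs 4x the tokens at a quarter of the price.
closed_cost = cost_per_query(1_000, 10.00)  # $0.0100 per query
open_cost = cost_per_query(4_000, 2.50)     # $0.0100 per query: the discount is gone

print(f"closed: ${closed_cost:.4f} per query, open: ${open_cost:.4f} per query")
```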

Why Is It Important?

The findings matter for businesses weighing AI adoption. Open-source models may offer savings up front, but their higher token usage translates into longer generation times and greater latency, which can offset the per-token price advantage. For companies that rely on AI for efficiency and cost control, the choice between open and closed models therefore affects both operating costs and performance. Closed-source models, which are tuned to use fewer tokens, may deliver better cost efficiency and faster responses, making them more attractive to businesses looking to minimize expenses and maximize productivity.
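One way to see when that offset occurs: under the simple assumption that query cost scales linearly with output tokens, an open model stays cheaper per query only while its per-token discount exceeds its token overhead. A hypothetical check (names and numbers are illustrative, not from the study):

```python
def open_model_is_cheaper(price_ratio: float, token_ratio: float) -> bool:
    """price_ratio: closed per-token price divided by open per-token price.
    token_ratio: open tokens per query divided by closed tokens per query.
    The open model wins on cost only if its discount outweighs its overhead."""
    return price_ratio > token_ratio

print(open_model_is_cheaper(price_ratio=3.0, token_ratio=4.0))  # False: token overhead wins
print(open_model_is_cheaper(price_ratio=5.0, token_ratio=1.5))  # True: the discount holds
```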

What's Next?

As businesses weigh the pros and cons of open versus closed AI models, the study suggests they will need to consider the long-term effect of token efficiency on their operations. OpenAI's models, noted for their token efficiency, could serve as benchmarks for improving open-source models, and companies might pursue strategies to reduce token usage in open models to cut costs and improve performance. The study may also prompt further research into optimizing AI models for better resource management, potentially shaping future developments in AI technology.

Beyond the Headlines

The study raises broader questions about the transparency and efficiency of AI models. Open-source models offer more visibility into their reasoning processes, but they may need to balance that transparency against efficiency. The findings could also feed into discussions about the ethics of AI model development, particularly resource consumption and environmental impact. As AI becomes more deeply integrated into various industries, sustainable and efficient models are likely to become a priority.

AI-Generated Content
