OpenAI’s Latest AI Can Cost More Than $1,000 Per Query

Brainpower, at an extreme premium.

Money to Burn

OpenAI’s recently unveiled o3 model is purportedly its most powerful AI yet, but with one big drawback: it costs ungodly sums of money to run, TechCrunch reports.

Announced just over a week ago, o3 “reasons” through problems using a technique known as test-time compute — as in, it takes more time to “think” and explore multiple possibilities before spitting out an answer. As such, OpenAI engineers hope that the AI model will produce better responses to complex prompts instead of jumping to a faulty conclusion.
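To make that idea concrete, here is a minimal, purely illustrative sketch of one common way to spend extra test-time compute: best-of-N sampling, where a model generates many candidate answers and a scorer keeps the best one. The model_generate and score_answer functions below are hypothetical stand-ins for illustration only, not OpenAI's actual o3 machinery, which hasn't been disclosed in detail.

```python
# Illustrative sketch of "test-time compute" as best-of-N sampling.
# model_generate and score_answer are hypothetical placeholders.
import random

def model_generate(prompt: str) -> str:
    # Placeholder for one sampled "reasoning path" from a model.
    return f"candidate answer {random.randint(0, 9)} for: {prompt}"

def score_answer(answer: str) -> float:
    # Placeholder verifier; a real system would grade answer quality.
    return random.random()

def answer_with_more_compute(prompt: str, n_samples: int) -> str:
    # More samples mean more "thinking" time and cost, but a better
    # chance that a correct answer appears among the candidates.
    candidates = [model_generate(prompt) for _ in range(n_samples)]
    return max(candidates, key=score_answer)

print(answer_with_more_compute("Solve the puzzle", n_samples=6))     # "low compute"
print(answer_with_more_compute("Solve the puzzle", n_samples=1024))  # "high compute"
```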

It appears to have worked, at least to some degree. In its most powerful “high-compute mode,” o3 scored 87.5 percent on ARC-AGI, a benchmark designed to test how well an AI system can solve novel reasoning problems, according to the test’s creator François Chollet. That’s nearly three times the previous o1 model’s best score of just 32 percent.

All that fastidious thinking, however, comes with exorbitant expenses. To achieve that high-water mark, o3 used well over $1,000 of computing power per task — over 170 times more compute than a low-power version of o3, and leagues beyond its predecessor, which cost less than $4 per task.

Wall to Wall

These costs complicate the industry’s claims that o3’s performance soundly debunks fears that improving AI models through “scaling,” or by furnishing them with more processing power and training data, has hit a wall.

On the one hand, the fact that o3 scored nearly three times as high as o1, which was released just three months ago, seems like ample evidence that AI gains aren’t slowing down.

But the criticism of scaling is that it yields diminishing returns. And while the gains here were achieved largely by changing how the model “reasons” at inference time rather than through scaling alone, the added costs are difficult to ignore.

Even the low-compute version of o3, which scored a still-impressive 76 percent on the benchmark, cost around $20 per task. That’s a relative bargain, but still many times more expensive than its predecessor, and with ChatGPT Plus costing just $20 per month, it’s not clear how much smarter that user-facing product can get without putting OpenAI deeply in the red.
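A rough back-of-envelope check, using the figures reported above and assuming the benchmark cost is representative of what a heavy user-facing query would cost to serve, shows how quickly the math breaks down:

```python
# Back-of-envelope math using the figures reported above; actual serving
# costs and usage patterns are assumptions, not disclosed numbers.
subscription_per_month = 20.0   # ChatGPT Plus price in USD
cost_per_heavy_query = 20.0     # reported low-compute o3 cost per ARC-AGI task

queries_covered = subscription_per_month / cost_per_heavy_query
print(f"Heavy queries covered by one month of Plus revenue: {queries_covered:.0f}")
# -> Roughly one such query per month before the subscription runs at a loss.
```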

High Salary

In a blog post explaining the benchmark results, Chollet asserts that though o3 is approaching human levels of performance, it “comes at a steep cost, and wouldn’t quite be economical yet.”

“You could pay a human to solve ARC-AGI tasks for roughly $5 per task (we know, we did that),” he wrote, “while consuming mere cents in energy.”

He is adamant, however, that “cost-performance will likely improve quite dramatically over the next few months and years.”

On that, we’ll just have to wait and see. For now, o3 isn’t available to the public, though a “mini” version of it is slated to launch in January.

More on AI: Facebook Planning to Flood Platform with AI-Powered Users


