They tried to kill it. To bury it for good. But every time, it clawed back — stronger, sharper, unstoppable. And once again, it is emerging as the most powerful programming language—not only for our traditional von Neumann architectures, nor merely for the current GPU computing race, but for something far more profound than we could have imagined: the optimizers that form the beating heart of modern AI.

But let’s not stop there. The latest C++ release wasn’t a facelift—it was a brain transplant. Whole strands of its 1980s DNA have been ripped out and replaced with something unprecedented.

One feature, barely noticed, carries an immense edge: monadic operations in C++23 (std::expected, std::optional, and std::variant working in concert). These aren’t just functional-programming sugar. They’re the foundation for implementing a new kind of optimization calculus based on curvature numbers, a radical departure from the limit-based calculus that’s holding back modern optimization and AI advancement.


The Optimizer’s Midlife Crisis — Why C++23 Is Pure Gold

You experience it all the time, with frustration: Spotify looping the same 20 songs. Instagram pushing rage-bait. A Tesla missing a fire truck (hopefully not 😬). A model going NaN after 72 hours on $50k of GPUs. ChatGPT drawing human hands with six fingers.

And here’s why: different surfaces, same problem underneath. We’re using math from the 1820s to run algorithms in 2025. And one programming language just figured out how to fix it.

Yep, that’s right: modern C++ (C++23). It gives you the tools to make your optimizer’s step respond to the terrain, instead of following a one-size-fits-all rule.

But you still need the right math tool—and, surprisingly for many, classical calculus is the wrong one for the new breed of truly smarter optimizers.

Why? Because, unfortunately, the 19th-century calculus framework we all learned is point-first: it asks what happens at a single point as the step size goes to zero. That blinds you to the overall shape, and it is hard to automate when conditions keep changing, because real code moves in finite, noisy, floating-point steps.

We need a different math tool—an alternative to the older calculus—tuned for automated AI systems that drive billions of optimization updates per second, far beyond manual oversight.

Lucky for us, there’s a math wonder for this: curvature numbers. In contrast to the venerable but myopic classical calculus, curvature numbers see the whole shape projected across all its dimensions, instead of slicing through one point, one dimension at a time.

Picture it with a simple example:

The older calculus sees the trees before the forest; the new one, based on true infinitesimal numbers, sees the forest first, then the trees, then the leaves on the trees.

So, once again—as many times before in history—we need to switch tactics when social, economic, and technological needs demand new math.

How does C++ make that possible?

Strip away the hype, and most AI failures boil down to optimization—usually in two sour flavors:

NaN explosion after expensive training:

You either took a step that was too big for a sharp turn, or you landed on a bad spot in the computer’s number grid. Computers only have certain “number slots”; if your value gets too big or too tiny for those slots, the math goes weird (NaN), and the run crashes.

Copilot suggesting bugs:

The perfect irony: the model can learn the wrong pattern so well that it keeps making the exact mistake you’re trying to fix. That’s an optimizer parking in brittle minima that look stable point by point, but collapse when you change context.

Now let’s see how to play with curvature numbers; look at the cows below.

This is the game-changer we’ve been waiting for: the moment optimizers can finally be called automatic optimizers — not fragile hacks that need constant manual tweaking whenever conditions shift.

And here lies the revolution: with curvature numbers as the math, and C++23 as the engine, the optimizer is no longer dependent on endless human tweaks. It becomes a self-governing system — continuously adapting, continuously correct, continuously compounding.

Why does this matter? Because optimizers are not just a detail of AI. They are the hidden machinery of every modern system.

That is the bottleneck. That is why full automation has stalled. Without true automatic optimizers, industries remain half-automated: machines that pretend to run themselves, but in reality still rely on constant human hand-tuning.

I hope you see now that this is not a minor upgrade. It is, without exaggeration, a civilizational tech threshold—arriving at a critical moment of growing skepticism toward AI. Curvature numbers in C++23 turn optimization itself into an automated process — the invisible hero no one mentions, but without it neither AI nor industry can be trusted to run fully automated.

C++: The Only Language That Can Exploit Curvature Numbers to the Max

Look at the chart below. It speaks for itself.

On the left: classical optimizers (SGD, Adam, L-BFGS). The workhorses of AI today. On the right: curvature numbers (ε*) in C++23.

The difference is rather mind-blowing:

Old optimizers trip up because they use fixed step sizes, like marching forward blindfolded.

Back in the 1980s, the Barzilai–Borwein method proved you could do better by letting the step adapt to the slope of the terrain. “Curvature numbers” push this idea even further: instead of guessing, you use exact derivatives (via dual/jet numbers) and terrain-aware steps that fit the machine. And with C++23 tools like constexpr, std::mdspan, and std::expected, you can write these optimizers in a way that’s both fast and safe, right down at compile time.

And it doesn’t stop there: memory efficiency, noise robustness, second-order info, parallelism, theoretical guarantees — the curvature approach wins across the board. Every bar in green tells the same story: the old optimizers weren’t just outdated; they were bottlenecks throttling entire industries.

Are We Exaggerating? What About Other Languages?

The skeptic’s voice is predictable: “Sure, C++ was important in the past. But isn’t it just a relic, ready to be replaced by modern languages like Python, Rust, or Julia?”

Look at the chart below.


The data speaks for itself.

This is not nostalgia. It’s not about clinging to an old workhorse. It’s about performance reality. And in performance reality, C++23 isn’t just competitive — it’s miles ahead.

The Call to Ownership

C++ was not supposed to come back. It was supposed to die quietly while the world moved on to safer, simpler, slower tools. But history doesn’t move in straight lines — and here we are, with C++23 carrying the sharpest weapons our civilization has for real automation.

The lesson is simple: optimizers are not side details. They are the hidden engines of everything that adapts, learns, and runs at scale. Without them, AI is theater. With them, automation becomes reality. And now, for the first time, we hold a framework — curvature numbers, fused into C++23 — that makes optimization automatic.

So here is the challenge.
Don’t wait for Big Tech to bless you with “AI as a service.” Don’t wait for Python wrappers or Rust bindings to catch up. Don’t wait for the next hype cycle to tell you what matters.

Pick up the tools yourself. Write the kernels. Test the curvature numbers. Build the pipelines. And you will see for yourself that full automation is already here.

Take ownership. Open the compiler. The future is yours to build.