OpenAI is squeezing the balloon
How OpenAI's "Open" Models Are Actually a Cold-Hearted Business Move, and One You Can Use Yourself.
Six months ago, a Chinese startup called DeepSeek released something that changed everything: an open-source reasoning model that matched OpenAI's o1 performance at 95% lower cost, requiring just $5.6 million to train versus the hundreds of millions spent by U.S. competitors. Within days, DeepSeek's app hit #1 on the App Store, outranking ChatGPT. Nvidia lost $600 billion in market cap in a single day.
Then, six months later, in August 2025, OpenAI suddenly discovered their love for "openness" and released gpt-oss-120b and gpt-oss-20b—their first "open-weight" models since GPT-2. Sam Altman declared it part of their mission "to ensure AGI benefits all of humanity."
What you're witnessing isn't OpenAI's philosophical evolution. It's a textbook example of "squeezing the balloon" (read the book for more context!)—a calculated business move where you strategically commoditize one layer of your stack to protect and expand the layer that actually makes money.
I know this playbook because I've seen it executed perfectly before.
If you only have 5 minutes: the key points
OpenAI's release of open-weight models in August 2025 wasn't an act of generosity—it was a strategic move in response to DeepSeek's disruptive, low-cost open-source model.
This tactic mirrors Databricks' 2019 move to open-source Delta Lake, fragmenting the market to protect their more profitable managed services.
OpenAI's gpt-oss models aim to stall DeepSeek's momentum by flooding the developer ecosystem with alternatives, but they withhold key proprietary advantages (like reasoning capabilities and training insights).
The strategy is called "squeezing the balloon": commoditize the layer under threat to preserve and expand control over the profitable upper layers.
It's a calculated defense mechanism disguised as openness—a playbook every AI company should understand before betting on “open” models.
The Databricks precedent: how to kill a standard
Rewind to 2017. I remember getting a message to have an in-person chat with one of the founders of [tabular.io](https://www.tabular.io/), which in turn got me deeper into Apache Iceberg.
Netflix created Apache Iceberg, an open table format designed to solve massive scaling problems with traditional data lakes. The engineering was brilliant—it addressed real pain points around metadata management and query performance that every large-scale analytics team faced.
By 2019, Iceberg was gaining serious momentum. Major cloud providers started adopting it. Companies like Apple, Bloomberg, and Adobe began building their data infrastructure around it. Iceberg was on track to become the standard for large-scale analytics, which meant everyone else's table formats would become irrelevant. The Iceberg founders from Netflix spun out their own startup (tabular.io) to pursue this idea as a serious business.
That's when Databricks made their move.
Instead of watching their Delta Lake format lose to an open standard, Databricks open-sourced Delta Lake in 2019. Not from sudden altruism, but because they understood a fundamental principle: if you can't control the standard, fragment the market.
By releasing their own "open" table format right as Iceberg gained momentum, Databricks split the ecosystem. Instead of one emerging standard, you suddenly had competing open formats, each with different vendor backing and technical trade-offs. The market fragmented exactly as intended.
Here's the sophisticated part: Databricks didn't give away their competitive advantage. Their core value proposition was never the table format—it was the managed platform, the optimized runtime, the integrated ML capabilities. By open-sourcing the table layer, they commoditized the part under threat, kept profits concentrated in their platform business, and ensured no single competitor could dominate the foundational standards.
The strategy worked beautifully. Today, instead of Iceberg ruling the world, we have a messy ecosystem where enterprises spend months evaluating Delta Lake vs Iceberg vs Hudi. And Databricks? They just acquired Tabular—the company behind Iceberg—for an undisclosed sum last year. A classic case of squeezing the balloon even further.
OpenAI's balloon squeeze
Now watch OpenAI execute the exact same playbook with surgical precision.
DeepSeek didn't just create a cheaper model—they proved that world-class AI could be developed and distributed completely outside the closed, API-driven ecosystem that OpenAI has spent billions building. More importantly, they did it transparently: publishing their training methods, revealing their $5.6 million development cost, and open-sourcing everything under MIT license.
This created an existential threat to OpenAI's business model. If developers could get frontier-quality reasoning models for free, why pay $15 per million input tokens for o1?
So OpenAI squeezed the balloon. Instead of watching DeepSeek capture the developer ecosystem, they released competing open models that fragment the market. Now developers can't simply standardize on DeepSeek R1—they have to evaluate multiple open options, each optimized for different use cases and deployment scenarios.
But look closely at what OpenAI open-sourced versus what they protected:
They released the model weights under an Apache 2.0 license. They kept closed their training data, their detailed architecture internals, and their frontier reasoning models—o1, o3, and o4-mini remain API-only. They open-sourced the commodity layer while protecting everything that creates competitive advantage.
The gpt-oss models are good enough to keep developers in OpenAI's ecosystem and prevent wholesale migration to DeepSeek, but they're deliberately positioned as "developer tools" rather than production replacements for commercial APIs.
Perfect balloon squeeze: give away just enough to fragment the competition, protect everything that drives revenue, and push the profits into the layer you still control.
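
If you want to see the split for yourself, here's a minimal sketch of what the open-weight release actually gives you: a checkpoint you can pull and run locally with Hugging Face Transformers. The model id, the chat-style input, and the generation settings are my assumptions about a recent Transformers setup, not anything OpenAI prescribes. Note that nothing comparable exists for o1, o3, or o4-mini; those you can only reach through the hosted API.

```python
# Minimal sketch: running the open-weight gpt-oss-20b locally with Hugging Face
# Transformers. Assumes the checkpoint is published on the Hub as
# "openai/gpt-oss-20b" and that you have enough GPU memory for a ~20B model.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",  # assumed Hub model id for the open-weight release
    torch_dtype="auto",          # use whatever dtype the checkpoint was saved in
    device_map="auto",           # spread layers across available GPUs / CPU
)

messages = [
    {"role": "user", "content": "Summarize the trade-offs between Delta Lake and Iceberg."},
]

# Recent Transformers versions accept chat-style messages directly and return the
# conversation with the model's reply appended.
result = generator(messages, max_new_tokens=256)
print(result[0]["generated_text"])
```

The snippet itself isn't the point. The point is that this is the entire surface area of the "open" release: weights and a license, while everything above them stays metered through the API.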
What this means
Companies building AI strategies around these "open" models should understand what they're actually getting. OpenAI's gpt-oss models integrate seamlessly with OpenAI's ecosystem, use OpenAI-optimized tooling, and benefit from OpenAI's deployment expertise. Switch to a truly independent alternative later, and you'll discover how much infrastructure assumed OpenAI's specific implementation choices.
For other AI companies, this should be required reading. The balloon squeeze is a powerful defensive strategy that can buy time, fragment markets, and protect core businesses while appearing generous.
The pattern is clear: when your core product faces open-source disruption, don't fight the trend—redirect it. Open-source something adjacent that fragments the market while keeping your actual competitive advantages proprietary.
DeepSeek forced OpenAI's hand, but OpenAI's response demonstrates why execution matters as much as innovation. They turned a defensive move into market positioning, maintaining their $150+ billion valuation while fracturing the open-source AI landscape.
It's a masterclass in competitive strategy that every tech company should study. Because the next time you see a major player suddenly embrace "openness," you'll know exactly what they're really protecting.