I am U-Zyn Chua. I build, research and write about technology, AI and the open web.

It must have been a challenging week for the folks at OpenAI and other closed-source foundational AI model companies. The release of DeepSeek-R1, which is not only better than OpenAI's o1 model but also free, has sent shockwaves through the industry. Even more surprising is that DeepSeek achieved this feat with only a fraction of the resources it took to train other comparable models.

Closed-source foundational AI companies usually make a loss on consumer products such as ChatGPT Plus and Claude Pro. These subscription-based consumer AI products are commonly believed not to bring in much profit, and some plans, such as the $200-a-month ChatGPT Pro, are even loss-making.

The foundational AI companies are still figuring out how to turn profitable. Today's revenue model relies largely on API and enterprise sales, i.e. the "AI wrappers" and existing apps racing to add AI features.

Sam Altman: The honest answer is we have no idea [on how to bring in revenue]. We have never made any revenue. We have no current plans to make revenue. We have no idea how we may one day generate revenue. We have made a soft promise to investors that once we've built this sort of generally intelligent system, basically we will ask it to figure out a way to generate an investment return for you. [The Journal by WSJ]

While we, fellow AI enthusiasts, are excited to see newer models outdoing one another every other day, the reality is that these models are all accessed through essentially the same kind of API. This makes the switching cost for API users practically zero, and DeepSeek's API reportedly costs less than 5% of OpenAI's o1. The cost can be lower still if you run the open-source model on your own servers, reducing it to hosting alone. Memories, fine-tuning (LoRA and the like), and retrieval-augmented generation (RAG) applications can be ported to competing models with relative ease, in fact more so with open-source models than with their closed-source counterparts.
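To make the near-zero switching cost concrete, here is a minimal sketch assuming the OpenAI Python SDK and DeepSeek's OpenAI-compatible hosted API; the local endpoint shown is a hypothetical self-hosted deployment (e.g. via Ollama or vLLM) exposing the same interface. Switching providers largely comes down to a base URL and a model name.

```python
# Minimal sketch: switching providers is mostly a base URL + model name change.
# Assumes the OpenAI Python SDK (`pip install openai`) and valid API keys.
from openai import OpenAI

# OpenAI's hosted models.
openai_client = OpenAI(api_key="OPENAI_API_KEY")

# DeepSeek exposes an OpenAI-compatible API, so the same SDK works unchanged.
deepseek_client = OpenAI(
    api_key="DEEPSEEK_API_KEY",
    base_url="https://api.deepseek.com",
)

# A self-hosted open-source deployment can expose the same interface on your
# own server; this local endpoint is illustrative, not a given setup.
local_client = OpenAI(
    api_key="not-needed",
    base_url="http://localhost:11434/v1",
)

def ask(client: OpenAI, model: str, prompt: str) -> str:
    """Identical call shape regardless of which provider sits behind the client."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Only the client object and the model name differ between providers:
# print(ask(openai_client, "o1", "Summarise this week's AI news."))
# print(ask(deepseek_client, "deepseek-reasoner", "Summarise this week's AI news."))
```

Because the request and response shapes are identical, the surrounding application code, including prompts, RAG pipelines and evaluation harnesses, does not need to change when the backing model does.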

DeepSeek proposed a cut-price fee offering for accessing and using DeepSeek-R1, at 16 yuan ($2.20) per million tokens, considerably less than OpenAI's o1 438 yuan ($60.00) for the same usage. [Reuters]

It is going to be really challenging for foundational AI model companies to figure out a way to create stickiness or introduce switching costs, artificial or otherwise, similar to what cloud operators enjoy, especially email and office suite providers. A Google Workspace user will not switch just because a competitor offers email at a 50% discount, even if that competitor provides every service available in Google Workspace. Technical and data migration pain is sticky, for better or for worse.