If you need to do the latter to make money on the former, then you're not really making money: if the latter requirement disappeared, inference margins would drop with it.
At the end of the day, they're still burning cash. Even if inference is cheap, it's also not hard to compete on. They aren't going to become a trillion-dollar inference company.
Eventually there will be a race to the bottom on inference pricing, driven by companies that aren't trying to subsidize massive GPU investments.
OpenAI is spending money because they think they need to for their business to survive. They're hoping that the next big breakthrough just requires more compute and, somehow, that'll build them a moat.
OpenAI, and quite honestly the others, think they're in a race to AGI, not a race to the bottom. That's why they aren't concerning themselves with moats or costs. This is, quite simply, a massive bet that we've already cracked AGI and that the rest is just funding the engineering to make it happen.
I personally think we haven't cracked AGI yet but it doesn't change their calculus.
OpenAI and others are already profitable on inference (inference is really, really cheap).
They're just investing heavily in the latest frontier.
The biggest risk is whether they can stay on the cutting edge, or whether open source or other competitors will catch up quickly.