Hacker News

> A decade later, there are no viable alternatives to solve that instead of the costly replacement of hardware with more expensive hardware.

I mean, that's the root of scaling as a principle, right?

You could, in principle, start training an AI on your cell phone. It would be completely useless: you'd never meaningfully saturate the parameters, and it would take months to reach a checkpoint usable for inference. But you could do it. Nvidia is offering a similar system to people, just at a scale that doesn't suck like a cellphone does. Businesses can then choose how much compute they need on-site, or rent it from a cloud provider.

If a product like this convinces some customers to ditch older, less efficient training silicon, I don't see how it's any more antagonistic than any other CPU designer's perennial product updates.
