So basically
- LLM usefulness has plateaued hard
- Sam A knows this and will pivot to capturing revenue now that the high-growth phase is over: more restrictive rate limits, higher bills

They U-turned on the recent rate-limit changes after the release backlash, but more is coming.
I think you could estimate the risk in micromorts of taking LSD bought off the dark web and find it's not that different from other activities you do regularly (multiple times a year, if not more).
You cannot estimate the risk of taking internet LSD, that is the problem.
There is also a history of cutting drugs with other substances to give them a unique effect that brings back return customers (e.g. cutting heroin with fentanyl).
And then there is the possibility that whoever produced the LSD screwed up, or that the end product is simply not pure.
> As an engineer, you should always be writing code with an absolutely minimal defect rate and well-understood capabilities.
I think the problem with the purists is that this is just a moral claim - it's not based on how businesses and marketplaces actually work. The lower you attempt to crank the defect rate (emphasis on "attempt"), the slower you will iterate. If you iterate too slowly, you will be out-competed. End of discussion. This is as true in open source as it is in enterprise SaaS. And in any case, you're just begging the question: how do we determine the "absolutely minimal" rate in advance?
> you can strive to write adaptable code that can accommodate those iterations.
This is a damaging myth that has wasted countless hours that could otherwise have been spent fixing real, CURRENT problems. There is no such thing as writing "adaptable" code that magically supports future requirements BEFORE those requirements are known. If you were that good at predicting the future, you would be a trader, not an engineer.