jerf's comments | Hacker News

"Suddenly" in this case does not mean tomorrow.

It means that, today, a lot of enterprises begin pondering the question, and then about a year from now, they start seriously studying and prototyping it, and then "suddenly" in 2029 Microsoft starts seeing a deluge of defections. It means a whole bunch of people finishing the conversion all at once, relatively speaking, even if that "all at once" is 3-4 years away.

To put it another way, the thresholds where people get annoyed enough to quit are highly correlated to each other. If individuals on HN are posting "I don't want to switch, I've been working this way for decades now, but Windows has crossed the line for me, I've switched to Linux, and it was easier than I thought it would be", then corporations and governments are having very similar deliberations internally.

This is probably a more accurate model for how "influencers" seem to work than the idea that some crazy guy in your organization falls in love with Product X and evangelizes it internally. I'm sure that happens and is a real force, but this correlation-of-experience effect is probably bigger on the whole. If Product X was good enough to make an evangelist internally, or, more germane to this conversation, to make someone a mortal enemy of it internally, it's usually because the product was good enough or bad enough to do that in the first place, and eventually everyone will figure it out in exactly the same way, just later.

20 years is way too large a minimum estimate. If Microsoft responds correctly they might hold on, but if they just decide to rest on their laurels and extract whatever value they can out of Windows while they can, Windows would never last 20 years of that. Even the slowest organizations can move faster than that. After all, to cut Microsoft's revenues off at the knees, defectors don't need to remove every last un-upgradeable Windows 2000 server in their back office; they just need to drop the majority of desktop licenses.


> lot of enterprises begin pondering the question, and then about a year from now, they start seriously studying and prototyping it

Not sure about big enterprises, but I already see this happening in the mid-size, non-tech company market.

I'm an IT manager and have been a sysadmin/ops for my entire career, and for the past ~4 years I've been seeing a pretty consistent shift toward companies my company does business with deploying more and more Macs. Windows is still dominant in my industry, but the cracks in the wall are widening. It's gotten to the point that I'm genuinely surprised now when I see Windows when someone screen shares.

Apple silicon is just too good, and the generations coming into the workforce now don't have the "default" Windows familiarity that we used to have. They're coming in needing to be trained on how to use a PC in general, Windows or not, having used nothing but Chromebooks and mobile OSes.

Now, Office OTOH is more entrenched than Windows. Even the Mac shops I interact with are all on M365: Macs managed with Intune, users & SSO with Entra, Defender for EDR, and of course the Office apps. And that's why Microsoft probably isn't as afraid as it seems when it comes to Windows. Even without Windows lock-in, there is very real M365 lock-in that is far more entrenched than the endpoint OS.


>20 years is way too large a minimum estimate.

i disagree. unless intuit is also rewriting quickbooks, dassault systèmes is rewriting solidworks, every bank is rewriting their custom windows-only software, every government branch is rewriting their custom windows-only software, etc., and unless every company is willing to retrain 95% of their employees on a new operating system and absorb increased support requirements for a few years at least, it's not happening.

not even touching the capital required for such a transition that in many cases has questionable benefits (from a business perspective).

time will tell! i have first-hand experience with how fast banks move, so i will stick by my 20 year minimum. happy to be proven otherwise, though.

in any case, what i replied to was a claim that windows is in "significant danger" today. it is not.


> unless intuit is also rewriting quickbooks

They already have. You can't buy QuickBooks for desktop anymore unless you want Enterprise, the expensive $4k+/year subscription. They dumped Pro/Pro Plus and moved all those users to QuickBooks Online.

And now they've launched Intuit Enterprise Suite in an effort to move the QBE customers into Online. The writing is on the wall there: desktop is going away.

It's happening in more specialized areas too. I work in waste management/recycling, and this industry was traditionally Windows heavy, with thick clients on desktops. Even the truck scale software is moving to web interfaces, as are dispatching and asset management.

The OS increasingly doesn't matter for most knowledge work.

Yeah, there are going to be industries that will probably never move, certainly not within a 20 year timeline, but there are a ton that are moving or have moved entirely to SaaS and web apps.


In 20 years I expect basically all of these to move to web-based interfaces and away from thick clients. You're already seeing graphics-heavy use cases like CAD do this (Onshape has been hugely popular, is cloud native, and runs fine on Linux). Even behemoths like SAP are increasingly web-enabled through Fiori.


it would be awesome to see less windows-only software. i am all for it.


It's an interesting case to me. The company I work for has been shipping systems on Windows since the '90s, despite pretty consistent requests from customers to ship hardware on Linux. Two years ago we started creating our own Linux distribution, and this year we started shipping products on it. We still ship a lot of stuff on Windows 11, but that market share is starting to shift now. Ten years from now I could see us completely moved to our Linux distro. Now, what's actually interesting is that it wasn't customer requests or efficient capital allocation that drove this. Microsoft effectively forced us to do this against our will, by a combination of discontinued products and their handling of Windows 11, and now that we've spent the capital we won't be going back.


To me, this is the way Linux wins, if it does.

Product teams deciding it's easier to ship on Linux, plus customers having enough Linux familiarity (from their other projects).

And the current crop of Microsoft people on the Windows team don't seem to understand building a platform the way the '90s/'00s Windows teams did.

It's clear MS moved a lot of their smartest people over to work on Azure products.


You can't abandon Windows because of software X, Y, Z. Over the years vendors move to multiplatform as more and more customers ask for it. These changes are slow but steady. And one day you find out that the last "must have" software is not limited to Windows anymore. That's when the dam breaks.


Out of curiosity, why a custom distro instead of one of the major ones?


We have some special cases I can't go too much into that lend themselves well to rolling our own and stripping things down as much as possible.


> i disagree. unless intuit is also rewriting quickbooks, dassault systèmes is rewriting solidworks, every bank is rewriting their custom windows-only software, every government branch is rewriting their custom windows-only software

Up front they won't need to do a full rewrite. They'll only need to make it work well enough under Wine.

At a source level, tools like Avalonia's XPF make porting WPF apps to other platforms easier:

https://avaloniaui.net/xpf


the stupid enterprise-y software i have used, like quickbooks, solidworks and other proprietary stuff, barely works well enough under native windows. not to mention that even sticking it in a windows VM voids any support contracts.


> i disagree. unless intuit is also rewriting quickbooks, dassault systèmes is rewriting solidworks

Autodesk Fusion runs in browser, and even SolidWorks has an online version. You were saying?...

> every bank is rewriting their custom windows-only software

This has been happening for a while, actually. Typically they rewrite software as webapps, with a microservice-based backend.

> not even touching the capital required for such a transition that in many cases has questionable benefits (from a business perspective).

Software gets rewritten all the time, often in the "Ship of Theseus" way. The next rewrite will just focus on moving stuff away from MS.


> Autodesk Fusion runs in browser, and even SolidWorks has an online version. You were saying?..

Mostly used for visualisation, rather than real work, especially given the browser limitations.


It doesn't take all the specialized Windows-only line-of-business software being rewritten to have massive defections to Linux happen.

The market you're describing is real, and very significant, but I don't think it's even a majority of Windows users. If it is, it's a slim one.

And imagine what even 30-40% of all Windows sales disappearing over the course of 2-3 years would do to Microsoft. To Windows as a platform.

Then imagine what would happen if it was 50-70%.

The former, I would describe as "a disaster".

The latter, I would describe as "apocalyptic". (Y'know. For Microsoft as a company. Not in general.)


The story may be posted today, but there's no reason it has to be a recent story. Even the most backward government post in 2026 should have a fax-to-document service that integrates with their document tracker. But there was definitely a 15 to 20 year window, from the 1990s to somewhere in the 2010s, where you could send faxes directly from a document one way or another but the recipient was almost certain to be dumping them straight to paper. The story mentions using an internet service, which I am not sure would have existed in the 90s (maybe at the very end), but I extend the essence of the story back to the 90s because I remember having a modem that came with a printer driver that let you hit "print" and fax someone directly. You could easily use that to do something like this without any step where you're feeding paper into a physical machine.

Faxes have been "obsolete" a really long time.


I don't have a word for this, but this falls under the class of things where even if the author who wrote this did not personally do it and is making it up, it has absolutely 100% happened somewhere, many times over.

For example, it's the same for The Daily WTF... I remember how it would be posted here or on a programming subreddit and half the comments would be about how it couldn't have happened, and, you know, maybe whoever wrote those particular words was just making it up, but I've seen enough just in my little tiny slice of human behavior phase space to know that either the story or something indistinguishably close to it most certainly has happened somewhere, at some time.


Representative? Believe that or “representation” is used in marketing, like for renders.

This cannot be the best word…


In this scenario, that would be the people paying for the assassination. The people who want it to happen bet that it won't. The people who want to do it bet that it will. The net result is that if one of the people who bet on it happening makes it happen, they are being paid by the people betting against it, in a plausibly deniable way.

A country leader seeing someone suddenly take out a $50 million position on them not being assassinated is not the $50 million vote of confidence a naive read on the market might indicate; it's a $50 million payout to the assassin. Albeit an inefficient one, since others can take the other side of the bet and do nothing. But the deniability may be worth it.


What's even more interesting is when you consider that A) it doesn't have to be one person taking out a large position, it can be multiple people, over time, and B) the assassin doesn't have to be known or confirmed ahead of time: if someone decides their "reserve price" has been met, all they have to do to receive a payout is place the appropriate bet before performing the act.

The end result is a combination of Kickstarter and Doordash for targeted homicide.


> The end result is a combination of Kickstarter and Doordash for targeted homicide.

Or kidnappers. Someone could take the opposite side, kidnap the individual, and guarantee their survival for the year. When the time is up, they just dump them in the street and collect the bet.


I'm not sure there's any deniability in placing the "won't be assassinated" bet, when you could equally state it as "I will pay $1M to whoever accepts this bet and assassinates this person".

Anyways, how exactly is this assassin going to collect on their bet? I'm pretty sure law enforcement will be looking into the fact that somebody placed that bet and then, shortly after, the assassination happened.


This could make for fun anti-life insurance.

"I bet I won't die this year."

The only life insurance you get to collect on while you're alive.


"I've gone back and forth internally about whether this is healthy or not for him. I truly don't know."

On a psychological level, I don't know either. I have opinions, but they haven't aged long enough for me to trust them, and AI is a moving target on the sort of time frame I'm thinking of here.

However, as a sort of tiebreaker, I can guarantee that this relationship will eventually be abused, one way or another, by whoever owns the AI. Not necessarily in a Hollywood-esque "turn them into a hypnotized secret assassin" sort of abuse (although I'm not sure that's entirely off the table...), but think more along the lines of highly-targeted advertising and generally taking advantage of being able to direct attention and money to the benefit of another party.

Whether or not AI in the abstract can "be your friend", in the real world we live in, an AI controlled by someone else definitely can not be your friend in the general sense we mean, because there is this "third party", the AI owner, whose interests are being represented in the relationship. And whatever that may look like in practice, and however comfortably some 22nd-century reader, analyzing the data of the past in a world where "AI friendships" are routine, may apply the word to that relationship, it simply isn't the sort of relationship we'd call a "friend" in the here and now, because a friend relationship is only between two entities.


Go's net/http Client is built for functionality and complete support of the protocol, including even such corner cases as support for trailer headers: https://developer.mozilla.org/en-US/docs/Web/HTTP/Reference/... For a lot of people reading this message, that is probably the first time they've heard of those.
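
To make that concrete, here is a minimal sketch of reading trailers with net/http; the URL and trailer name are hypothetical stand-ins. The one wrinkle is that Go only fills in resp.Trailer once the body has been read to EOF:

    package main

    import (
        "fmt"
        "io"
        "net/http"
    )

    func main() {
        resp, err := http.Get("https://example.com/stream")
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        // Trailers arrive after the body, so resp.Trailer is only
        // populated once the body has been read to EOF.
        io.Copy(io.Discard, resp.Body)
        fmt.Println("checksum trailer:", resp.Trailer.Get("X-Checksum"))
    }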

It is not built for convenience. It has no methods for simply posting JSON or automatically unmarshaling a JSON response from a body, no "fluent" interface, no automatic method for dealing with querystring parameters in a URL, no direct integration with any particular authentication/authorization scheme (other than Basic Authentication, which is part of the protocol). It only accepts streams for request bodies and only yields streams for response bodies; while this is absolutely correct for a low-level library, and any "request" library that mandates strings with no ability to stream in either direction is objectively wrong, a string-based convenience is a rather nice thing to have available when you know the request or response is going to be small. And so on and so on.
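
For instance, the one-line JSON POST that convenience libraries give you looks something like this with bare net/http. This is a minimal sketch, and the URL and payload types are hypothetical:

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "net/http"
    )

    type createUser struct {
        Name string `json:"name"`
    }

    type userResponse struct {
        ID int `json:"id"`
    }

    func main() {
        // You marshal the payload and wrap it in a reader yourself...
        body, err := json.Marshal(createUser{Name: "alice"})
        if err != nil {
            panic(err)
        }

        // ...set the content type yourself...
        resp, err := http.Post("https://example.com/users",
            "application/json", bytes.NewReader(body))
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        // ...and decode the response stream yourself on the way back out.
        var out userResponse
        if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
            panic(err)
        }
        fmt.Println("created user", out.ID)
    }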

There are a lot of libraries you can grab that will fix this, if you care, everything from clones of the request library to libraries designed explicitly to handle scraping cases, and so on. And that is in some sense also exactly why the net/http client is designed the way it is. It's designed to be in the standard library, where it can be indefinitely supported because it just reflects the protocol as directly as possible, and whatever the whims of fate or fashion rolling through the developer community say the best way to make web requests is, now or in the future, those things can build on the solid foundation of net/http's Request and Response values.

Python is in fact a pretty good demonstration of the risks of trying to go too "high level" in such a client in the standard library.


I'm at a loss as to how some of these projects got funded in the first place. Anyone funding these should have had the perspective to see that there isn't enough power for them, and that by the time power could come online for even a significant fraction of them, the depreciation and interest costs will have murdered the companies trying to do it, especially if their solution to that problem is the oh-so-21st-century one of "solving" the problem of losing money by levering up. It does no good to go out of business entirely in 2027 to make the phat buxx in 2030, which seems to be the best case scenario for this space as a whole.

The other question I have is... who exactly is doing all of 1. Using AI right now, 2. Making substantial money on it or getting real value, and 3. Capacity constrained? Who is actually going to productively soak up all this capacity? It seems to me that bringing all this stuff online can't really make things much cheaper than they are now, because the fixed costs aren't going anywhere, and if anything, trying to jam so many projects through all at once just raises those fixed costs even higher. It's not like they can triple data center capacity (increasing AI capacity by, what, 10x? 20x?), stick the new data centers full of AI systems, and sell that 10x+ greater AI capacity at the prices they charge now. Higher capacity would crash the selling price, but the costs would be as high or higher than now.

I am at a complete loss as to how the numbers are supposed to work here. You can't build a company in 2026 on the economy and tech infrastructure of 2036, any more than it worked to build a company in 1999 on the economy and tech infrastructure of 2019, no matter how rosy the numbers look on projections that conveniently ignore the fact that the company passes through "death" in a year and a half. Everything promised in 1999 happened, but trying to artificially accelerate it onto Wall Street's timeline burned money by the billions. I'm sure 2036 will have lots of AI in it, but you can't just spend money to bring it forward 10 years by sheer force of will. It has to happen at its own pace.


> The other question I have is... who exactly is doing all of 1. Using AI right now, 2. Making substantial money on it or getting real value, and 3. Capacity constrained?

Almost all enterprise users, for one. At least from what I have seen, it is a massive productivity boost for coding and general research. If the costs were ~4x lower, we would be able to do much, much more with them. Building datacenters will reduce the cost by increasing supply.

> It's not like they can triple data center capacity (increasing AI capacity by, what, 10x? 20x?), stick the new data centers full of AI systems, and sell that 10x+ greater AI capacity at the prices they charge now. Higher capacity would crash the selling price, but the costs would be as high or higher than now.

This is false. Part of the price is unit cost, which carries really high margins; I think the margins are around 50% to 60%. By increasing capacity, they are bound to make even more profit.

But the other part of the price reflects the current lack of capacity.


"Building datacenters will reduce the cost because increasing supply would reduce the cost."

That's great for us users but I'm talking from the point of view of the people trying to make money on the data centers.

"This is false. Part of the costs are unit costs which are really high margin."

Can you explain how everybody throwing their money at nVidia lowers the costs? When they are already apparently at max capacity?

Everybody trying to build a data center at once raises the costs of the data centers. Everyone competing for power has already raised power prices, and we've barely begun bringing this stuff online. Everyone demanding multiples of what nVidia is producing means nVidia isn't going to reduce prices any time soon.

Your use of "even more profit" also implies that you think that the AI world is making lots of money? nVidia is making lots of money. To a first approximation, everybody else involved has lost billions. Maybe not Apple. But everyone else you can name is deep in the negative on AI.


> To a first approximation, everybody else involved has lost billions.

"Lost" implies they have nothing to show for it. But they do. Depending on who you're looking at, they have data centers, GPUs, as well as billions in revenue and hundreds of millions of users, both rapidly growing. We can't say anything is "lost" because these are investments, and will only be sunk costs if nobody ever makes money.

But people are already making money. The big names are in a growth stage, so their spending is far outpacing their returns, but if you look beyond, people are making a ton of money on AI, which bodes well for these investments. Some data points:

1. AI startups are growing revenue at a record pace, as confirmed by three separate groups adjacent to them -- investors, enterprise purchase decision makers, and Stripe (which processes their payments): https://news.ycombinator.com/item?id=46730182

2. AI is creating a boom in mobile apps, including a surge in revenue -- https://techcrunch.com/2026/01/21/consumers-spent-more-on-mo...

So much so that Apple made nearly a billion more just from their App Store cut: https://www.macrumors.com/2026/03/20/apple-made-nearly-900m-...

3. AI agents boosting holiday sales: https://www.salesforce.com/news/stories/2025-holiday-shoppin...

Keep in mind we are only ~3 years out from ChatGPT kicking this whole thing off.


> That's great for us users but I'm talking from the point of view of the people trying to make money on the data centers.

Why wouldn't they make money if they are the ones the money is being thrown at?

> Can you explain how everybody throwing their money at nVidia lowers the costs? When they are already apparently at max capacity?

Increasing supply lowers the cost, I'm unsure which part of this is surprising.

> Your use of "even more profit" also implies that you think that the AI world is making lots of money? nVidia is making lots of money. To a first approximation, everybody else involved has lost billions. Maybe not Apple. But everyone else you can name is deep in the negative on AI.

The companies using AI are making money out of it. OpenAI will make money in the future but is losing it now because of R&D and training.


> At least from what I have seen, it is a massive productivity boost for coding and general research

Are companies releasing more software with fewer developers? If the answer is no, then productivity has not improved. It might SEEM like it improves because you're able to produce more code and spend less time programming, but that might not be the case in actuality.

From what I've seen, AI is very good and very popular, but it hasn't improved programming productivity in a meaningful way. The bottlenecks are unchanged, so writing more code faster doesn't help anything. A lot of companies let a lot of employees go due to AI, and their product velocity has noticeably gone down and their quality is noticeably worse.


In xAI's case, they've gotten gas turbines installed on site to make up the electricity generation shortfall. It's unclear exactly how long that short-term solution is going to be there, but probably quite a while.


Now is probably a pretty good time to start a capabilities-based language if someone is able to do that. I wish I had the time.


The primary alternatives are:

One, you don't need this. The vast majority of people working on the web are now so thoroughly overserved by their frameworks that measuring a framework by how many nanoseconds per request it consumes (I think time per request is a more sensible measure than requests per time) is quintessential premature optimization, especially since benchmarks like this measure only the minimal overhead the frameworks could impose. All consulting a table like this does for the vast majority of people is pessimize their framework choices, by slanting them in the direction of taking speed over features when in fact they are better served by taking features over speed.

Two, you are performance bound, in which case these benchmarks still don't help very much, because you really just have to stub out your workload and run benchmarks yourself. You need to holistically analyze the performance of your framework, with your database, with any other APIs or libraries you use, to know what is going to be the globally best solution. Granted, not starting with a framework that struggles to attain 100 requests per second can help, but if you're in this position and you can't identify that sort of thing within minutes of scanning the documentation, you're boned anyhow. Such frameworks are not really that common anymore.
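
"Run benchmarks yourself" can be as simple as an in-process benchmark against your own handler stack, using nothing but the standard library. Here's a minimal sketch in Go; the handler is a hypothetical stand-in for your real router, middleware, and a stubbed database. Drop it in a _test.go file and run "go test -bench=.", which conveniently reports ns/op, i.e. time per request:

    package app

    import (
        "net/http"
        "net/http/httptest"
        "testing"
    )

    // handler stands in for your real stack: router, middleware,
    // template rendering, and (stubbed) database calls.
    func handler(w http.ResponseWriter, r *http.Request) {
        w.Write([]byte("ok"))
    }

    func BenchmarkHandler(b *testing.B) {
        for i := 0; i < b.N; i++ {
            req := httptest.NewRequest(http.MethodGet, "/", nil)
            rec := httptest.NewRecorder()
            handler(rec, req)
        }
    }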

Benchmark tables like the one under discussion range from "just barely positive" in value to a significant hazard of being substantially negative if you aren't very, very careful how you use the information.

Framework qua framework choice doesn't matter much anymore. It's dominated by so, so many other considerations, as long as you don't take the real stinkers.


There's a very wide band between "2G" and "unlimited" to explore.

Cell phone systems already have some tiering built in, at least based on the fine print I've read about my plans. Once I run out of "official data" I fall back to low-priority usage, but the cell system is generally so well-provisioned nowadays that I hardly notice. In 2026, one must take explicit action to force people back to 2G. Nothing would stop these plans from, say, simply always being "low priority usage" but at full speed, and for the most part this would satisfy everyone.

This sort of clause reeks of "it was written into a contract 15 years ago and nobody has even so much as thought about it since then" rather than some sort of deliberate choice.

