architect64's comments | Hacker News

One issue to watch out for: sub-4K monitors look surprisingly bad on newer versions of macOS with Apple Silicon Macs. And no, it's not simply a matter of non-Retina obviously not looking as nice as Retina monitors - something like a 1440p monitor will look much worse on macOS than it would on Windows or Linux. This is partly caused by the lack of subpixel text rendering on macOS, but it doesn't affect just text: app icons and other graphics seem to be optimized for high-DPI resolutions only and look awful too. You commonly see people using third-party apps such as BetterDisplay to partially work around this by tricking the system into treating a 1440p display as a 5K display and then downscaling, but even that doesn't solve it completely. So yes, the price for the machine is fantastic, but you may want to budget for a basic 4K display as well.


> you may want to budget for a basic 4K display as well

Best investment you’ll ever make. They’re not all that expensive. Having experienced 4k I feel impoverished having to return to lower resolutions.

I feel it’s a travesty that workplaces spend thousands on fancy desks and chairs and cheap out on bargain basement monitors.


> Having experienced 4k I feel impoverished having to return to lower resolutions.

That's what they said. I've been using Retina/HiDPI displays at work for close to a decade now. Still can't say I prefer one over the other. I have no problem seeing pixels, especially now that I've switched to Linux (KDE Plasma) at home. In fact I kind of like being able to catch a glimpse of the building blocks of the virtual world.

What actually does matter (for me) is uniformity and color accuracy. And you can't have that for cheap, especially not in 4K.


Is this with newer Apple Silicon Macs? My 2020 M1 Mac Mini looks unremarkably normal on my 1440p display. I'm also going between that and my 14" M1 Pro Macbook Pro, which of course looks beautiful but doesn't really make the 1440p on the Mini 'bad'.

Edit: Both of these machines are running macOS 15.1 at the time of writing.


In my experience, you can’t do any sort of scaling with sub-4K displays. This has been the case since M1. Intel Macs, even on the latest macOS, can do scaling, e.g. 1.5x at 1440p, though the last time I bothered with an Intel Mac this required a workaround via Terminal to re-enable.

But that workaround is “patched” on Apple Silicon and won’t work.

So yes, if you have an Apple Silicon Mac plugged into a 1440p display, it will look bad with any sort of “scaling”, because scaling is disabled on macOS for sub-4K displays. What you’re actually doing when you “scale” on, say, a 1440p display is running that display at 1920x1080 resolution - hence it looks like ass. Before Apple Silicon, running that 1440p display at “1920x1080” actually just scaled the UI elements up to appear as though you had a 1920x1080 display; since it was still utilizing the full …x1440 pixels of the display, “1920x1080” looked nicer than it does now.

So, brass tacks: it’s about how macOS/OS X obfuscates the true display resolution in the System Preferences -> Displays menu. With Apple Silicon Macs, “1920x1080” means “2x scaling” for 4K monitors, and literally “we’ll run this higher-res monitor at 1920x1080” for any display under 4K resolution.


> Back before Apple Silicon, running that 1440p display at “1920x1080” was actually just scaling the UI elements up

I’m almost sure that macOS can’t do that. It’s always done scaling by rendering the whole image at 2x the virtual resolution and then just displaying it on whatever screen you had. For example, for “looks like 1080p” on a 1440p screen it would draw onto a 2160p canvas (and do 2160p screenshots).
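
In numbers, here's a toy Python sketch of my understanding of that mechanism (illustrative only, not any actual Apple API):

    # "Looks like" scaling as I understand it: the OS renders into a
    # backing store at 2x the chosen virtual resolution, then downsamples
    # that canvas to the panel's native pixels.
    def scaled_mode(virtual, panel):
        vw, vh = virtual              # the "looks like" resolution you pick
        bw, bh = vw * 2, vh * 2       # HiDPI canvas the OS draws into
        pw, ph = panel                # physical pixels of the display
        return (bw, bh), bw / pw      # canvas size, downscale factor

    # "Looks like 1080p" on a 1440p panel: drawn at 3840x2160, then
    # squeezed onto 2560x1440 - a non-integer 1.5x downscale, which is
    # part of why it can look soft.
    print(scaled_mode((1920, 1080), (2560, 1440)))  # ((3840, 2160), 1.5)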


BetterDisplay does this. It adds HiDPI resolutions which render at 2x and then downscales.


Yeah I was never able to get that to work on M1/M2 Macs. Intel, sure, but none of the workarounds (including BetterDisplay) worked on the ARM Macs. Do they now? I last tried in 2022.


If your 1440p monitor looks “fine” or “good”, it’s because the scale is 1x - for many people, including myself, UI elements are too small at 1x 1440p. I had to buy a 4K monitor so I could have larger UI elements AND crisp UI elements.


You may just not be seeing the visual artifacts on your screen because you don't know what they look like, or mentally adjust to what that screen looks like.

The same way someone might not notice motion smoothing on a TV, or how bad scaling and text rendering look on a 1366x768 panel, or the different colour casts from different display technologies. All three took me a while before I could tell what was wrong without seeing them side by side.


> You may just not be seeing the visual artifacts on your screen because you don't know what they look like, or mentally adjust to what that screen looks like.

Does any of that matter, though? Who cares about hypothetical artifacts in their display that they can't even see?


It matters once you get used to something better. Our brains are really good at tuning out constant noise, but once you start consciously recognizing things, they'll remain noticeable. If your vision slips, you won't constantly walk around saying everything is fuzzy, but after using glasses you'll notice it every time you take them off. Low-res displays work like that for many people: back in the 90s, people were content with 800x600 monitors, but now that would feel cramped and blocky because we've become accustomed to better quality.


This is the biggest issue with Mac hardware at the moment. All because of a decision to make life easier for their developers (and third-party ones too, I guess), so they could claim to have figured out high-DPI before everyone else.

It comes at a large cost now: either pay more than is reasonable for one of the few compatible displays, or accept a much worse experience. That is just not good for devices at this price. This is why a big, affordable iMac is so necessary, but TC's Apple likes money too much to care about its legacy customers.

After such a long history of Mac OS having better font rendering and a generally better graphics stack (Quartz - everything is basically continuous PDF rendering), this feels like a big letdown.

The problem will improve as more high-DPI displays come to market, but it has taken a lot of time because most customers focus on other characteristics that are arguably more important for other use cases. There are plenty of premium displays that are good to great, but you really have to think about how they will work if you buy a Mac. Most likely you'll need to compromise, which feels bad considering the price of admission...


Wait, what kind of people are you talking about, and how niche is that target group?

You're saying Macs are expensive, but at the same time the potential buyers can't afford even a cheap 4K monitor? They go for like $200 now. And even if that group exists... it's not like 2560p is torture on a Mac, especially with that BetterDisplay HiDPI trick; I'd bet many wouldn't even notice the difference.


Unless you want to use your cheap 4K monitor as the equivalent of a 1080p display (which is not a lot of space by today's standards), it's not at all viable. 2560 is actually 1440p, and no, it's not very good, even with BetterDisplay, without even getting into the performance and rendering implications.

The fact is that if you buy an expensive Mac desktop, you need to also buy one of the few expensive displays that work properly with it; otherwise you get a degraded experience or a compromise, which is unacceptable for hardware at this price. We are in this situation only because of both engineering and commercial decisions from Apple.

Considering that they sold an entry-level 27" iMac for less than what the Studio sells for, well into 2021, the position is indefensible. Even if they wanted to make an external display, they didn't have to make it that overbuilt/expensive with no other choice on offer. It is a purely profit-motivated move: they want to extract a 50% margin on everything, the old iMac was a low-margin device, and that is really the only reason it doesn't exist anymore. Conveniently, all of that is propped up by an unnecessary engineering decision (they didn't have to remove subpixel rendering).

Why do you feel the need to defend a megacorp's terrible choices, made only to milk you as much as possible?

As for the niche thing: having installed/managed quite a few of those old 27" iMacs, I can tell you they were extremely popular, precisely because they were cost-effective. I think you largely underestimate how large the customer base for the 27" iMac was. As far as I'm concerned, it's far less niche than $650 headphones, but the difference is that they can milk a 50% margin on those, while a big iMac at that margin target would land at a price where most wouldn't even consider it.

So yes, you can get a cheap display to keep the price of a setup contained, but it's really not a great experience, and it doesn't make a lot of sense to compromise if you're buying an expensive computer in the first place. At this rate you can get an all-in-one 27" PC for not much more than a standalone display; it's not going to be great, but at least it's cheap. If we're going to compromise, at least do it right.


My Apple Silicon Mac is fine on a 10-year-old 27” 1080p display.


Can confirm: you absolutely need BetterDisplay and a tiny bit of elbow grease to configure the 5K clone to downscale to your real monitor. Not rocket science, but it could be more streamlined.

If you say it looks fine without it, I don't know what to say.


Is there a review that demonstrates and corroborates this issue? Is it a difficult problem to avoid if you're buying a new display for a Mac mini? My current display is 10 years old, so I'd have to get a new one anyway.


It's most visible with the MacBooks, because you have the Retina display and the low-DPI display next to each other.

In short: you probably want to get at least a 4k display anyway, but if you want to delay that, you should buy BetterDisplay. The difference is night and day.


My 7-year-old QHD monitor pair driven by an M1 Pro MBP still looks fantastic. Then again, I do spend most of my day in Apple's Terminal, so I'm not really in want of anything more. Some other sibling comments are saying Windows 10/11 looks crappy, and I agree; since I have to occasionally switch between the two, I just don't like working in Windows anymore, mostly because of the poor display.


I use both OSes on the same display, and Windows looks much better on an "old" non-Hi-DPI display, I can tell you that much.

I used to dislike Windows font rendering, but it's still better than what macOS gives you on "regular" displays. You can fix it somewhat with BetterDisplay, but still...


Modern versions of macOS don't support subpixel rendering (what Windows calls ClearType), which is why macOS will always look worse on low-resolution displays.

Using BetterDisplay to force a "2x" resolution will give you better rendering, but at the cost of lower usable/effective resolution.
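
For the curious, here's a toy Python sketch (my own illustration, not macOS or Windows code) of what subpixel rendering exploits on LCD panels, where each pixel is three vertical R/G/B stripes that can be driven independently:

    import numpy as np

    def subpixel_pack(coverage_3x):
        # coverage_3x: grayscale glyph coverage rasterized at 3x the
        # horizontal output resolution (values in 0..1).
        h, w3 = coverage_3x.shape
        assert w3 % 3 == 0
        # Adjacent column triples become the R, G and B stripe values of
        # one physical pixel, tripling effective horizontal resolution.
        return coverage_3x.reshape(h, w3 // 3, 3)

    # A stem only 1/3 of a pixel wide can still be positioned precisely:
    stem = np.zeros((1, 6))
    stem[0, 2] = 1.0                # lit stripe at 3x resolution
    print(subpixel_pack(stem))      # [[[0. 0. 1.] [0. 0. 0.]]]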


Yeah, sure, I know that; that's why I relate my experience to the result of those engineering choices.

It's pretty funny that you need special hardware for macOS font rendering to hold up in comparison to Windows.

Microsoft has a lot of problems, but they are far more pragmatic in their choices, delivering a "good enough" experience on most hardware. Whereas if you don't follow Apple's choices, your experience can go from great to barely passable in an instant.

Very different approaches, but as I get older, I understand why Microsoft's is popular and why they deliver more value.


If you have a 1440p 27" monitor, they work great.


Basically, that's operating at standard pre-Retina Mac DPI levels. The 27" Apple Cinema Display had exactly this resolution, as did the 27" iMac before it went 5K.

I agree, it works… fine. But sadly, more and more elements of modern macOS will look blurry/aliased because they are made with only hi-DPI in mind.

For example, all SF Symbols, as far as I know, are not shipped as pixel graphics but stored as vectors and rasterized on the fly. That works great at high resolution and makes them freely scalable, but on low-DPI displays they certainly look worse than a pixel-perfect icon would.


No, it looks great on my 1440p OLED. Windows, on the other hand, looks like ass in places like the old Control Panel.


There is an app called BetterDisplay that almost solves this. It has a mode that renders things at a higher resolution and fixes the text blurring.


Came here to echo this. Also, it always amazes me how many people respond to warnings like this (as seen in this thread as well) by saying lower-resolution displays look just fine. I returned an M2 Mac Mini solely because it looked so awful on all of my monitors -- I tried 2 different 32" 2K displays, plus a handful of 24" displays. Everything was fuzzy and awful looking. Not something that could be tolerated or ignored... completely unusable. I feel like this fact is not well known enough.

The fact that so many seem to tolerate "low-res" or "mid-res" displays on the current M-series Macs is really puzzling to me... maybe my eyesight isn't as bad as I thought it was and everyone else's is a lot worse!?

This new M4 mini is tempting enough that I might try a Mac again... but this time I am definitely going to have to budget for a 4k/5k display.


Honestly, I'm going to say skip 4K and go straight to 5K. They are not that much more. I have a 2x 5K setup and it is great. The main monitor is in normal orientation and the other is mounted on the left, rotated 90°, centered on the side of the first. I keep my work on the main one and all the documentation, chat, etc. on the vertical one. I hope to ditch the two-monitor setup next year and go to a single 8K display.


4K displays are the new standard. I can buy a 27" IPS 4K display from LG for $200. Anything lower-res is a boomer screen; get rid of it and move on.


Non-Apple displays have awful PPI, even the allegedly high-DPI ones.


How does that address the point made by the person you are replying to:

> something like a 1440p monitor will look much worse on macOS than it would on Windows or Linux.


As far as Mini PCs go, I'd consider Intel NUCs (currently NUC 12 Pro line), primarily because they have great firmware lifecycle support. A lot of cheaper brands in the NUC-like Mini PC space don't consistently release firmware updates (if at all), e.g. to fix security vulns, which is a deal breaker for me. Intel NUCs are validated for Ubuntu and RHEL. The main downside is that you'd be relying on an iGPU for gaming, so do some research on whether the Core i5-1240P's Xe iGPU would be able to handle the games you're interested in.


At first I thought the same, but their North American dataset includes countries in Central America and the Caribbean. The US does a relatively good job within its borders, but could do more to support waste management efforts in less wealthy countries.

https://ourworldindata.org/grapher/ocean-plastic-waste-per-c...


Related: There's a startup with a laptop focused on repairability, without sacrificing much thinness or style: https://frame.work/about


Did the headphone jack get removed because everyone wanted it gone, too? :P


They asked for thin and water-resistant. Apple removed the headphone jack to achieve that. People were initially surprised, but Apple was right that they didn't really need it anyway, and they're fine with it now. So much so that other manufacturers copied it. People got what they wanted!


This is incorrect; there are several phones that are water-resistant and thin despite having a headphone jack.

The primary motivation was to help sell AirPods (which customers then have to discard and replace every year because of battery degradation).


> The primary motivation was to help sell AirPods

This is just cynical and uncharitable.


Lol!


I think you need to apply Occam’s razor here:

The iPhone is massively, massively successful, iteration after iteration.

Are Apple doing that by not giving people what they want?

That doesn’t seem likely to me.


I don't think you can chalk Apple's success up to the hardware, or at least not just the hardware.

Despite my... dislike for the iPhone ecosystem and the way Apple runs their App Store, it does produce a level of quality and safety that most other systems don't. It works pretty well for people who don't want to have to google things.

I think that for some people that would be worth it even if they don't get the hardware they want (so long as it meets some theoretical "good enough" threshold).


Is it not possible that they give people _almost_ all of what they want, while making one or two changes that are detrimental but not quite dealbreakers?


This is a ridiculous argument against the claims above. If you've invested hundreds of dollars in iOS apps, maybe have HomeKit and other Apple products, Lightning cables, and use iMessage, etc., then the fact that the headphone jack was removed and batteries are not replaceable while the products still sold is not proof that this is what consumers wanted; there are too many trade-offs in play.

The only way to truly tell what consumers wanted would be to go back in time and release two nearly identical products: an iPhone X with no headphone jack and a non-replaceable battery but water resistance, and an iPhone X with a headphone jack and a replaceable battery at the same price but without water resistance, and see which people choose. I suspect at the time most would have chosen the replaceable battery and headphone jack.

Also, BTW, there are phones that are waterproof and have a headphone jack. Those two things are not mutually exclusive.


Apple are giving the best option for their userbase compared to other products on the market. That doesn't mean it's all, or even mostly, what these users want. It only means other products give less.


I enjoy the Axios news format; it's pretty concise, and you have the option to dig deeper into a given topic or event.


I don't know about prettiest; I'm a big fan of the 787's sleek, flexible composite wings. :P


I've been thinking about the same thing since Google impressed everyone with their computational photography tech years ago. The software in DSLRs is really primitive in comparison.

2014: https://ai.googleblog.com/2014/10/hdr-low-light-and-high-dyn...

2017: https://ai.googleblog.com/2017/04/experimental-nighttime-pho...

It really makes you wonder what could be achieved by combining Google's computational photography algorithms with a bigger sensor and high-quality optics. I'm probably not going to buy another high-end camera until I see something like that on the market.
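
To make the idea concrete, here's a toy Python sketch of the burst-merging trick at the heart of HDR+-style pipelines (a naive global-shift version; the real pipelines do tile-wise alignment and robust merging, and none of this is Google's actual code):

    import numpy as np

    def estimate_shift(ref, frame):
        # Phase correlation: the peak of the inverse FFT of the normalized
        # cross-power spectrum gives the global translation between frames.
        cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
        cross /= np.abs(cross) + 1e-12
        corr = np.abs(np.fft.ifft2(cross))
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        h, w = ref.shape
        # Map wrap-around peak positions to signed shifts.
        return (dy - h if dy > h // 2 else dy,
                dx - w if dx > w // 2 else dx)

    def merge_burst(frames):
        # Align each frame to the first, then average: stacking N frames
        # cuts shot/read noise by roughly sqrt(N), which is where much of
        # the low-light magic comes from.
        ref = frames[0].astype(np.float64)
        acc = ref.copy()
        for frame in frames[1:]:
            dy, dx = estimate_shift(ref, frame)
            acc += np.roll(frame.astype(np.float64), (dy, dx), axis=(0, 1))
        return acc / len(frames)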


Indeed, a traditional camera with the post-processing of a smartphone would be incredible.

But indeed, camera manufacturers are way, way behind. The best they can do at the moment is find faces in a picture (and even that not particularly well, except on Sony cameras). This is disappointing, to say the least.

The Zeiss ZX1 runs Android, but I don't know how much computational photography it provides, since all the reviews about it only emphasize the fact that it can run Lightroom (which sounds completely useless).


There is, for example, Magic Lantern (https://magiclantern.fm/about.html) for Canon DSLRs, which lets you do some more interesting things; it's been around for a decade or more.


For what it's worth, the Superfish and LSE BIOS scandals didn't apply to ThinkPads. I think Lenovo understands that they have too many serious business and gov clients using ThinkPads to risk doing something silly like that to their professional-grade ThinkPad brand.


> For what it's worth

Not much in my book. The problem isn't Superfish, the problem is leadership that allowed it.


ThinkPad is under rather different leadership from Lenovo's consumer division that had the Superfish debacle on IdeaPads and the like. Sure, they are part of one corporation at the very top, but you don't have to go very far down the org chart before they split into separate teams and leadership.

ThinkPad is from the old IBM teams in Raleigh and Yamato. Lenovo made their own laptops before buying IBM's personal computer division, and that line (and its management) became IdeaPad.

If you're troubled by leadership that would allow Superfish (as I am), buy a ThinkPad, not an IdeaPad.

Previous discussion:

https://news.ycombinator.com/item?id=20240533


I shouldn't have to learn about the internal structure of a company in order to buy a laptop without malware.

Maybe Lenovo should have thought about their internal structure and their brand reputation before installing malware on their laptops, or maybe not (because they don't care about clients like me; they care about the 90% of bosses who buy ThinkPads in bulk and don't know what firmware is). But anyway, it wasn't a rogue engineer who did it, it was Lenovo, and in my eyes: Lenovo ships malware.


Of course it's up to you to decide what computer to buy or not to buy, based on whatever criteria you see fit.

But I don't think you're doing yourself a favor by ruling out ThinkPads just because of a boneheaded decision that Lenovo's consumer division made a few years ago. ThinkPad and IdeaPad really are two separate organizations under one corporate umbrella.

Superfish was not something handed down from on high, it was the bright idea of the consumer group. The ThinkPad team would never go along with something like that; it's not in their DNA and it would destroy their business. Their bread and butter isn't you and me, it's large organizations with IT and security departments who deploy hundreds of ThinkPads at a time and look very closely at the software on them.

Only offering food for thought, it's cool with me whether you buy ThinkPads or something else. :-)


Personally, I agree with the op. If we want to send a message that malware in our BIOSs is absolutely unacceptable, it makes zero sense to give Lenovo any business.


I don't see how boycotting ThinkPads sends a message that BIOS malware is unacceptable. ThinkPads never had that, and never would.

Anyway, I don't usually buy or not buy a computer to send a message. I buy one because it meets my business and personal needs. I've been using ThinkPads for over 20 years, and they have served me very well.

You may choose differently, and of course that's fine.


> I don't see how boycotting ThinkPads sends a message that BIOS malware is unacceptable

It sends a message to other manufacturers: add malware at your own peril. I frankly consider it unethical to buy or recommend products from companies, like Lenovo, that have demonstrated anti-consumer behavior, because doing so perpetuates that behavior; companies assume consumers will forget or forgive them.

> ThinkPads never had that, and never would.

That is speculative. I can't know that whatever harmful and irrational environment led to Superfish on the IdeaPad side won't affect ThinkPads in the future. Even under the most generous reading, where IdeaPad is a different, physically separate branch of the company and Superfish was an act of incompetence rather than outright malice, I can't be expected to keep up with the company's internal intrigue to notice any changes that could negatively affect me. More importantly, leadership is still responsible for setting the irrational environment that led to Superfish, whatever that environment was. This is a multi-billion-dollar company; there is no excuse for such incompetence.


> I think Lenovo understands that they have too many serious business and gov clients using ThinkPads to risk doing something silly like that to their professional-grade ThinkPad brand.

I think that they rely on their A team to develop the malware targeting business and government clients, rather than the C team responsible for Superfish.


For enterprise networks, you can still analyze the pattern of packets to match fingerprints of C2 traffic. There are solutions, such as Cisco's Encrypted Traffic Analytics, that do just this: https://www.cisco.com/c/dam/en/us/td/docs/solutions/CVD/Camp...
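
A minimal Python sketch of the underlying idea (my illustration, not Cisco's implementation; "capture.pcap" is a hypothetical file): even without decrypting payloads, the sequence of packet lengths and inter-arrival times of a flow forms a fingerprint that can be compared against known C2 profiles - beaconing malware tends to show unusually regular gaps and uniform sizes.

    from scapy.all import rdpcap  # pip install scapy

    def flow_features(pcap_path, max_packets=20):
        # Extract the kind of feature vector these systems consume: the
        # sizes and timing gaps of the first packets of a flow.
        packets = rdpcap(pcap_path)
        lengths, gaps = [], []
        prev = None
        for pkt in packets[:max_packets]:
            lengths.append(len(pkt))
            t = float(pkt.time)
            if prev is not None:
                gaps.append(t - prev)
            prev = t
        return lengths, gaps

    lengths, gaps = flow_features("capture.pcap")  # hypothetical capture
    print("packet sizes:", lengths)
    print("inter-arrival times:", [round(g, 3) for g in gaps])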

