I'm not trying to be a dick, but a few things said here are just wrong, or at least explained in a way that is totally misleading. Below I'll try to correct those things, with examples where I can. No names, just the facts. And before you try to argue back, understand that I've been doing this for more than 20 years, and for about a third of that time it was my job to know this stuff so I could advise and train our customers on it. To be fair, I've never had to deal with all the crap that goes on with high-DPI images inside an application, only with how screen technology affects what we see and how we see it.
On devices with low PPI, antialiasing helps the cosmetic appearance of text at certain sizes (at very small sizes it's disabled, because all it does there is make the text blurry, harming readability). It sometimes helps readability for some people, but some people also hate antialiasing (I'm not one of them).
However, the higher the display PPI, the more pixels you have to work with and the less antialiasing is actually needed (to the point where it becomes invisible to the human eye or can be left out entirely).
Anti-aliasing makes vectors (like text) look better; it's just that at high DPI the effect is more subtle because adjacent pixels are closer together. In fact, anti-aliasing benefits from high-DPI screens because it can be applied more aggressively (with different algorithms) and noticed less.
Because desktop computer displays historically had 96 pixels per inch or less (72 or less in fairly old stuff), the pixels are too large and too few to fool the eyes into seeing smooth curves. Antialiasing was invented to improve the cosmetics (and sometimes readability) of the many curves used in text on a low-PPI display.
Sorry, but that's not how it has ever worked. Historically, computer displays were CRTs, which don't have any pixels at all. CRTs are analogue devices; it's just that computers feed them digital output, so you end up viewing pixels because that's what the computer generates. Even 25 years ago it would have been possible to run a tiny CRT (e.g. a point-of-sale screen) at 4K or even higher resolution, if you had enough graphics power to drive it. Displays only gained fixed physical pixels when we transitioned from CRTs to flat screens. Since then it has been preferable to match the computer's output resolution to the screen's fixed native resolution. In the early days flat panels weren't great - backlighting was poor and it was hard to get good screen contrast or bright colours - but things have improved immeasurably since those days.
PPI on a flat-panel monitor is widely variable. e.g. Whether I look at the output of my Surface Pro on its 10.5" screen or on my 32" TV, it is still putting out 1920 x 1080 pixels, so the PPI of one is completely different from the other. The 72/96 DPI thing is something the OS uses internally to work out the relative size of bitmaps on screen; it doesn't necessarily relate to the actual PPI you see when you look at a monitor.
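If you want to put numbers on that, the arithmetic is simple: physical PPI is the diagonal resolution in pixels divided by the diagonal screen size in inches. Here's a quick sketch using the two screen sizes from my example above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Physical pixels per inch: diagonal pixel count / diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Same 1920x1080 signal, wildly different pixel densities:
print(f'10.5-inch Surface Pro screen: {ppi(1920, 1080, 10.5):.0f} PPI')  # ~210
print(f'32-inch TV:                   {ppi(1920, 1080, 32.0):.0f} PPI')  # ~69
```

Same signal, but one screen packs roughly three times as many pixels into every inch as the other - which is exactly why a fixed "96 DPI" assumption in the OS can't describe what your eyes actually see.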
Antialiasing smooths the staircase appearance (aliasing) by inserting extra pixels into the empty 90° spaces between the point-to-point arranged pixels.
This is not how anti-aliasing works, simply because there are no "extra pixels" to insert anywhere. What anti-aliasing actually does is smooth sharp boundaries between colours by mixing them together at the edges. e.g. If I have a white line on a black background, anti-aliasing will take the pixels along the edge and blend them, so you get a row of pixels that are grey. Depending on the type of screen and the algorithm used, you might get the edge of the black turned dark grey and the edge of the white turned light grey, or one or the other edge turned 50% grey. On a high-PPI screen you may even get multiple rows of pixels along the edge recoloured for a super-smooth edge that would look soft and blurry on a less dense screen.

I've made an image in Photoshop, where I can use different AA algorithms to get the effect I want. It's white text on a black background. I zoomed in 1000% and took screenshots so you can see the effect on individual pixels. The left side shows no AA, the centre uses the "Crisp" setting most people usually use, and the right-hand one uses the "Mac LCD" setting, which is the softest available: https://imgur.com/a/pW3fAXW
Hmmm... it looks like KVR won't display images from Imgur for some reason, so you'll have to follow the link and look at it there. But you can clearly see how it has turned some of the pixels around the edges grey to make the text look smoother and more rounded. If you zoom out to 10%, you'll see it at the size it was originally created.
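If you'd rather see the blending in numbers than in a screenshot, here's a toy sketch of the idea. It uses coverage-based supersampling, which is just one common way of doing it - I'm not claiming this is Photoshop's or ClearType's actual algorithm - but it shows how edge pixels end up grey instead of snapping to pure black or white:

```python
import math

# Draw a ~1px-thick white line on a black background, two ways:
#  - "aliased": each pixel is either fully on or fully off
#  - "blended": each pixel's brightness is the fraction of it the line covers
# Coverage is estimated with a 4x4 grid of samples inside each pixel.

M, B = 0.35, 1.0     # line y = M*x + B, in pixel coordinates
THICKNESS = 1.0      # line thickness in pixels
SAMPLES = 4          # 4x4 sample grid per pixel

def coverage(px: int, py: int) -> float:
    """Fraction of pixel (px, py) lying within THICKNESS/2 of the line."""
    norm = math.hypot(M, 1.0)   # normalises the point-to-line distance
    hits = 0
    for i in range(SAMPLES):
        for j in range(SAMPLES):
            x = px + (i + 0.5) / SAMPLES   # sample point inside the pixel
            y = py + (j + 0.5) / SAMPLES
            if abs(M * x + B - y) / norm <= THICKNESS / 2:
                hits += 1
    return hits / (SAMPLES * SAMPLES)

for py in range(6):
    row = [coverage(px, py) for px in range(12)]
    aliased = "".join("##" if c >= 0.5 else ".." for c in row)   # hard on/off
    blended = " ".join(f"{round(c * 255):3d}" for c in row)      # grey levels
    print(aliased, "|", blended)
```

The left column prints the hard-thresholded staircase; the right column prints the blended grey values (0 = black, 255 = white), and you can see the intermediate greys appearing exactly where the line crosses a pixel partially - no pixels were "inserted" anywhere, existing ones were just recoloured.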
This is subjective. BONES' opinion on what constitutes "amazing clarity" is entirely different from my own. Fair enough.
The whole idea of the ClearType utility (not the use of ClearType in the OS itself, but the calibration utility you use to personalise the experience) is to take everyone's different eyesight into account and produce the perfect result for YOU. For your eyes. So it is only subjective in that it understands we all have different vision, and what works for me might not look good to you. But if yours isn't perfect for YOU when you're done, it's because you didn't do it properly. Seriously, it's like the difference between going to an optometrist and getting a pair of prescription glasses tailored to your own vision, versus using those $5 glasses you can buy at a petrol station that just magnify everything. In fact the process is very similar to an eye test: it shows you two lines of text, you pick the one you think looks best, then you repeat the process a few more times until everything is properly optimised. It has to be better than a one-size-fits-all approach like you get in every other OS.
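For the programmers here, the logic of that tuner boils down to a simple knockout loop of pairwise comparisons. This is only a rough sketch - the "contrast" parameter and its candidate values are hypothetical stand-ins, since the real ClearType Text Tuner adjusts internal rendering settings we never see directly:

```python
# Hypothetical sketch of a "pick the better of two" calibration loop,
# like an optometrist's "1 or 2?" test. The contrast values below are
# made-up stand-ins, not real ClearType parameters.

CANDIDATES = [1.0, 1.2, 1.4, 1.6, 1.8, 2.0]

def ask_user(a: float, b: float) -> float:
    """Show two renderings and return whichever the user prefers."""
    choice = input(f"Which looks better: [1] contrast={a} or [2] contrast={b}? ")
    return a if choice.strip() == "1" else b

def tune(candidates: list[float]) -> float:
    # Knockout rounds: each comparison's winner faces the next candidate,
    # so the final value is driven entirely by the user's own eyes.
    best = candidates[0]
    for c in candidates[1:]:
        best = ask_user(best, c)
    return best

if __name__ == "__main__":
    print("Your preferred setting:", tune(CANDIDATES))
```

The point is that nothing in the loop assumes what "good" looks like - the user's repeated choices are the only input, which is exactly why the result is personal rather than one-size-fits-all.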