I have one. The US version "only" has a 4800U, but it was still an absolute steal at US$800. It's a nice size, feels solid, and lasts all day easily. I've used it all day more than once and not seen the battery go below 50% yet. Then again, I haven't really pushed it that hard. I'm sure if I played some games it would "only" last five or six hours. And no, I haven't had any issues with the screen. In fact I turn it down. If you're staring at a 500nit screen at full brightness all day, you're probably not doing your eyes any good.
Unfortunately this model seems to be sold out or perhaps even discontinued (until the European 4900U version becomes available, maybe). Meanwhile, the 14″ E14 Gen 2 is very similar in both specs and price.
> If you're staring at a 500nit screen at full brightness all day, you're probably not doing your eyes any good.
I'm kinda surprised how many people set their screens to 100 % brightness (300-400 nits for most screens) on their desktop. I find that blinding and very uncomfortable. That also seems to be a reason why people complain so much about IPS bleed and glow; using the screen near full brightness in a dark room for gaming or movies. Personally I find the "0 %" setting on some screens where that's around 50 nits too bright for that (ahem LG).
It’s hard to change the brightness of desktop monitors. Bafflingly, Windows and macOS don’t support adjusting their brightness the way they do for laptops, despite the existence of DDC/CI to do just that, so you’re left using third-party software if you know about it, or interacting with the awful OSD/buttons on the screen, which you’ll hardly want to do all the time.
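(For what it's worth, on Linux that third-party route can be as simple as ddcutil; a rough sketch, assuming the monitor actually speaks DDC/CI and the i2c-dev module is loaded:)

    sudo modprobe i2c-dev    # expose the I2C buses that DDC/CI runs over
    ddcutil detect           # list displays that respond to DDC/CI
    ddcutil getvcp 10        # read current brightness (VCP feature 0x10)
    ddcutil setvcp 10 40     # set brightness to 40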
For Windows users, I can recommend ClickMonitorDDC[1]. While the UI is a bit cluttered, it has a neat feature:
You can display the current brightness in the notification area, hover over it with your mouse, and use the scroll-wheel to adjust it. I really like it.
I actually think I tried this one yesterday, and from memory it spammed my traybar with about 12 different icons. The UI isn't just cluttered, it's odd!
Still, it's a nice demonstration of what you can do with DDC, and just about anything is better than the crappy physical controls on monitors.
you can disable them in the settings. it gave me icons for brightness, contrast, saturation, and volume. I only care about setting brightness so I disabled the others
I remember when TV sets and monitors had dials to adjust brightness, contrast and volume. They became useless on TVs because remotes are better, but they'd still be very useful for monitors that stay all the time within reach of the user. Much better than menus.
I recently discovered MonitorControl[0], which is a Mac app that listens to your default brightness and volume keys and pushes updates to all connected monitors over DDC. Extremely happy with it.
I'll quote an HN discussion from a few weeks ago on why this isn't more prevalent.
>There is a problem with "cheap" monitors and DDC/CI: some of them use EEPROMs to store brightness settings, and this limits you to about 100,000 writes.
Worrying about this is the main reason we don't ship DDC/CI with f.lux. (I know that some more modern monitors use NAND and don't have limitations like this.)
I dunno. The OSD on Dells is pretty simple. You press the topmost button twice, up/down to adjust, lowest button to save and exit. Since my office has windows, I've been doing this two to four times a day on most days for years. Doesn't bother me too much (it's pretty quick though because I largely only use the 0-40 range, which is about 20-120 nits). On my LG it's even quicker, push the joystick back, then forward/back to change, push in or timeout to save and exit.
Some people say they'd like ambient light sensors, but I'm not sure I'd wanna use something like that. Sometimes I do find the changing brightnesses on mobile devices irritating.
Changing this directly in the OS would be a better UX, though.
That might work if you only have one display. But on more than one that gets tiresome real quick.
On linux I use ddccontrol to control the brightness on all displays at once (using a simple for loop in the terminal).
I plan to write a script for it that would allow me to bring up a dialog and enter brightness from just a shortcut, but this is good enough til that itch comes.
I'm sure there are alternatives for all operating systems.
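For reference, a minimal sketch of that kind of loop plus the shortcut/dialog idea (the i2c device paths are just examples; ddccontrol -p probes for the right ones, and zenity is only one way to get a quick input dialog):

    #!/bin/sh
    # ask for a brightness value (0-100), then push it to every monitor
    b=$(zenity --scale --title="Brightness" --min-value=0 --max-value=100 --value=50)
    for dev in /dev/i2c-4 /dev/i2c-5; do        # example paths; find yours with ddccontrol -p
        ddccontrol -r 0x10 -w "$b" "dev:$dev"   # 0x10 is the brightness VCP control
    done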
I have a Dell U2415 which is great for this. You can create custom colour/brightness/contrast profiles and assign them to hotkeys. I have one nice bright one for dark terminals or when the sun's out, and a darker lower contrast one for when I have a bright white website or document to read and the office is a bit dimmer. It's two quick button presses to change between these presets.
I think I'd quite like something which adapted brightness and contrast to ambient light as well as the brightness and contrast of what's on screen. With mobile devices this can be a pain as you move around and in and out of shadows, but I feel it could work a lot better on a desktop display. The display could even have the light sensor on its back so it can work out what its backdrop looks like.
I haven't had a monitor in ~5 years, but the ones I owned had physical buttons for adjusting brightness/contrast/gamma/etc... Do new monitors come with no in-built controls?
I paid a significant markup for an Eizo monitor, not because I care about the color accuracy and all that, but for an OSD that I don't mind using to adjust things several times a day.
> I'm kinda surprised how many people set their screens to 100 % brightness
Not everyone knows that this impacts your eyes. Someone called me and asked if I knew anything about glasses that would restrict blue light and make his eyes feel better. After I told him a bit about f.lux, we ended up finding out that his brightness was at 100%. He lowered it to 50% and said: thank you, much better.
I've been staring at a screen for decades, but only fairly recently found out about the harm of blue light, and am using f.lux with the "soft light" effect.
Aside from blue light, is it bad for your eyes just because of the brightness level? Is there any kind of objective measure of how bright is "safe"? And if so, is there any way of knowing how many nits a monitor is emitting?
AIUI the problem is mostly the blue light. I can't imagine that becoming habituated to staring at a bright light is all that good either, but I don't know of any specific studies on that. It's also the kind of thing that only shows up statistically after twenty years, as a higher incidence of eyes being unable to accommodate to lower light levels, etc. Nobody's going to look at a too-bright screen for a day or two and immediately notice a drastic difference.
Do you have blue eyes? I do, and in summer I can't look at the ground without being blinded. I read somewhere that the lack of pigment lets more light through. It comes with the upside of seeing better when it's dark. I use redshift on all my computers plus switch to a different colour profile on my monitor at night. On mobile I use the night mode all day.
I don't have blue eyes, but when I was younger I could read in the dark and hated high-brightness monitors. Now that my eyes are older I've lost the ability to focus, need glasses, and sometimes need the monitor to be brighter to help with focusing.
I don't think I ever used 100% brightness on LED backlit displays. They're just too bright. Mine are currently sitting at 75%, and I had AOC monitors (with LG panels heh) that were too bright at even 50%.
Sadly, anything below 50-60% and I notice the PWM, sort of flickering, and it's really tiring - I know I'm not buying the most expensive stuff but I'd like them to use better backlight controllers.
The last time I talked with an eye doctor (some months ago), he mentioned that the ideal brightness should match that of the environment around you. If it's darker than your surroundings your eyes work harder, and working without ambient light (a dark room) is bad because you blink less!
Screens that can go brighter are probably better for us, because it means we can use it in sunnier / brighter environments properly, with all the health benefits that sunlight can give us.
That's more of an argument for reflective/transflective screens. Trying to overcome both sunlight and reflections with a transmissive screen doesn't work very well, and only exacerbates the eye-health problem. Not great for battery life either. That's a lot of problems to be solved before there's even a chance that people would use their computers more outdoors, and even then the odds are slim. Maybe people should get outside away from their screens.
That's like saying that we shouldn't read books on the beach. You have to meet people where they are.
A secondary reflective screen on the outside of a laptop might be the better solution, but from what I remember reflective screens usually don't have good colors. The last reflective TFT screen I remember has been hard to find an outdoor picture of, though.
> That's like saying that we shouldn't read books on the beach.
It's nothing like that. People do read books outside. They don't use laptops outside. (Statistically speaking.) "Meeting people where they are" means two very different things in those different contexts.
> reflective screens usually don't have good colors
Yes, you have to pick your tradeoffs. If you want an outdoor-viewable screen, that will probably mean some sacrifice in refresh time, resolution and/or color gamut. This is why the only place you do find reflective screens is dedicated e-book readers. Brighter transmissive screens are not a solution in this problem space, so this use case does not justify them.
I use my desktop display at 100% brightness because turning it down means its great color and contrast capabilities go to waste due to its traditionally backlit nature… it’s like downgrading my monitor to a cheaper model.
This effect isn’t nearly as strong on my phone’s OLED. With that, I can turn brightness down quite a long way and still have great color and contrast.
OLED/microLED desktop monitors can’t come soon enough.
I always used to set my monitor to 100% brightness, back when I wore glasses. I had an eye operation 2 years ago, and from the first day I could tell that it was too bright. Normal prescription glasses have light filters that keep you from noticing this kind of thing.
Same here. In the evenings and at night with warm white lighting, 12% brightness and 10% contrast; during the day usually between 25% and 33%; and even in the brightest direct sunshine at noon, no more than 50%.
I wasn't entirely sure about the mere R5 4500U in the laptop I bought recently, but it tears through everything I throw at it. The only thing that stopped me from getting one of the higher-end R7s was that all the laptops with it and a screen with full sRGB coverage were persistently sold out, but I don't feel like what I got falls short with any of the tasks I put it to.
Whatever dark forces AMD aligned itself with for the latest chips was worth it.
I've had it a bit over a week. The E14 Gen 2 still seems like the closest alternative - slightly slower processor, more memory, less battery. Unfortunately the Labor Day sale is over so it's around $1200 now.
For people interested in ultrabooks with Ryzen 4800U processors, there is another model that for some reason doesn't get mentioned much: the Lenovo S540 13ARE. It has a 13.3", QHD, 16:10 screen.
I bought this laptop after buying a Lenovo Ideapad/Yoga Slim 7 and returning it (because of some QA issues and 14" was a bit too big for me after using XPS 13 for a long time). I made a small review of the laptop here: https://www.reddit.com/r/Lenovo/comments/ilcw5n/lenovo_ideap...
Unfortunately the machine is not available in Europe outside of the Netherlands for some reason.
I tried calling a Dutch retailer to ask if they ship to Austria but couldn't even get past the "Do you speak English please?" phase :(
Like WTF, the EU single market has been a thing for how many decades now?! So why the hell do we still have region-specific SKUs of the same product with different parts and availability between EU member states?! Imagine if the laptop were available for sale in California but not in Utah, and in California you could only get it with a 512GB Samsung SSD and a 300-nit display, while in New York it only came with a slower 1TB SK Hynix SSD and a 400-nit display! /rantover
Because charging different countries different amounts of money for the same SKU is illegal in the EU, so instead every country gets its own SKU, and they don't ship to other countries.
You found the ONE Dutch person who doesn't speak English? I went there about 15 years ago, and everybody I met, including old ladies in the street, spoke much better English than I can (I'm French).
I don't think the guy didn't speak English; I think he was too rude or indifferent to bother offering customer support in English at that moment. You see this in Austria too if you try to get customer support over the phone in English: lots of employees are stressed by this kind of job, so some just won't bother with you, with a "not my job, I don't get paid to be a translator" attitude, if you ask in English.
That reminds me of when I visited Vienna many years ago. I went to the information counter at the train station. I didn't want to be rude and presume the information person spoke English, so I asked if he spoke English. His bemused reply was "Yes, I speak English and 7 other languages".
Many European countries have specific keyboards so for laptops you need a dozen or more models just for Western Europe. I wouldn’t want to accidentally get a German or French SKU when I’m buying one from Sweden so I’m quite happy there are different SKUs.
I love my Lenovo X1 Carbon with a 14" QHD display.
QHD is not a popular resolution but it's absolutely perfect for this size, like you mention. It's better for battery life than 4k and it's much crisper than 1080p.
With a 13"-14" screen, you would have to scale a 4k display which is a complete waste in my mind. You either lose out on sharpness by getting fuzzy fractional scaling, or you go full 2x scaling and you lose all the screen real estate of a high pixel density display. 1440p is beautifully sharp without requiring scaling on a 14" panel.
I'm pretty confident you're in the minority, since reading text on that thing will strain your eyes in almost no time. I have 20/20 vision, and sure, I could use a 4K screen at the same size for a few hours, but it wouldn't be comfortable.
I just wanted to mention a development regarding display colors. Yesterday I was fiddling with the AMD display settings and disabled the "Vari-Bright" feature just to see what happens. I had never tried disabling it before, because I read that it helps save battery life by lowering the brightness depending on the content on the screen. But to my surprise, disabling it not only changed the brightness but also affected the colors in a good way, and now I'm much happier with them. I don't know why they enable this feature by default when it makes the colors noticeably worse.
I don't really think cloud vendors are holding their breath to switch to ARM, they are all heavily invested in X86. All their code has been built on it for forever, and there's a big advantage to having your dev machines running the same architecture as your cloud production machines. I think desktop ARM adoption would have to happen before the server market moves to the same.
AWS has multiple generations of its own ARM processor. I can't say anything about it, as I have no experience with it. I know it's at least two generations in and they claim 40% cheaper for similar workloads.
What I can say is that this level of investment is large.
> their code has been built on it for forever, and there's a big advantage to having your dev machines running the same architecture as your cloud production machines
Is that really so? Are remote development and emulation not sufficiently advanced yet?
Super hot paths might be x86 optimized, but how much does that really matter? I'd think at the scales of the big providers nothing matters more than performance/power use and performance/price.
There's still no real justification for vendors to even offer ARM instances; if anything, with remote development and emulation on the rise, the case for just sticking with x86 becomes clearer.
> There's a big advantage to having your dev machines running the same architecture as your cloud production machines. I think desktop ARM adoption would have to happen before the server market moves to the same.
The new Apple machines are coming out next year, right?
Intel's mobile chips (below 25W) are still great for performance and power consumption, because those are the only products that use 10nm and Sunny Cove cores, and their low-power optimization is mature.
My colleague got last year's X395 with the AMD 3000 series. It only lasts 3.5 hours on battery with web browsing and moderate coding, whereas it's supposed to last ~7 hours real-world on Windows.
I can't speak for the new series, but I have a couple of Ryzen 3400Gs and I'm still not sure I've got the iGPU working and being utilized 100% properly yet.
The drivers are a huge mess of confusion (what goes into user space vs the kernel? What do you really need? What even goes into the host OS vs containers if you run Docker? What, if anything, can you actually get out of it without installing the proprietary, non-free, closed-source AMDGPU-PRO?), AGESA updates are needed to avoid kernel module crashes, oh, and those updates are left up to motherboard vendors, some of which are great, some of which will make you feel like you bought a lemon. And then there's the whole mess with Mesa that I think is only just resolved (20.1) and hasn't yet made it to LTS distros.
I'm def not an Intel fan, but man, 100%-working Intel drivers are an apt install away, and I had both forgotten just what a PITA ATI was with Linux and couldn't imagine AMD hadn't stepped up its game at all.
Unless anyone has anecdotal evidence otherwise, make sure you set aside a couple of working days to hunt down and compile the right kernel modules, make sure the vendor provides recent enough firmware, and/or have patience.
In short, I wouldn’t hesitate having a new Ryzen for a headless server, or a desktop rig with a dGPU. For a smaller desktop, laptop, or anything else requiring use of the iGPU I would wait a year. Unless you’re one of the few people either already up to speed on all this or finding some absurd pleasure in learning about it, in which case I really do hope you post your process in a blog or forum where other users will find it through web searches.
My Ryzen laptop has a 3500U CPU. I did a normal install of Fedora 32 KDE, and everything works, though with one annoyance: occasionally a single-pixel-thick line, maybe 10-50 pixels long, won't update.
I'm not sure what you are talking about. I installed Arch Linux last week on a new machine and all I had to do was pacman -Sy mesa and that's it. Actually, that's a lie. I installed a DE which installed mesa as a dependency automatically.
You'll want to run a recent kernel, >=5.8, for a Renoir chip. I get maybe 75% of the advertised battery life on my Fedora Rawhide install.
Granted, mine is an IdeaPad 5 with an AMD 4500U, but it's been terrific. There are a handful of bugs still, but nothing that prevents productive work. The worst one is where turning down the backlight too fast kills the backlight entirely, but you can fix it by just tapping the brightness-up key once and then going back down to your target.
Fedora 32 works reasonably well, just some minor hiccups, such as:
* There is no driver for the Goodix USB fingerprint reader yet
* Occasionally thin lines of pixels don't update correctly (hard to describe, might get a photo soon)
* dmesg logs periodic errors with amdgpu and the new driver for the realtek wireless card, but I haven't noticed any negative functional impact associated with these.
The latest AMD APUs are not fully supported by official AMD drivers on Linux just yet. I also ran into issues with Lenovo's IdeaPad with Bluetooth (Ubuntu 20.04) and WiFi (Ubuntu 18.04).
I bought the Chinese version, and PopOS as an Ubuntu derivative is working quite well. You need the mainline kernel for the screen brightness adjustment to work though.
> Those results were a game-changer. They’re miles better than I got running the same load on the HP Envy x360 (around eight hours), the Dell XPS 13 (seven and a half hours), the Asus Zephyrus G14 (almost nine hours), and even low-power stuff like Lenovo’s Chromebook Duet (11 and a half hours) for which battery life is a major selling point. I’ll be blunt — this is the longest battery life I’ve ever seen from a laptop. It’s astonishing.
For anyone else wondering about the battery life "over 11.5 hours"
>Running through the multitasking load that I described earlier, in battery saver mode at 200 nits of brightness, the Slim 7 lasted 13 and a half hours. On the Better Battery profile, it lasted over 11 and a half hours. Remember: I was not going easy on this thing — you’ll certainly get even more juice if you’re just clicking around a tab or two.
In the T440 or so models you could have three batteries (one internal, one regular, one ultrabay) in a Thinkpad for a claimed 30 hours or so battery life with two hot-swappable batteries... :)
I have an ASUS G14 with an R7 4800HS and while nine hours are possible (at ~100% it shows exactly 10 hours), you'd have to set the brightness to a pretty low level and forget about doing anything challenging.
But six hours with your IDE open, brightness at the default level for battery mode and compiling a Node.js + TypeScript project from time to time is something you can reasonably expect to be able to do.
Currently, even though I have my charging set to max out at 80%, I don't look at the charge indicator too often because I know that I have a good few hours before it's time to plug in.
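(For anyone wondering how to set that kind of cap on Linux: recent kernels expose a charge threshold through sysfs on machines whose firmware supports it; a sketch, with BAT0 as an assumed battery name:)

    # stop charging at 80% (needs driver/firmware support; may reset after reboot on some machines)
    echo 80 | sudo tee /sys/class/power_supply/BAT0/charge_control_end_threshold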
If you're in a dim environment, a glossy screen has slightly better contrast and colours. If you have a light source or bright objects nearby where their light falls on the screen, a glossy screen has significantly better contrast and colours, as long as the screen and you are positioned such that you don't see the reflections. If you can't avoid reflections, e.g. because you're outside in bright sunlight, a glossy screen is pretty much useless. But if you mostly have a notebook so you can take it to different indoor workplaces with suitable lighting, and/or so you can move from your desk to a couch or bed (e.g. to watch a movie), the reduced glare can be very nice to have.
So the thing is, if you never had one yourself and only observed others using them while away from their preferred work spaces, chances are you've literally seen glossy screens in a bad light :)
Of course, if your use cases are all text based, the potentially better picture quality of a glossy screen is indeed rather pointless. Either option is a compromise; what's better depends entirely on where and for what you use the device.
Glossy screens generally have more vibrant colours (matte is inherently somewhat muted), and in some scenarios they have better contrast and brightness.
(Also: PRETTY! and APPLE! are important psychological factors as well :P. Joking aside, many people will look at a vibrant screen and how pretty it looks and make their decision without considering the specs, details and use cases.)
For myself, in most situations glossy screens mostly mean I can see everybody behind me better than I can see my work; but they do have their place, especially for professional design/video/photo work IN (and this cannot be overstated) a well-controlled environment (basically, make everything BUT the screen non-reflective / control the light :).
I tried to put one on my 13" MBP a while back, because I hate how glossy and reflective the screen is. Was horrible to apply, and I ended up with lots of small scratches in the laptop display simply by gently pushing out the air bubbles with a card... I won't be applying a screen protector again in a hurry!
> The desaturation and softness are (AFAIK) inherent to matte screen surfaces and the reason manufacturers nowadays tend to avoid them
If it has to be a choice, then I choose matte regardless.
I bought two for the kids, and while they are fine and I'm happy with the value, I still despise Windows Home edition (I had to block one from my router during setup so I did not have to make a Windows account, for instance).
I really really wish Apple would make a reversible 2 in 1. I can’t tell you how much of a better experience that form factor is for young kids. iPads are not a replacement for this.
Linux Mint (Windows-like) is free and elementaryOS (macOS-like) is pay-what-you-want, $0 if you so desire (https://elementary.io/). It takes less time to flash a drive and boot from it than to endure one Windows update on my desktop (although you can also get GNU preinstalled on fairly ThinkPad-comparable laptops from e.g. "Laptop With Linux" https://elementary.io/store/#devices, and ThinkPads themselves will likely ship with GNU soon; many Lenovo PCs are already certified for compatibility with multiple major distros).
Minetest is a similar free (libre) game, btw, but very mod-centric, with mods loadable from servers without installing them (as easy as Roblox) and (also like Roblox) written in Lua; it may be a fun intro to programming if they're keen to try it (of course their friends probably don't play, but a couple of boys I know, around 7-10, enjoyed it to a kind of harmful level).
A similarly specced ThinkPad or Latitude is built to last, has user-replaceable components and will not fall apart in 2 years after a warranty expires.
From my experience you get what you pay for with those professional machines.
And a similarly specced Thinkpad will be perfectly usable if purchased in 2-3 years as they cycle out of enterprise fleets, and for a better price than the notebook in the article. Not a satisfying option for everyone, but a great one in my book.
This site[0] is going a little out of date but has good info for several-generations old Thinkpads right now. As a sibling mentioned, Ebay is your friend. The other nice part about Thinkpads is that they are quite user serviceable and you can usually buy any part you need for reasonable prices.
My IdeaPad 540 has two M.2 slots and replaceable RAM (I replaced the 8GB it came with with a 32GB stick). The battery seems very easy to replace, too (only one additional screw IIRC).
I got a ThinkPad E495 (Ryzen 5 3500U, 128GB SSD / 8GB RAM, IPS FHD) for about $360 after a huge $110 cashback (in Japan); that's a steal. The great thing is that it has 2x SODIMM, 1x SATA, and 1x M.2 slots, so I added a cheap 512GB SSD and 16+8GB of RAM rather than paying for the expensive BTO options.
I have an HP ENVY x360 15-ee0002na. I like it a lot. The battery is OK, even the screen is OK, and it has 16GB of RAM.
Trying to find a Ryzen 4xxx with 16GB of RAM in the UK is quite hard! Plenty of Intels though; it's almost like Intel flooded the market, or no one wants Intel.
No, it's the opposite: it's not a flood of Intel devices, it's a shortage of AMD devices.
AMD had to pre-book fab capacity at TSMC years in advance and didn't expect the shortages caused by the pandemic, made worse by WFH demand, while Intel can make as many chips as it wants to fulfill market demand since it owns its fabs.
It's a shame because the 4800U laptops are either sold out or going for huge markups right now.
Asus PN50. On pre-order (early Oct), but I gather the Aussies already got theirs. If you do go that route, google the RAM carefully... there seem to be compatibility issues (JEDEC vs XMP).
Also, I think the Ryzen 5 is probably the better value, but it wasn't available to buy so I went for a 7.
I believe ASRock is also going to bring out similar stuff, but I don't know the details.
I have an IdeaPad S540 API. It's a Ryzen 3500U. It has 2 M.2 slots, so you can add a second SSD if needed. I upgraded it to 32 GB and boy, does this beast fly. And it's really cheap.
Under Pop_OS with minor tweaks the battery lasts about 5 hours, which is pretty good for a Linux laptop.
Think of turning off Wi-Fi energy saving: my Wi-Fi speed went from 50 Mbps to more than 300.
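(In case it helps anyone, a sketch of how to turn that saving off; the interface name wlp1s0 is just an example, and the persistent variant assumes NetworkManager:)

    # one-off, until the next reconnect
    sudo iw dev wlp1s0 set power_save off
    # persistent with NetworkManager (2 = disable powersave)
    printf '[connection]\nwifi.powersave = 2\n' | sudo tee /etc/NetworkManager/conf.d/wifi-powersave.conf
    sudo systemctl restart NetworkManager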
I've got basically the same machine as you (ThinkPad E495, 3700U, upgraded M.2 SSD, upgraded to 32 GB RAM), and while it's great, the 4000 series is a game changer. I really wish I could have waited, but my old laptop had other plans.
I wonder if they've improved the screen hinges? I've given up on slim Lenovos, as they had these weak metal-bonded-to-plastic hinges which are easy to break through regular use. The screen snaps off the base and the case just gets bent and twisted. They seem very poorly designed and are hard to fix.
These processors are great and OEMs could offer features users want, but they've still been offering only mid-range or gamer-oriented builds for everything else.
Colloquially, “high DPI” has a fairly specific meaning: it means “designed to be used with a scaling factor of at least 2 (and definitely uncomfortable to use below a scaling factor of 1.5)”. 1920×1080 on 14″ does not meet this definition.
And when it’s capitalised, High DPI, as it was in the parent comment, it’s definitely referring to this definition.
Back in the days when 1366×768 and 1280×800 were common sorts of resolutions and 1920×1080 was the highest available (that is, before Apple’s Retina displays), perhaps you could have said 1920×1080 was “high DPI”, but people didn’t use the term “high DPI” back then. And certainly not “High DPI”.
I understand it is good for programming. Can it handle video editing? I am trying to decide between the T14 and the P1, which is almost twice the price of the T14.
Do you need more performance benchmarks? Here are my experiences on Linux (NixOS).
I've done some light Blender work on it and it handled things just fine. Haven't tried GPU acceleration for Cycles, though.
If I bump up thermal limits to 95 degrees then it compiles things at similar speeds to my old i7-6700k machine, if not slightly faster. At stock 60 degrees it's still pretty damn fast, but aggressively throttles if you try to use all cores at once for more than a few seconds. At stock thermal limits it also never gets too uncomfortable to handle.
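(If anyone wants to try the same on their own Renoir machine: one userspace route is RyzenAdj; a sketch, not necessarily the exact method used here, and the firmware/EC may clamp it back:)

    # raise the thermal throttle target to 95 C (resets on reboot)
    sudo ryzenadj --tctl-temp=95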
The GPU is good but not great. Satisfactory via Proton struggled (but got to 50 FPS on low settings), while older games like Portal 2 run great on ultra settings. amdgpu works pretty well, or at least no worse than amdgpu on my desktop.
Overall, it's a good replacement for my old workstation, it can handle some non-trivial video/GPU workloads, and I recommend it. There are some things that still need to be ironed out (ACPI S3 sleep is currently somewhat broken, but Lenovo is working on certifying this machine with Ubuntu, so that should fix things), but that's mostly because it's such a new CPU/laptop.
Thank you. I really want an AMD laptop, but they don't seem to have good displays. If I am going to stare at the screen 10+ hours a day, I'd like a good screen.
My last couple of laptops have had IPS panels. I intend never to buy a TN panel again.
My last laptop was a 15″ 1920×1080, and my current is a 13″ 3000×2000 (and I love the aspect ratio). I intend never to buy a laptop with a 1920×1080 screen again.
At some point I’m afraid I may end up with a ≥120fps screen and rarefy my tastes still further. (I hear good things about them, but have never seen an LCD with such a frame rate.) Fortunately screens with both a high frame rate and a high resolution are still vanishingly rare.
I just hope someone comes out with a good screen on one of these—I barely even care if it’s super expensive; because it’d be a real shame to have to decide between a good screen and a good CPU.
For desktop use 120 vs 60 Hz is noticeable on Windows (obviously very noticeable on the mouse cursor, but that doesn't change the UX much), but because DWM has basically optimal latency as far as dragging windows around goes, it's not that big of a difference. On Linux it's a pretty huge difference since Linux compositors aren't as good as DWM. Basically, Linux with 120 Hz feels like Windows on 60 Hz.
First thing I usually switch off on Linux is the compositor. It's not that bad typically (you notice latency only if you look for it), but why add unnecessary latency...
I loved wobbly windows in Ubuntu a decade ago, on my first laptop. Switched to i3 in Arch Linux with no compositor on my second laptop. I think wobbly windows was the only thing I missed from Compiz.
> screens with both a high frame rate and a high resolution are still vanishingly rare
They're huge power sinks. If you need to play games on the laptop for some reason and can have it plugged in all the time, it works. But 4K often takes 1-2 hours off the battery life, and the high frame rate will likely have an impact too.
I don't mind 1080p, or 60hz, but I'd love proper HDR OLED for work, so I can have high contrast together with lower brightness.
Remember when 99% of laptops priced around $800 or lower were automatically doomed with a 1024x768 screen, and to get a higher resolution you'd be paying at least $300-400 more?
Things were like that when I wanted to get my previous laptop in 2014 or so; you want 1366×768? Great! We have laptops from AU$400 onwards. You want 1920×1080? Here, enjoy our tiny range of AU$1,500+ laptops, all of which are heavy and power-hungry with dedicated graphics cards because you must want that, right? I mean, why else would you want a decent screen?
It's not quite so strong these days with 1920×1080 or even with 4K panels, but the segmentation is still definitely real. The feature segmentation, things like pairing dedicated graphics with better screens (even when the screens could easily be driven by integrated graphics), is particularly distressing, because they're making you pay more for things that you either don't care for or actively don't want, just to get other things you do want.
> I intend never to buy a laptop with a 1920×1080 screen again
Same here and WOW, it's frustrating. There are so many laptops that would be amazing if it wasn't for their screens. Paradoxically 17" models are almost all crap.
Ugh, also those times when >99% of 15″ models were 1366×768 or similar, while 11″ ultrabooks were happily shipping 1920×1080 or even higher. Fortunately that’s almost behind us now.
I recently bought an HP 455 G7 with a Ryzen 4300U and an IPS screen. It's not the best screen, but miles better than TN-film panels anyway. Good for a budget office laptop.
I ordered the 445 G7 with 4750U over a month ago. Shortly after, I was notified the estimated delivery was pushed out to a specific date a couple weeks later than original.
A couple weeks later, I got notified that due to supply issues for certain components, the date was pushed back without a specific date, and I could cancel if I wanted or take a discount when it finally shipped.
This week HP said they cancelled my order altogether, with a list of cancelled business laptop SKUs. All of them were because of a lack of AMD CPUs.
Supposedly I can get a discount on something else, but there's just not much to pick from.
It's a tad ironic (to me) because unless you max out the scaling of the display, none of the MacBook displays have an effective resolution >= 1920x1080. Apple has been defaulting to fractional scaling the past few years rather than true Retina, which is 200% scaling, but even then it falls short[0].
However, 200% scaling is crisp and I can appreciate it. I just don't like losing all that real estate. And the fractional options on MacBooks aren't bad, but I can see text fuzzing out when it's not on the real Retina resolution. So when I'm running without an external display I do bite the bullet and deal with the lower resolution because otherwise I can feel my eyes straining.
I've heard that macOS fractional scaling really does render at one size and then upscale or downscale it to the target size. I'm not certain this is true because I haven't confirmed it myself and it seems such an obviously stupid idea (and though it's certainly easier, no one else does it that way because it's such a terrible idea), but I've heard people say this at least three times (twice on the internet, once in real life), about text not being crisp at fractional scaling. I dunno.
It always renders at integer scale and always downscales to the target size (upscaling would result in blurriness; the only upscaled apps are those that support only the @1X scale).
Just take a screenshot of your desktop and check its resolution, then compare it to the physical display resolution. Apple uses the output scaler on the final, composited image.
It's not stupid; it is a solution that you can implement without support on the application side, and it is relatively simple. Going the Android way means that all apps have to support arbitrary scales, which means they have to ship assets for that.
Both upscaling and downscaling result in blurriness, though upscaling will generally yield slightly worse results. But downscaling is still going to yield a result drastically inferior to rendering at the correct size. It totally butchers pixel-perfect lines, for example. It’s the sort of hack that would be awful on low-resolution displays, and only becomes even vaguely tolerable on high-resolution displays because it’s still somewhat better than a low-resolution display for a lot of what people are using their computers for, even if for others it renders it legitimately unusable.
If this really is true, I remain utterly baffled, and I maintain my position that it is an obviously stupid idea. Doing it that way just makes no sense to me. The visual result is way worse, it’s more resource-demanding and thus slows things down a little, and it doesn’t really simplify anything for app developers anyway—the only difference is that you have an integer scaling factor rather than a float scaling factor; but all code is still having to perform scaling mappings, and using floats would change roughly nothing (though the changes required in your APIs may need to propagate through a few levels, and GUI libraries will have to decide how to handle subpixel alignment). Windows and Android have both done it properly, so that supporting fractional scaling is no burden whatsoever for developers. You talk of having to ship assets for arbitrary scales, but that’s not a reasonable argument: GUI libraries should always choose the most suitable version of a resource, and scale it to the desired size themselves if it doesn’t match exactly.
The result of taking the proper approach is that users of fractional scaling may get icons being rendered poorly, but images, text, vector graphics, &c. will be rendered precisely and crisply. Meanwhile, this other behaviour people are saying Apple is doing is just guaranteeing that everything is rendered poorly. Surely they’re not actually doing this? Is it perhaps a case of them having erred in making Retina support integral scaling only, but they’ve since made a better version that supports fractional scaling that each app can opt into, but they just haven’t insisted on everyone fixing their stuff yet? (And remember, Apple’s in an excellent position to do such insisting—they do it regularly.) —But as you say, screenshots are scaled at the next integer, which would suggest that yeah, they’re actually doing this mangling system-wide, and there’s no per-app recourse. Thanks for that explanation.
I just find it hard to believe that Apple would truly butcher things this badly. Even if they’ve been known to do weird things a bit like this before, like killing off their text subpixel rendering with no stated justification, to the clear detriment of low-DPI users (and it may still be worthwhile even for high-DPI users).
I can’t check any of this because I don’t use a Mac. There may even not be a macOS device within a kilometre or two of me.
I understand your position; however, practice has shown that this approach is good enough quality-wise. Most users don't notice. In some respects it is better than the approach you suggest would be, because it takes the entire framebuffer into account at once. It has fewer problems with pixel-perfect lines, the subpixel mouse cursor, etc., than a purely software solution, which struggles with these more.
Also, it is not more resource-demanding; the only cost is the bigger framebuffer. The scaling itself is free: it is done by the output encoder (that is, the output hardware that does the encoding for eDP/DP/HDMI; it doesn't use the GPU at all for that[1]). Apple has one more trick: it doesn't offer the user zoom percentages the way Windows or GNOME on Linux do. You cannot do 125% on Apple hardware (that's a bad corner case; you'd need to display 8 framebuffer pixels using 5 physical pixels AND pay the price of the @2X framebuffer). The default mentioned above (1440x900@2X) means displaying 9 framebuffer pixels using 8 physical pixels. It doesn't compound the error at all.
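To spell that arithmetic out (a quick sketch; 2560 is the 13" MacBook Pro panel width, used here as the example):

    # default "looks like 1440x900": rendered @2X = 2880 px wide, panel is 2560 px wide
    echo 'scale=3; 1440*2/2560' | bc    # 1.125 -> 9 framebuffer px per 8 physical px
    # a hypothetical 125% setting (logical 2048 wide, still rendered @2X = 4096)
    echo 'scale=3; 2048*2/2560' | bc    # 1.600 -> 8 framebuffer px per 5 physical px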
We may discuss whether Windows did it correctly and the Mac didn't, but the fact on the ground is that all Mac apps run correctly on fractionally scaled displays while Windows apps are a mixed bag. Even those apps that do support HiDPI on Windows have weird bugs (I'm not going to name & shame).
Which gets us to another one of your points: that apps can scale their assets. Sure, they can. And as we can see, every app will do it incorrectly in its own unique way. So when every app does that, why wouldn't the system library do it for them? Any bugs you fix in one place, and you might find a way to hardware-accelerate it in a way that the respective apps couldn't (see above about the scaler in the output encoder).
As I wrote, this solution is good enough quality-wise, is simple to implement, and brings results quickly. Actually, it is so good enough that Apple uses it in iPhones too (on some models the physical and logical resolutions do not match; the logical resolution is the integer-multiple one).
[1] For Intel hardware, you can find more info in the Programmer's Reference Manual, Volume 12: Display Engine.
I have both a 15" MacBook Pro (2018 model) and a Lenovo Thinkpad Carbon X1 open in front of me right now. I have 20/20 vision and it really is not striking me as that much better. I spend quite a bit of my day in a terminal and the remainder looking at the web, so font rendering definitely matters. Even with Linux's abysmal font-rendering, the mac really doesn't feel like it has the edge.
Wait, what? It's somewhere in the middle between Windows (too pixel-fitted) and macOS (too blurry). FreeType with slight hinting and LCD filtering on is the best font rendering that I know of.
20/20 vision is only average for the population when you get to 60. So most people here will see significantly better than that. At 20/20 1080p is probably enough. Most people will enjoy a 1440p screen and 2160p will have marginal gains but it may be helpful to be able to use 2x scaling instead of needing fractional steps. But that depends on how you drive it.
Font rendering mattered on low-DPI monitors, but IMO it doesn't matter much for HiDPI monitors, because techniques like anti-aliasing are no longer needed. Fonts themselves are still important.
Hmmm, as an engineer, I have my vision checked every year. While a difference is discernible, it's not enough that I'd "miss it" if I didn't have a Retina display. (I just ordered a Dell XPS 15 this weekend; I purposely left it at FHD as I didn't see the need for 4K on a laptop.)
I'm on a three-year-old MBP right now. Next to me is my new IdeaPad Slim 7. Other than Slim's screen being a bit smaller, I hardly notice a difference.
I would have ordered an AMD Lenovo T14s if they hadn't artificially segmented it by not offering the 4K screen available on the Intel version. Hopefully AMD can increase the volume for these chips and more configurations show up.
> Our reviews generally leave extensive synthetic benchmarking to others
That left a really bad impression - I get that they are too lazy to actually measure the performance, but the snooty "synthetic" was uncalled for and frankly disrespectful to the people doing the work they are too lazy for.