Only the T and X series benefit from the Japanese design studios though, and have the build quality to match. The E and L series are indistinguishable from a myriad of bargain-bin business laptops, including Lenovo's own IdeaPads.
1) That the arrival of LLMs on the communication scene means real human beings need to change their style as a result.
2) That nobody can make an LLM talk like Cleetus McFarland.
To me, "I know that text is AI-generated" accusation smacks of the "We can always tell" discourse in the transphobia space. It's untrue, distasteful, and rude.
"just develop a personality" sounds like a shallow dismissal. Most comments in most threads could theoretically be autogenerated when given style samples of what fits on HN and what opinion to use
A personality hardly shows through in a handful of sentences; besides, I'd rather judge comments on their merit than by the personality of the poster (hacker ethics, point number 4: https://en.wikipedia.org/wiki/Hacker_ethic#The_hacker_ethics).
It's not just AI-generated articles -- it's the other things that we delve into as a result. Listicles. Comments. Posts. It's what it means to be human, and honestly? That's rare.
I've found it's a much better quality-of-life setup to just ssh into a remote Linux box from a Mac. The BSD userland on macOS isn't bad at all, just an adjustment... and Homebrew lets you set up your environment however you'd like.
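For reference, a minimal ~/.ssh/config for that workflow (host name and address are hypothetical; adjust to your box):

    # ~/.ssh/config (hypothetical host; adjust to your setup)
    Host devbox
        HostName 192.168.1.50
        User dev
        # keep the session alive and reuse connections for fast repeat logins
        ServerAliveInterval 60
        ControlMaster auto
        ControlPath ~/.ssh/cm-%r@%h:%p
        ControlPersist 10m

After that, "ssh devbox" just works, and anything that rides on ssh (rsync, editors with remote modes) picks up the same entry.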
I am curious how long Apple is going to continue to support XQuartz though. There seems to be no equivalent Wayland project.
I use UTM; it's simple and seems lightweight. I can share my source directory with the VM, so I can edit in PyCharm on macOS and test the containers in the VM.
What really caught me out was that I once downloaded an x64 image (there was no arm64 one) and it somehow just ran anyway in the arm64 VM. That may have been some qemu magic?
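Poking around later, it looks like qemu user-mode emulation via binfmt_misc: the kernel hands x86_64 binaries to a registered qemu-x86_64 interpreter, which is what distros set up when qemu-user-static is installed. A quick way to check from inside the guest (standard Linux paths, nothing UTM-specific):

    # check whether an x86_64 binfmt_misc handler is registered in this guest
    from pathlib import Path

    handler = Path("/proc/sys/fs/binfmt_misc/qemu-x86_64")
    if handler.exists():
        # shows the registered interpreter path, magic bytes, and flags
        print(handler.read_text())
    else:
        print("no x86_64 handler; try installing qemu-user-static")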
I love the macOS/virtualised Linux dev workflow, but it isn't better than plain Linux. I'm just still not convinced GUI stuff works as well on Linux as it does on macOS, and MacBook hardware is so nice (if you're not the one paying for it).
Well, the costs had to be cut somewhere. At least they put a headphone jack in it, so they're doing better than Microsoft on that front (who inexplicably removed it from the Surface Pro line).
I don't think this is intentional cost-cutting. I simply think the chip was primarily made for devices with one port (iPhone, iPad), and this is a bit of an afterthought.
I wouldn't be surprised to see the next generation get two 10 Gbps USB 3.2 Gen 2 ports with DisplayPort support, maybe on an A19 Pro or A20 Pro, if the product has enough success.
This is going to be a primary complaint people have (even if it's not terribly important). Hopefully they have some circuitry that warns you if you're plugging something into the wrong port (e.g. a USB 3 device plugged into the slower USB 2 port).
> This is going to be a primary complaint people have
No. Most people never plug anything into their USB ports where they'd notice a speed difference. Definitely not people picking up a $600 MacBook for school or casual web browsing.
I'd bet 90% of folks never do anything other than charge through these ports...
I don't think people will have many complaints about this thing, but I do think this will be one of the primary ones (even though it's basically a non-complaint).
It will definitely be used to justify spending $300-1500 more for a better laptop.
The same things MacBook Pro users plug in where they'd see the USB speed difference. Just because someone isn't privileged enough to afford a MacBook Pro instead of a Neo doesn't mean they don't have the same dreams and desires.
For monitors at least, if you plug into the wrong port you will get a GUI prompt advising you to use the other one [1]:
> while the ports aren’t labeled, if you plug an external display into the “wrong” port, you’ll get an on-screen notification suggesting you plug it into the other port.
I hope they added (or will add) that feature to other Macs too; on mine I had to try different ports and check the settings to find the one that could go beyond 60Hz.
If they want to get these things into schools it would be insane to expect the schools to also supply everyone with AirPods or some other kind of wireless headphones.
AirPods have too much latency for playing music. You want wired audio for running GarageBand or Logic Pro with a MIDI controller. They could have gone with a USB-C-to-audio adapter, but then you wouldn't be able to plug in the MIDI controller and charge the computer at the same time.
... Only a few people make music with a Mac, but it's been an important part of its history, and Apple cares about it.
> ... Only a few people make music with a Mac, but it's been an important part of its history, and Apple cares about it.
This seems to be a recent phenomenon. A lot of electronic music production uses a Mac with a Logic/Ableton workflow, to say nothing of how many of the best DSPs were Apple-exclusive until about a decade ago. I don't really think music production, at least in the EDM and hip-hop world, got popular on the PC until the rise of Fruity Loops (later FL Studio), and that's available on the Mac now too.
This is 2026. iPhones use standard USB-C headphones, you can charge your phone via MagSafe while using your wired headphones, and you can even buy low-end $59 Beats Flex headphones that have all of the Apple magic.
I’m going to need HN geeks to get over analog headphones from the 60s
I've never had a USB-C port fail, even with many of them being plugged and unplugged multiple times a day for years. At most they fill with dust you have to fish out. Aux ports would often get into a state where you had to very carefully position the jack for it to work.
I am a huge 3.5mm jack defender and I am still upset at how Apple created a post-USB-C world. But this is a common misconception.
USB-C headphones and 3.5mm headphones (and Bluetooth, USB-A, etc.) are all equally "analog" in the end (with the exception of someone with all-analog equipment, of course).
You need a DAC somewhere between the chip producing the digital signal and the speakers playing the analog one, so the sound quality you get depends on (among other things) the quality of that DAC.
With USB or Bluetooth headphones, the DAC is somewhere in the headphone. With the 3.5mm jack, the DAC is behind the jack. If you have a device with a crummy built-in DAC giving you a noisy signal, you'll be better off using a USB DAC.
I haven't used Apple's USB-C earbuds, but Apple does make a $10 USB-C-to-3.5mm DAC that performs very well for its price point.
The difference is you can always buy USB-C headphones with a known-good, consistent DAC. A 3.5mm headphone jack serves no purpose in the age of USB-C; even my wife's mixing board has a USB-C input she can plug her iPhone into.
Next thing, HN folks are going to want the iPhone to come with a SCSI port.
And technology moves on either way. There is not a single high-end phone that still comes with a 3.5mm headphone jack in 2026. The number of people who care in 2026 is probably smaller than the number of people who want to run Linux on their phone.
Yes, but that's different than what we're saying. I think many more people want and use 3.5mm jacks than they do SCSI ports. The 3.5mm jack is excellent. We're in a thread about a new device released with this wonderful port.
Also, many people want to run Linux on their phone. About 7 in 10 smartphones run Linux, and smartphones are devices billions of humans use every day.
We are in a thread on HN where people complain about not having root access on their iPhones, want to run Linux on everything, and bemoan the fact that most websites don't work with JavaScript disabled.
This is as far from the mainstream as you can possibly get.
Come September it will have been a decade since Apple dropped the headphone port - the world has moved on
I would very much like root on my phone, and most of the websites I use don't require JavaScript. Apple hasn't dropped the headphone port; they even announced a new product today, the MacBook Neo, with one. There is even a thread on HN about it :)
Or I can just not do stupid shit and listen to hi-fi headphones released in the past 2-3 years, many of which have a 3.5mm jack (and adapters for the larger 1/4-inch plug, if you're plugging into DACs/pre-amps).
Which you said aren't being made anymore. Which is factually untrue. The best bit is, they're still being made! And there are plenty of people still buying them!
Why? Because a $170 pair of closed-backs sounds infinitely better than the $550 Bose QuietComfort Ultra nonsense.
The FiiO FT1 (32Ω) being a prime example, if you are looking for closed-back suggestions :^)
No, I said high-end phones aren't coming with headphone jacks any more than they're coming with SCSI and VGA ports. I'm sure it would be convenient for you if the iPhone came with left and right 1/4-inch audio jacks.
Why do they need to sound better? Also, in a lot of instances, they do sound better because they can offer powered functionality such as ANC. Can’t get that with a truly analog headphone. I’d never use analog headphones on a plane, for instance.
Low-end wired earbuds come in packages of dozens of units. I buy cheap earbuds because my kids love breaking them. Not everyone optimizes for the same thing. Analog remains the bee's knees in certain settings.
The Blue Origin skepticism is based on how many decades they spent making buildings instead of rockets, and how long it took them to get anything to orbit.
> So, autocomplete done by deterministic algorithms in IDEs are okay but autocomplete done by LLM algorithms - no, that's banned? Ok, surely everybody agrees with that, it's policy after all.
Because autocomplete still requires heavy user input and a SWE at the top of the decision-making tree. You could argue that using Claude or Codex enables the same thing, but there's no guarantee someone isn't vibecoding and then not testing adequately to ensure, firstly, that everything can be debugged, and secondly, that it fits in with the broader codebase before they try to merge or open a PR.
Plenty of people use Claude like an autocomplete or to bounce ideas off of, which I think is a great use case. But besides that, using a tool like that in more extreme ways is becoming increasingly normalized and probably not something you want in your codebase if you care about code quality and avoiding pointless bugs.
Every time I see a post on HN about some miracle work Claude did, it's always been very underwhelming. Wow, it coded a kernel driver for out-of-date hardware! One that doesn't do anything except turn a display on... great. Claude could probably help you write a driver in less time, but it'll only really work well, again, if you're at the top of the hierarchy of decision making and manually reviewing code. There are no guarantees of that in the FOSS world, because we don't have keyloggers installed on everybody's machine.
Yes, actually. Knowingly violating the policies of a project while pretending you aren't, so you can continue participating in the fully voluntary project, does make you a jerk.
If you don't like the policies they set, just leave.
I'm willing to bet that every single person on here complaining has zero contributions to postmarketOS.
If you only knew how the enterprise space does stuff, you'd realize how low a priority maintainability is.
I'm grateful we had Java when this stuff was taking off; if those enterprise applications had been written in anything else available at the time (like C/C++), we'd all be suffering even more memory leaks, security vulnerabilities, and data breaches than we do now.
Now that's interesting, because I come from a world where enterprise-level stuff was all done in C/C++ until quite recently, and with the shift to "web technologies" the quality of virtually everything has dropped through the floor, including the knowledge and skill level of the developers working on the tech. It's rare now to see people who have been working for more than 10 years post-graduation, if they went to college. The college grads have been pushed out by lower-quality, lower-skilled React developers who really do not belong in the industry at all. It's really a crime how low things have gotten, and in such a short time: 10 to 15 years ago there were people with 2-3 decades of experience all over the place. Not anymore.
Except this is not the age of the Rockefellers or the Carnegies, who, despite being far more philanthropic than modern-day billionaires, drew ire from every corner of society for their wealth accumulation. It wasn't until the New Deal that the balance shifted.
Unconstrained accumulation of capital into the hands of the few, without appropriate investment into labor, is illiberal and incompatible with democracy and true freedom. Those of us who are capitalists see surplus value as a compromise to ensure good economic growth. The hidden subtext is that the wealth accumulated needs to be re-allocated to serve not only capital enterprise but the needs of society as a whole. It's hard to see the current system as appropriate for that, given how blindly and wildly investments are made, with no due diligence or long-term view, and no attention paid to the social or environmental opportunity costs of certain practices.
A lot of this comes down to the crippling of the SEC and FTC, but even then, investors cry and whine every time you suggest reworking the regs to inhibit some of the predatory practices common in this post-80s era of hypernormalization. Our current system does not resemble a healthy capitalist economy at all. It's rife with monopsony and monopolistic competition, inequality of opportunity, and a strained underclass that's responsible for our inverted population pyramid -- how can you have kids when we're so atomized and there is no village to help you? You can raise kids in a nuclear family if and only if you have enough money to do so. Otherwise, historically, people relied on their communities when raising children in less-than-ideal circumstances. Those communities are drying up.
> Those of us who are capitalists see surplus value as a compromise to ensure good economic growth.
I think the problem is that every system of economics requires ignoring human nature in order to believe it can possibly work. To believe that capitalism doesn't lead to despotic rule, you have to ignore the fact that civilizations love a good hierarchy far more than they love justice and fairness.
You can make any system of economics work if you figure out how to deal, head on, with the particular human nature factor that it tries to ignore.
I don't really believe that strongman rule and hierarchy are inherent to human social structure, or at least not in the way we do it. Some level of hierarchy is inevitable, but the longest-lasting and most stable hierarchies were somewhat bureaucratic and highly meritocratic (think imperial China and its civil service exams), and our system is extremely bureaucratic and not meritocratic whatsoever.