An easily visible one is air intakes. Many manufacturers have shifted to plastic. Petro-engineering has advanced a lot, but plastic intakes will still get brittle and break.
Interior-wise, you can look at things like fabric durability-- lower-denier fabrics can be cheaper, but will wear sooner. Springs and foam in seats are another example, though this varies across manufacturers, models, and trims.
This isn't exclusive to manufacturers known for financial engineering, like Stellantis or Nissan, either. Toyota has had issues with simple things like rust proofing (whether intentional or not) on 1st-generation Tacomas leading to massive recalls, and with plastic timing guides prone to wearing out. Ford has its wet-belt engines, with timing belts running submerged in oil.
German cars needing body-off access for rear timing chain maintenance at 80k miles. Water-cooled alternators (really, VW?). All sorts of "why?" moments if you follow cars once they are 3+ years old.
It seems like there are a lot of regressions that probably result from cost cutting, while others may exist to simply drive service revenue.
In the United States, the Environmental Protection Agency assumes the typical car is driven 15,000 miles (24,000 km) per year. According to the New York Times, in the 1960s and 1970s, the typical car reached its end of life around 100,000 miles (160,000 km). Due in part to manufacturing improvements, such as tighter tolerances and better anti-corrosion coatings, in 2012 the typical car was estimated to last for 200,000 miles (320,000 km) with the average car in 2024 lasting 160,545 miles according to the website Junk Car Reaper.
I think you're comparing apples and oranges, as the parent appeared to be cataloguing recent design defects. Which are pretty common, too.
That'll influence the average reliability minimally, unless you were unlucky enough to buy one of those models.
Personally, that's why I'd rather buy something at 120k miles once 250k+ mile examples are already on the road by that calendar date. By then, you'll know whether they designed a lemon.
Add: undersized Tacoma rear leaf springs, multiple manufacturers' head gaskets, a few early aluminum engines (? from memory)
There are many other considerations, too. Years ago I scraped Craigslist and Autotrader, grouping cars by generation/make/model/drivetrain to predict longevity from the quantity for sale versus original sales figures. If a model sold 100k per year for 10 years and only 3 were for sale in year 13, that isn't a great sign. Cheap cars tend to have cheap owners, who are more likely to skimp on maintenance, typically leading to more accrued issues and a shorter lifespan for the vehicle. Some cars are just poorly engineered, and the markets are relatively efficient in pricing resale value. The definition of "high mileage" will vary by who you ask: domestics 150k, German 80k, Japanese 200k, Korean 100k. These are subjective averages (some cars, like Theta engines, Darts, and even late-model GM 6.2s, have engine failures under 40k), reflecting when models start disappearing because repairs cost more than the vehicle is worth-- based on what I saw then and still kind of observe.
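The listings-vs-sales heuristic above can be sketched in a few lines. This is only an illustration: the function name and all the figures are made up, and the ratio is just a rough survival signal, not a rigorous survival analysis.

```python
# Illustrative sketch of the listings-vs-original-sales longevity
# heuristic described above. All numbers are hypothetical.

def survival_signal(listings_now: int, units_sold: int) -> float:
    """Fraction of originally sold units currently listed for sale.
    Years into a model's life, a vanishingly small ratio can suggest
    cars were scrapped rather than resold -- a signal, not proof."""
    return listings_now / units_sold

# Hypothetical: a model that sold 100k/year for 10 years...
sold = 100_000 * 10

# ...with only 3 examples listed in year 13: likely scrapped en masse.
weak = survival_signal(3, sold)

# ...versus a few thousand listed: a healthy used-market presence.
healthy = survival_signal(8_000, sold)

print(f"weak signal: {weak:.6f}, healthy signal: {healthy:.3f}")
```

The obvious caveat, which the scraping approach has to live with, is that a low ratio can also mean owners simply hold on to their cars, so it works best when compared across otherwise similar models of the same age.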
Leaning on those previously mentioned product mixes, keep in mind that Japanese manufacturers weren't in the American market 60 years ago, so the market mix would be wildly different. (Multiple 400k+ mi Toyotas in my family, along with 60-year-old GMs, but with aftermarket or rebuilt engines.) The cost of vehicles (and repairs) relative to prevailing wages will shift the repair-vs-replace balance. Trade publications like Cox/NADA/Adesa/etc. are always cited by financial blogs when using the average age of cars on the road as a gauge of consumer spending and the state of the economy.

Why cars get junked or totaled has shifted drastically, too. Steel bumpers were easy to replace; modern bumper covers with styrofoam backing and aluminum crumple zones, not so much. "Tolerances" is a vague term in that veiled PR piece cited by the wiki article. Machining has improved, but tech like direct injection and improved lubrication (synthetics) has done much more for efficiency and longevity. In a lot of cases, manufacturers chase more horsepower from the same displacement by pushing tighter engine tolerances (crank/main bearings, pistons/rings, valvetrain) along with higher compression ratios and revs, leading to more heat and earlier failure. So while you have better initial engineering, you are running closer to the point of failure. For another example, interference engines will grenade themselves if you ignore timing belt maintenance, but in the meantime, you get more horsepower by getting more air into the cylinders.
A V6 Camry or Accord is going to have more hp, be faster, be more reliable at the same age, be quieter, and get 3x the mpg of nearly any muscle car of the past.
Unfortunately it seems that many Americans prefer giant vehicles that place more emphasis on their size (and status) than materially important factors like reliability engineering or fuel economy.
Obviously these are anecdotal examples, but they can be confirmed by wasting hours reading about cars and watching mechanic review videos from people who work on them daily (I am partial to the CarCareNut on YT).
They're even here. Lots of very suspicious comments from accounts created <90 days ago, and many accounts created after 2024-ish also tend to align similarly, but with farmed karma.
The educational and informational queries were always the least valuable from a monetization standpoint. Chegg Answers could rank for these low-competition (and low commercial intent) terms-- think queries like phrases from textbooks that students would search. There is virtually no way (for anyone besides Chegg) to monetize these types of queries. Now Google can answer them directly, albeit with the assumption that it costs slightly more to serve an AI response than a search result.
AI overviews are breaking the implicit "contract" for informational sites-- "we will create content to rank on Google with the expectation of monetization via display ads, mailing list growth and/or sales commissions of some sort." If these sites now lose 90% of their traffic, they simply go extinct. We have already seen the destruction of the old web era sites and the walled gardens being built. How many new sites, at the same frequency as 15 years ago, 1) get built and 2) get visibility without relying on one of the fickle walled gardens for an audience?
Google will probably figure out a way to monetize these informational queries by building better profiles of users. Or, most likely, they will start slipping in commercially biased responses-- either native or disclosed, but probably informed by all of a user's conversations instead of just the current one.
Look at where Fanduel/DraftKings/Caesars-type sportsbooks make their most margin-- it is parlays. Probably 95% of people wagering on these sites don't have even a tenuous grasp on basic statistics, let alone how to derive actual probabilities of their action for simple spread/moneyline/total wagers. When you're letting them combine 5 wagers, each with an EV of 90 cents on the dollar, the books are loving it. Layer on that these books simply ban winning players, and it is insanely predatory.
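The parlay math above compounds quickly. A minimal sketch, assuming (as in the comment) each leg independently returns an expected 90 cents per dollar wagered:

```python
# Why parlays are lucrative for the book: per-leg expected value
# multiplies across independent legs. The 0.90 per-leg EV is the
# figure assumed above, not a quote from any sportsbook.

def parlay_ev(leg_ev: float, legs: int) -> float:
    """Expected return per dollar on a parlay of independent legs,
    each with the same EV multiplier."""
    return leg_ev ** legs

single = parlay_ev(0.90, 1)    # book keeps ~10 cents per dollar
five_leg = parlay_ev(0.90, 5)  # ~0.59 -> book keeps ~41 cents

print(f"single: {single:.2f}, five-leg parlay: {five_leg:.2f}")
```

So a 5-leg parlay turns a 10% house edge per leg into roughly a 41% edge on the combined ticket, which is why the books push them so hard.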
Prediction markets, as they currently stand, are at least better in having a lower take and are less predatory in their wagering products and marketing (although both points can easily change, and the more complex wagering menus will be less liquid and harder to grow). If the house cut is 1-3%, that is still drastically better than the other parimutuel wagering in America, horse racing, which typically takes 20-25%.