Gemma 4, in my view, is good enough to do things similar to Gemini 2.5 Flash: if I point it at code and ask for help, and there is a problem with the code, it'll answer correctly in terms of suggestions. But it's not great at using all the tools, or at one-shotting things that require a lot of context or "expert knowledge".
If a couple more iterations of this, say a Gemma 6, is as good as the current Opus and runs completely locally on a Mac, I won't really bother with the cloud models.
Yep, and to be honest we don't really need local models for intensive tasks. At least not yet. You can use OpenRouter (and others) to consume a wide variety of open models which are capable of using tools in an agentic workflow, close to the SOTA models. These open models are essentially commodities: many providers, each serving the same model and competing with each other on uptime, throughput, and price. At some point we will be able to run them on commodity hardware, but for now the fact that we can have competition between providers is enough to ensure that rug pulls aren't possible.
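For the curious, consuming an open model through OpenRouter with tool calling takes only a few lines, since it exposes an OpenAI-compatible API. A minimal sketch; the model slug and the get_weather tool here are illustrative placeholders, not recommendations:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key="sk-or-...",                      # your OpenRouter API key
)

# A toy tool so the model has something to call; purely illustrative.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="qwen/qwen3-coder",  # placeholder: any tool-capable open model slug
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=tools,
)

# If the model decided to call the tool, the arguments arrive as structured JSON.
print(resp.choices[0].message.tool_calls)
```

Because many providers serve the same open weights, you can swap the model slug without touching the rest of the code, which is exactly the commodity dynamic described above.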
Plus, having Gemma on my device for general chat ensures I will always have a privacy-respecting offline oracle which fulfils all of the non-programming tasks I could ever want. We are already at the point where the moat for these hyperscalers has basically dissolved for the general public's use case.
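As a concrete picture of that offline oracle: a minimal sketch using the Ollama Python client against a locally pulled Gemma. The model tag and the prompt are just examples; it assumes `ollama pull gemma3` was run beforehand:

```python
import ollama  # pip install ollama; talks to the local Ollama daemon

reply = ollama.chat(
    model="gemma3",  # whatever Gemma tag you pulled locally
    messages=[{"role": "user", "content": "Draft a polite email asking my landlord to fix the heater."}],
)
# Generated entirely on-device; nothing leaves the machine.
print(reply["message"]["content"])
```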
If I was OpenAI or Anthropic I would be shitting my pants right now and trying every unethical dark pattern in the book to lock in my customers. And they are trying hard. It won't work. And I won't shed a single tear for them.
Local models seem somewhere between 9 and 24 months behind. I'm not saying I won't be impressed with what online models will be able to do in two years, but I'm pretty satisfied with the prediction that I won't really need them in a couple of years.
A lot of people are making the mistake of noticing that local models have been 12-24 months behind SotA ones for a good portion of the last couple years, and then drawing a dotted line assuming that continues to hold.
It simply... doesn't. The SotA models are enormous now, and there's no free lunch on compression/quantization here.
Opus 4.6 capabilities are not coming to your (even 64-128 GB) laptop or phone in the popular architecture that current LLMs use.
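Rough arithmetic backs this up. A quick sketch, assuming a hypothetical 1-trillion-parameter dense-equivalent model (frontier parameter counts are not public):

```python
# Weights-only memory for a hypothetical frontier-scale model; ignores the
# KV cache and activations, which only make things worse.
def weights_gb(params_billions: float, bits_per_weight: int) -> float:
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"1T params @ {bits}-bit: {weights_gb(1000, bits):,.0f} GB")
# -> 2,000 GB, 1,000 GB, 500 GB: even aggressive 4-bit quantization leaves
#    you roughly 4x short of a 128 GB laptop, before counting quality loss.
```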
Now, that doesn't mean that a much narrower-scoped model with very impressive results can't be delivered. But that narrower model won't have the same breadth of knowledge, and TBD if it's possible to get the quality/outcomes seen with these models without that broad "world" knowledge.
It also doesn't preclude a new architecture or other breakthrough. I'm simply stating it doesn't happen with the current way of building these.
edit: forgot to mention the notion of ASIC-style models on a chip. I haven't been following this closely, but last I saw the power requirements are too steep for a mobile device.
This is the classic apple approach - wait to understand what the thing is capable of doing (aka let others make sunk investments), envision a solution that is way better than the competition and then architect a path to building a leapfrog product that builds a large lead.
Pretty much it. That said, they did try to appease the markets by announcing 'Apple Intelligence' so they didn't appear to be behind everyone.
They did do the smart thing of not throwing too much capital behind it. Once the hype crumbles, they will be able to do something amazing with this tech. That will be a few years off but probably worth the wait.
For consumers, AI has anti-hype right now. It's off-putting to see consumer products slapped with a hundred AI labels. I see people talk about how you can turn off all of Apple Intelligence with one toggle, rather than hundreds on Samsung.
Firefox is also marketing how easy it is to disable AI.
I think a lot of people are not hyped about AI in their toaster, but... I don't think people are generally turned off from deeper integration in the OS itself. Especially when, for some people, this represents ideas similar to how programmer types get excited about Shortcuts.
Decently accessible automation and discovery, without having to go figure out a bunch of stuff
People like features, benefits, and outcomes. AI isn't a feature, it's a technology that can enable features. But it's being marketed as the only thing that matters.
The user does not give two shits if the new laptop "has AI". This is how Apple has been killing it lately: they market the MacBooks as being powerful, cheap, with long battery life and a premium feel. Things the user cares about. Most of the stuff marketers are just blanket-labeling "AI" will eventually be shuffled to the background and rebranded with a more specific term, to highlight the feature being delivered rather than the fact that it's "AI".
Yeah, exactly. The Apple Intelligence thing was pure BS to shut up the people who kept saying Apple was going to get disrupted by missing out.
Apple seems to follow the values that Steve laid out. Tim isn’t a visionary but he seems to follow the principles associated with being disciplined with cash quite well. They haven’t done any stupid acquisitions either. Quite the contrast with OAI.
Will this strategy work every time? Maybe for AI it will (the market is competitive and Apple can just purchase the best model for its consumers).
But this approach may not work in other areas: e.g. building electric batteries, wireless modems, electric cars, solar cell technology, quantum computing, etc.
Essentially, Apple got lucky with AI, but it needs to keep investing in cutting-edge technology in the various broad areas it operates in and not let others get too far ahead!
I think their M chips are a good example. They ran on Intel for so long, then did the near-impossible of changing architectures on the Mac, without much transition pain.
Obviously that was built upon years of iPhone experience, but it shows they can lag behind, buy from other vendors, and still win when it becomes worth it to them.
It works often enough for the company to be wildly successful. They can simply cut their losses and withdraw from industries where it hasn't, such as EVs.
When have they done that since the first iPhone in 2007? The watch maybe? Though not sure that's "leapfrog" better than anyone else's smartwatch, but I don't have one so maybe I'm wrong.
> wait to understand what the thing is capable of doing
My parents use Android to ask “What are the 5 biggest towers in Chicago” or “Remove the people on my picture” while apparently iPhone is only capable of doing “Hey Siri start the Chronometer / There is no contact named Chronometer in your phone”.
My iPhone is lagging a ridiculous 10 years behind. It’s just that I don’t trust Google with my credit card.
It’s even more superpowered than previous implementations of this strategy.
When they made the iPhone, iPod, and Apple Watch they had no specific hardware advantage over competitors. Especially with early iPhone and iPod: no moat at all, make a better product with better marketing and you’ll beat Apple.
Now? Good luck getting any kind of reasonably priced laptop or phone that can run local AI as well as the iPhone/MacBook. It doesn’t matter that Apple Intelligence sucks right now, what matters is that every request made to Gemini is losing money and possibly always will.
This is especially true in 2026 where Windows laptops are climbing in price while MacBooks stay the same.
They're talking about free inference like Android and Google Home devices. No one is paying subscription fees for these and they're running their inference in the cloud. Apple Intelligence, for the most part, is running on the device.
I don't like companies noisily forcing their newest features on me, constantly shipping new things to see what sticks, so that you can't trust whether a feature advertised one week will even be there the next.
However, I have even less patience for companies forcing paid-for third-party ads down my throat on a paid product. Slack at least doesn't sell my eyeballs. Facebook, Twitter, Google's ads are worse to me than new feature dialogues.
Which brings me to Apple. I pay for a $1k+ device, and yet the App Store's first result is always a sponsored bit of spam, adware, or sometimes even malware (like the fake Ledger wallet on iOS that was a sponsored result for a crypto stealer). On my other devices, I can at least choose not to use ad-ridden BS (on Android you can use F-Droid and the Aurora Store; on Linux my package manager has no ads), but on iOS it's harder to avoid.
Apple hasn't sunk to Google levels in terms of ads, but they've crossed a line.
I get it but... well I think of App Store as... a store. I don't have to go there.
I'm actually pretty disappointed in the lack of discovery available in the App Store, but I rarely go there. I'm fine with advertising being there. I wish it was better but I'm not offended that there is paid promotion in a store.
>"to fix this, please install our app"
>search BankName
>comes up with other banks, BankName's US app (not the country you are in)
>revolut etc (cant use in the country you are in)
>ten minutes later
Even worse when it's your telecom telling you to install their Official App so you can pay your bills or they'll cut your cellular service, and you can't find it.
As someone who recently moved to NL from the US I encounter this issue about once a week and it’s blocking me from doing serious things like paying for parking, taxes, utilities or government services, all of which have apps that are only available on the Dutch app store.
I have a separate Dutch Apple ID I can switch to, but each time I log out I risk accidentally deleting all my data.
I get an app recommendation from a friend, I go to the App Store and search for it. I have to be super careful about which link I'm actually clicking on and which app I'm installing, because the App Store is riddled with spam and malware.
I wouldn't mind, except that Apple charge 30% of everything with the justification that they are keeping the ecosystem free of spam and malware...
I've been installing apps from the App Store for more than a decade and have never ever accidentally downloaded spam or malware. I'm sure it's there, but it's really not "riddled" with it in my experience searching for apps. What it's riddled with is subscription-based apps whose free tier is worthless.
I haven't noticed this at all, and I wonder if you're mistaking curation for advertising? When I open up the App Store I get a panel titled "games we love" and a listing of indie games that are clearly not paid-for ads. The ads in search are visibly marked as ads, and while I don't particularly like ads in general, they are pretty easy to avoid.
Mine is Moneris Go, and the top review is titled "Garbage App!!!!" lol
Honestly the last time I remember using the App Store was years ago and I can't recall if they had ads or not. Imo it's distasteful and I wish they didn't have them. Still leagues better than the fucking ads in the start menu which caused me to give up on gaming and Windows forever.
If I open the app store and search "Gemini", the first result is "ChatGPT (advertisement)"
If I search for my bank, I get another bank. If I search for "Wordle", I get a bunch of ad-supported spamware (both the ad and non-ad results) before the real NYT Games app.
The App Store has ads in search results. This is the primary way that my technologically inept relatives end up with the wrong app installed, btw: by searching and clicking the first result, and getting complete trash adware.
Apple should be ashamed of selling out their users.
Apple aren’t in the business of building chatbots to impress investors (other than some WWDC2024 vaporware they’d rather not talk about any more). They’re in the business of consumer hardware.
Consumers want iPhones and (if Apple are right) some form of AR glasses in the next decade. That’s their focus. There’s a huge amount of machine learning and inference that’s required to get those to work. But it’s under the hood and computed locally. Hence their chips. I don’t see what Apple have to gain by building a competitor to what OpenAI has to offer.
~25% of Apple's revenue came from services in FY25 (and 50% from iPhone, ~25% from other hardware). They made $415B in that year, so ~$100B from services alone!
Nvidia restricts gamer cards in data centers through licensing; eventually, if they feel too much of a threat from Apple, they will probably release a cheaper consumer AI card that can't be used in data centers, to corner the local AI market.
Imagine a future where Nvidia sells the exact same product at completely different prices, cheap for those using local models, and expensive for those deploying proprietary models in data centers.
[WSJ] sources expect.. first units in H1 2026, with GTC as the most likely unveiling stage.. NPU reportedly exceeds both Intel and AMD’s current neural processing units.. If the integrated GPU delivers RTX 5070-class performance in a thin laptop form factor, it would eliminate the need for a separate GPU die, fundamentally changing how gaming laptops are designed.
Apple is almost 2 years out from their announcement of Apple Intelligence. It has barely delivered on any of the hype. New Siri was delayed and barely mentioned in the last WWDC; none of the features are released in China.
In other news, people keep buying iPhones, and Apple just had its best quarter ever in China. AAPL is up 24% from last year.
Indeed, a lot of the people that bought iPhones are now buying Macs with a binned version of the chip they already bought. So much so that Apple is in danger of running out of them.
This seems mistaken to me. The core idea is that LLMs are commoditizing and that the UI (Siri in this case) is what users will stick with.
But... what's the argument that the bulk of "AI value" in the coming decade is going to be... Siri Queries?! That seems ridiculous on its face.
You don't code with Siri, you don't coordinate automated workforces with Siri, you don't use Siri to replace your customer service department, you don't use Siri to build your documentation collation system. You don't implement your auto-kill weaponry system in Siri. And Siri isn't going to be the face of SkyNet and the death of human society.
Siri is what you use to get your iPhone to do random stuff. And it's great. But ... the world is a whole lot bigger than that.
> Pure strategy, luck, or a bit of both? I keep going back and forth on this, honestly, and I still don’t know if this was Apple’s strategy all along, or they didn’t feel in the position to make a bet and are just flowing as the events unfold maximising their optionality.
Maximizing the available options is in fact a "strategy", and often a winning one when it comes to technology. I would love to be reminded of a list of tech innovators who were first and still the best.
> Then Stargate Texas was cancelled, OpenAI and Oracle couldn’t agree terms, and the demand that had justified Micron’s entire strategic pivot simply vanished. Micron’s stock crashed.
Well... no. The Stargate expansion was cancelled; the originally planned 1.2 GW (!) datacenter is going ahead:
> The main site is located in Abilene, Texas, where an initial expansion phase with a capacity of 1.2 GW is being built on a campus spanning over 1,000 acres (approximately 400 hectares). Construction costs for this phase amount to around $15 billion. While two buildings have already been completed and put into operation, work is underway on further construction phases, the so-called Longhorn and Hamby sections. Satellite data confirms active construction activity, and completion of the last planned building is projected to take until 2029.
> The Stargate story, however, is also a story of fading ambitions. In March 2026, Bloomberg reported that Oracle and OpenAI had abandoned their original expansion plans for the Abilene campus. Instead of expanding to 2 GW, they would stick with the planned 1.2 GW for this location. OpenAI stated that it preferred to build the additional capacity at other locations. Microsoft then took over the planning of two additional AI factory buildings in the immediate vicinity of the OpenAI campus, which the data center provider Crusoe will build for Microsoft. This effectively creates two adjacent AI megacampus locations in Abilene, sharing an industrial infrastructure. The original partnership dynamics between OpenAI and SoftBank proved problematic: media reports described disagreements over site selection and energy sources as points of contention.
https://xpert.digital/en/digitale-ruestungsspirale/
But why do I feel like the quality of Apple's software has declined sharply in recent years? The Liquid Glass design feels unpolished and not well thought out almost everywhere... it seems like even Apple can't resist falling victim to AI slop.
I don’t think it’s AI slop. Even before modern generative AI, I’ve noticed a decline in Apple’s software quality.
Rather, I feel that Apple has forgotten its roots. The Mac was “the computer for the rest of us,” and there were usability guidelines backed by research. What made the Mac stand out against Windows during a time when Windows had 95%+ marketshare was the Mac’s ease of use. The Mac really stood out in the 2000s, with Panther and Tiger being compelling alternatives to Windows XP.
I think Apple is less perfectionistic about its software than it was 15-20 years ago. I don’t know what caused this change, but I have a few hunches:
0. There’s no Steve Jobs.
1. When the competition is Windows and Android, and there are no other commercial competitors, there's a temptation to be just marginally better than Windows/Android rather than to be the absolute best. Windows' shooting itself in the foot doesn't help matters.
2. The amazing performance and energy efficiency of Apple Silicon is carrying the Mac.
3. Many of the people who shaped the culture of Apple’s software from the 1980s to the 2000s are retired or have even passed away. Additionally, there are not a lot of young software developers who have heard of people like Larry Tesler, Bill Atkinson, Bruce Tognazzini, Don Norman, and other people who shaped Apple’s UI/UX principles.
4. Speaking of Bruce Tognazzini and Don Norman, I am reminded of this 2015 article (https://www.fastcompany.com/3053406/how-apple-is-giving-desi...) where they criticized Apple’s design as being focused on form over function. It’s only gotten worse since 2015. The saving grace for Apple is that the rest of the industry has gone even further in reducing usability.
I think what it will take for Apple to readopt its perfectionism is competition forcing it to.
Apple will just drip feed locally running models that enable minor conveniences. They will probably drop the Apple Intelligence label later and just have things with their own names like "magic eraser".
I like how we are acting like this market is so novel and emergent, revering the luck of some while lamenting the failures of others, when it was all "roadmapped" a decade ago. It's like watching a Shaanxi shadow puppet show with artificial folklore about the origins of the industry. I hate reality television!
> If a couple more iterations of this, say a Gemma 6, is as good as the current Opus and runs completely locally on a Mac, I won't really bother with the cloud models.
That's a problem.
For the others, anyway.
> When have they done that since the first iPhone in 2007?
- Apple Watch
- AirTag
Those are a few that come to mind. All do multi-billions in revenue per year.
> My iPhone is lagging a ridiculous 10 years behind.
The only reason to care about it being OS-integrated is to interact with functions of the OS, which Siri does fine.
> It doesn't matter that Apple Intelligence sucks right now, what matters is that every request made to Gemini is losing money and possibly always will.
It's not. People make this claim with zero evidence.
But Google made around $20B profit on Google Search in 2025 Q4, and that includes AI search.
> Especially with early iPhone and iPod: no moat at all, make a better product with better marketing and you'll beat Apple.
In hindsight it's obvious why they pulled it off: nobody else could do it. They all had pieces missing.
When I open up JIRA or Slack, in comparison, I am always greeted with multiple new dialogs pointing at some new AI bullshit. We hates it, precious.
> When I open up the App Store I get a panel titled "games we love" and a listing of indie games that are clearly not paid-for ads.
For me, the second tile is an ad for Upside, some cashback app.
> In other news, people keep buying iPhones, and Apple just had its best quarter ever in China.
Here's to another 10 years of scuffed Metal Compute Shaders, I guess.
> Maximizing the available options is in fact a "strategy", and often a winning one when it comes to technology.
Anyway, hasn't this always been Apple's strategy? This was really unsurprising [0].
[0] https://news.ycombinator.com/item?id=40278371
> Micron’s stock crashed. [the link included an image of dropping to $320]
Micron's stock is back to $420 today.
> One analysis found a max-plan subscriber consuming $27,000 worth of compute with their $200 Max subscription.
Actually, no. They'd miscalculated: the subscriber had consumed $2,700 worth of tokens.
The same place that checked that claim also points out:
> In fact, Anthropic’s own data suggests the average Claude Code developer uses about $6 per day in API-equivalent compute.
https://www.financialexpress.com/life/technology-why-is-clau...
I like Apple's chips, but why do we put up with crappy analysis like this?
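For what it's worth, the corrected numbers do pencil out. A quick sanity check; the days-of-use figure is my own assumption for illustration:

```python
# Figures from the fact-check quoted above; work-day count is an assumption.
DAILY_API_EQUIVALENT = 6.0   # $/day, Anthropic's stated average for Claude Code
MAX_SUBSCRIPTION = 200.0     # $/month, Max plan
WORK_DAYS_PER_MONTH = 22     # assumption for illustration

average_monthly_compute = DAILY_API_EQUIVALENT * WORK_DAYS_PER_MONTH
print(f"average user: ~${average_monthly_compute:.0f}/mo of compute on a ${MAX_SUBSCRIPTION:.0f}/mo plan")

# The viral $27,000 figure vs the corrected $2,700: a simple factor-of-10 slip.
print(f"overstatement factor: {27_000 / 2_700:.0f}x")
```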