One of the most pernicious sins of the smartphone revolution was to rebrand the "computer". We'd evolved expectations as to what we were entitled to on our own machines - yes, even those with a proprietary OS - and then with one weird trick, it all went in the trash. Advertising, surveillance, inability to install whatever you want - before smartphones, this sort of thing used to be unequivocally malware.
The worst part is that now the dam's broken, it's all bleeding back into "computers".
You're blaming smartphones for a problem that exists because of the Internet. If smartphones hadn't rendered computers irrelevant to most people, all this "bad app" stuff would have happened on computers.
And the computers that don't have "bad apps" are hacked and encrypted by ransomware. Is that better?
> The worst part is that now the dam's broken, it's all bleeding back into "computers".
Back in the day, before cellular phones, mainframe and minicomputer vendors tried their best to restrict the software that could be run on “their” hardware. They realised that they could squeeze the most out of their customers that way.
Android is also Linux. Yet some manufacturers don't allow any kind of boot unlocking or root access.
Open source doesn't guarantee we have any kind of control if the hardware is locked down. It's more that the industry hasn't yet bothered selling locked-down computers by default. The tech is there: Secure Boot + TPM. They just give us the option to override it in the BIOS. For now.
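For what it's worth, you can already see this machinery from userspace on a Linux box. A rough sketch (assuming efivarfs is mounted at the usual path; the GUID is the standard UEFI global-variable GUID) that reads the Secure Boot state the firmware reports:

```shell
# Query Secure Boot status by reading the EFI variable directly.
# Payload layout in efivarfs: 4 bytes of attributes, then 1 data byte
# (1 = Secure Boot enabled, 0 = disabled).
SB=/sys/firmware/efi/efivars/SecureBoot-8be4df61-93ca-11d2-aa0d-00e098032b8c
if [ -r "$SB" ]; then
  state=$(od -An -tu1 "$SB" | awk '{b=$NF} END {print b}')
  if [ "$state" = "1" ]; then
    echo "Secure Boot: enabled"
  else
    echo "Secure Boot: disabled"
  fi
else
  echo "Secure Boot: not supported or efivarfs not mounted"
fi
```

On most distros `mokutil --sb-state` reports the same thing. The point stands either way: the firmware enforces the policy, and today's BIOS setup screens happen to expose a toggle for it.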
Linux has many more stakeholders than you. It's a failure for you, but extremely important to many others (including me). I'd go further and allege you are in the minority among users of Linux, because think about what a user of Linux is. Linus made the absolutely correct call, and Linux would be on the decline if he hadn't; the money in Linux is in embedding (and SaaS), not in people dabbling with free software who by definition don't pay for it. Until the FSF and its adherents conquer the concept of "an economy", they're fighting an uphill, losing battle against the same economic forces they despise.
More than half of YC’s hardware startups, not to mention Android and probably Teslas in their current form, probably would have never happened were they not able to embed Linux in a controlled manner without having to invest in catering to the four total users who will want to build a system image and reflash the firmware on their whatever. (Android has some means to do so and an audience much more interested in doing so but the point stands.)
I’m also interested in the legal framework around a software license that’s able to dictate the architecture and design of components around the software, and I suspect it will not survive if challenged, particularly in Europe.
The greatest failure of free software, to me, is thinking in absolutes and not studying how the world actually uses computers as time goes on. The concepts, ideas, and demands are stuck in 1991 and are basically “man shakes fist at capitalism,” while writing capitalist exceptions into the very Tivoization clause in question under pressure.
> not people dabbling with free software who by definition don’t pay for it
You and I don't seem to share the same definition of Free Software. I personally pay for, and know others who pay for Free Software. Free is about freedom, not price.
> not to mention Android and probably Teslas in their current form, probably would have never happened were they not able to embed Linux in a controlled manner without having to invest in catering to the four total users who will want to build a system image and reflash the firmware on their whatever.
All they have to do is not go out of their way to lock down the ability to flash the firmware. Supporting this doesn't require extra effort; blocking it does.
Open source also doesn't guarantee any type of control if it's SaaS or anchored to a closed source SaaS system. There's a ton of "but it's open source!" SaaS companies that I shall not name that leverage open source but in reality are lock-in walled gardens. The code is open but the data and network effect are not.
> If people could understand what computing was about, the iPhone would not be a bad thing. But because people don’t understand what computing is about, they think they have it in the iPhone, and that illusion is as bad as the illusion that Guitar Hero is the same as a real guitar.
There is no formal definition of computing which is furthered by iPhones.
I have an iPhone, and Pythonista is the only thing on it that potentially fulfils the criteria of "computing"; everything else is convenience.
Smartphones have done an incredible amount to bring people the consumption of the internet, audio, video and rich communication via social media.
But they have not brought "real world computing", because "real world computing" is any goal-oriented activity requiring, benefiting from, or creating computing machinery. It includes the study and experimentation of algorithmic processes and development of both hardware and software. It has scientific, engineering, mathematical, technological and social aspects.
>"In a general way, we can define computing to mean any goal-oriented activity requiring, benefiting from, or creating computers. Thus, computing includes designing and building hardware and software systems for a wide range of purposes; processing, structuring, and managing various kinds of information; doing scientific studies using computers; making computer systems behave intelligently; creating and using communications and entertainment media; finding and gathering information relevant to any particular purpose, and so on. The list is virtually endless, and the possibilities are vast."
I would basically half agree that the iPhone brought more computing. In terms of practical applications, the iPhone has not brought radically more computing; it replaced some stationary computing with ultra-mobile computing.
Cheap feature phones that allowed people in Africa to make cashless payments brought more computing to ordinary people than the expensive iPhone, which is owned by people who have (or had) other computers.
People who formally require computing - engineers (of all kinds, incl. software and structural), data collectors, music professionals and so on - still rely on other forms of computing. Some have shifted to the iPad, which made computing more fun. But all of the heavier forms of computing are still done on a "computer", not a phone or iPad.
Yeah I think the distinction is more with OS than the form factor.
With a closed garden and closed-source apps, there is basically nothing you can do (to make sure you're not being monitored). You're completely at their whim. True for regular computers too, but a lot more choices exist for the latter group.
But computers are another story. So far we still maintain some privacy there, thanks to Linux.