I love ASCII/terminal games. The creativity involved with creating a graphical game in something that was only ever meant to display lines of text is super interesting to experience for yourself. This project far surpasses my own personal library for terminal games, well done.
This is truly incredible. I want to remind everyone about Blender's humble beginnings. Great work, I look forward to seeing your product pop back up as others discover it.
This article is awesome. I've always wondered why Chile is that shape and I didn't know about the Chilean dialect of Spanish being so far off from the others. Super cool.
I enjoy projects that push the limits of what we expect from terminals. However, as an end user, I would be highly annoyed at a terminal with effects like this. My favorite thing about terminals is the minimal cruft from unnecessary peacocking. Every other mode of application has decades of built-up obligation to be flashy or entertaining in the way it presents information. Not terminals, though, that's the last place where less is more.
This was my initial reaction as well, but after some more consideration, I concluded:
1. This isn't for me
2. But some users have completely different expectations/desires
3. There's a place for terminals with features like this alongside standard terminals that remain simple. vi vs. vim vs. neovim come to mind here
4. I could see this being valuable for content creation that features a terminal, where communicating what is happening is more important than the local experience I'd otherwise expect to have while doing daily work
5. This would be even more valuable for narrative/storytelling, where the goal is to show something that's real while trying to appeal to a broad audience
Not my cup of tea, for the most part, but I can see a place for it.
So much effort is put into making terminals aesthetic, customizable, and impressive. Yet not enough work is done on fixing the usability limitations of terminals (especially for casual users or beginners), such as command/argument discoverability, error forgiveness (being able to roll back a destructive command, or to detect and warn the user before executing it), and general ergonomics.
Just saw this, not in the habit of checking back on comments (whoops). Casey Faris has been a huge help to me over the years: https://www.youtube.com/@CaseyFaris
This reminds me of Steven Pressfield's works. The professional does what they need to do to complete the work. Once the work is done, they begin to allocate for the next one.
Bikes, dirt, legos, board games, books, all still exist. Despite what American consumerism would have you think, devices aren’t required to parent your children. We made the decision from the outset not to introduce them into the mix and this insane thing happened: they don’t spend their time on devices.
Sure, their classmates will have them and they will come home asking for this awesome thing they saw so-and-so have at recess. When that happens, you get to do the most important part: you say “no”.
The point of peer review is not to debug others' code (sometimes it helps and that is welcome). The point of peer review is to ensure the change adheres to team standards - both technical and design - as well as to give a chance for domain knowledge share. With that in mind, trust is a foundational aspect of peer review, just not in the way you mentioned.
I am curious to learn what concepts we may have lost and not just abstracted away. Is the knowledge lost or niche? Is it necessary for programming in general or a subset of the practice?
I read the article but didn't see exactly what has been lost.
> I read the article but didn't see exactly what has been lost.
TFA contains this: "The graphical display is the servant of information, not the master of it." and "the computer being a tool we use to better our lives".
I think what has been lost is that instead of a slave (not my terminology, it's the one in TFA btw) at your service, the computer has become a master and you're the slave. At the mercy of a few gigantic corps and states spying on your every move and turning you into a consumer, instead of a producer.
It's made painfully obvious with the ultimate consumption device: the smartphone, which is mostly used to consume pointless, debilitating content and not to produce anything of value.
The computer as a tool still exists though: architects, 3D artists, accountants even (yup, they're needed), engineers, music creators... These professionals use actual computers, typically with gigantic screens, as a tool to help them.
It used to be only that way, for everybody, and that's what's been lost. The masses are mindlessly consuming content of exactly zero value on what they believe is a computer ("My smartphone is a more powerful computer than the computer of the 70s/80s/90s!").
That's how I interpret it, and I take it I'm not very far from what TFA means.
> I think what has been lost is that instead of a slave (not my terminology, it's the one in TFA btw)
Actually, it is your terminology! I used the word 'servant' not 'slave'.
It was a deliberate choice. My commentary was more about how our visual presentation now takes centre stage, often at the expense of displaying information clearly or usefully. This is a mindset shift and not really a technical issue. So master/slave didn't fit, as that has more precise technical definitions; database replication, disk arrangements, etc.
There's nothing quite as fast as keyboard navigating a text input screen, tab key going from field to field. Learning the order that the tab key takes you through the screen. Pure muscle memory, flow from entering information. Web pages have different navigation paths, or none at all, for each web page. That's one big thing that's lost.
> It used to be only that way, for everybody, and that's what's been lost. The masses are mindlessly consuming content of exactly zero value on what they believe is a computer ("My smartphone is a more powerful computer than the computer of the 70s/80s/90s!").
I don't think this has been lost. I think there are more people than ever using computers as tools. But there are also people using computers as consumption devices, and people using computers as appliances (for example a smartphone through a very specific app, or my instant pot). At home I use my computer to code (which can be seen either as a tool if I'm doing something "useful", or as a hobby/practice), to watch movies (as a consumption device), and to organize my movie collection (somewhere between tool and consumption device).
> I think what has been lost is that instead of a slave (not my terminology, it's the one in TFA btw) at your service, the computer has become a master and you're the slave.
The gig economy apps like Uber and Doordash always seemed so dystopian to me because effectively your boss is an app.
I think it's the height of arrogance to declare much of anything about "the masses". You don't know other people's lives, so stop judging them. xkcd managed to be perfectly relevant here years ago: https://xkcd.com/610/
I work from home now, enabled entirely by computers. I'm not in an office, no one watches my day to day activities, work gets done and that's the evaluation. By every possible metric, modern computing has set the rest of my life free.
The older I get, the more often I have the opposite thought of that XKCD. I look around and see so many different stories, each with their own main character.
I maybe could have done a better job of explaining what so captivated me watching 'The Computer Programme' and why that feeling's so different today. The idea of ordinary people buying a computer and programming it to do useful things was one. Personal address books were covered in an episode - that's gone. There was the implicit assumption that your machine and your data were your own.
As for specific technologies, here's my rough list of what we're either starting to forget, had to re-remember recently, or have completely forgotten:
- low latency, simple, native UIs that don't require a designer
- operating on data locally first, and not over a network
- information highway instead of doom-scrolling
- message passing (constantly re-discovered, and then forgotten because people complain it's not RPC)
- relational and logic programming - I firmly believe this will come back one day, but it will be called something else and look new.
- static memory allocation. turns out it's still incredibly fast to do this.
> low latency, simple, native UIs that don’t require a designer
This is very much alive for internal tools, but design from a product perspective has proven important, which is why you don’t see much of this in products anymore.
> information highway instead of doom-scrolling
Those are just buzzwords, what exactly do you mean by this?
> relational and logic programming
SQL is alive and well, granted newer programmers don’t learn it as early as they should
> static memory allocation
I’d like to hear more of your thoughts on this as well. We’ve found that this kind of memory allocation is error prone especially in multithreaded workloads, which is why it’s not as popular
> software design
Another buzz term. Software is constantly designed and the design of software is constantly discussed.
What exactly do you mean by this? Who forgot what?
> This is very much alive for internal tools, but design from a product perspective has proven important, which is why you don’t see much of this in products anymore.
This is revisionist history. Desktop environments had become so complex and fragmented that just writing HTML & CSS seemed incredibly appealing. "Design from a product perspective has proven important" is unsubstantiated, and is justification after the fact.
> Those are just buzzwords, what exactly do you mean by this?
A focus on information vs a focus on 'engagement'. Knowledge vs addiction. Feeling better after using a computer as opposed to feeling worse.
> SQL is alive and well, granted newer programmers don’t learn it as early as they should
The TigerBeetle database is all statically allocated and (as I understand it) makes no memory allocations in response to user requests. They seem to be having great success with this approach.
> Another buzz term. Software is constantly designed and the design of software is constantly discussed. What exactly do you mean by this? Who forgot what?
I don't feel like this is a buzz term. Agile, or whatever one wants to call sprint-based workflows, means no serious design gets done up front anymore, and so we constantly try to code our way out of anemic or non-existent designs.
1. There’s nothing revisionist. Game developers constantly make small internal tools using native UI toolkits.
I don’t understand how you could think good design being good for a product is unsubstantiated. You’re arguing in bad faith there. Apple’s entire business model is selling their product design over capability. It’s obviously important.
2. That’s fair.
3. This looks interesting, but also very academic. Did that pattern ever see wide production usage?
4. A single project doesn’t refute my point, nor does it prove yours.
5. That just isn’t true. There’s room for good design. We do technical specs where I work along with some agile methodology.
We just recognize that time spent hammering out designs up front rarely produces better or more maintainable results in the same amount of time. Though I suspect this is very domain-dependent.
It's really interesting to read about the vision that drove Douglas Engelbart (inventor of the computer mouse, who did the "Mother of All Demos" where he introduced a bunch of modern computing tech for the first time). Engelbart was a hippie who envisioned a future where researchers used the tools he was creating to collaborate remotely and work to solve complex problems. His tagline was "Boosting Our Collective IQ".
I meant relational programming in the sense that William E. Byrd uses it, as something like a close cousin of, or a different way of looking at, logic programming.
miniKanren is being used for research in "relational" programming. That is, in writing programs that behave as mathematical relations rather than mathematical functions. For example, in Scheme the append function can append two lists, returning a new list: the function call (append '(a b c) '(d e)) returns the list (a b c d e). We can, however, also treat append as a three-place relation rather than as a two-argument function. The call (appendo '(a b c) '(d e) Z) would then associate the logic variable Z with the list (a b c d e). Of course things get more interesting when we place logic variables in other positions. The call (appendo X '(d e) '(a b c d e)) associates X with (a b c), while the call (appendo X Y '(a b c d e)) associates X and Y with pairs of lists that, when appended, are equal to (a b c d e). For example X = (a b) and Y = (c d e) are one such pair of values. We can also write (appendo X Y Z), which will produce infinitely many triples of lists X, Y, and Z such that appending X to Y produces Z.
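For the curious, here is a minimal sketch of what that appendo relation looks like, assuming a Scheme-based miniKanren that provides run*, fresh, conde, and == (the exact run* syntax varies slightly between implementations):

    ; A sketch of the classic appendo relation from the miniKanren literature.
    (define (appendo l s out)
      (conde
        ; If l is empty, the result is just s.
        ((== '() l) (== s out))
        ; Otherwise peel the head off l and recur on the tail.
        ((fresh (a d res)
           (== `(,a . ,d) l)        ; l   is (a . d)
           (== `(,a . ,res) out)    ; out is (a . res)
           (appendo d s res)))))    ; appending d to s gives res

    ; "Forwards": what Z is (a b c) appended to (d e)?
    (run* (q) (appendo '(a b c) '(d e) q))
    ; => ((a b c d e))

    ; "Backwards": which pairs X, Y append to (a b c d e)?
    (run* (q)
      (fresh (X Y)
        (appendo X Y '(a b c d e))
        (== (list X Y) q)))
    ; => six answers: (() (a b c d e)), ((a) (b c d e)), ((a b) (c d e)), ...

The interesting part is that nothing in the definition marks which arguments are inputs and which are outputs; the same three-place relation answers every query above.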
Thanks for the reply! I hope it didn't sound like I was insinuating that you did a bad job, I was just curious what concepts others would bring up in a discussion about your article's thesis. It was a good read! I look forward to more.
I think the big issue is that the stuff that already exists has so many features and so many ecosystem integrations, nobody wants to use anything simple enough to DIY.
It would be easier than ever to make a simple address book, but it wouldn't do everything Google does.
The other issue is that we never really got a proper upgrade from spreadsheets. I think you could do a VB-like studio that let average people make modern apps that they'd actually want to use, if it was free, as easy as a spreadsheet, a cross-platform local app (not something self-hosted), and had a ton of random features that would interest users.
As a minor point I'd say a UI always needs good design, even more so the more minimal it is. UX matters and I'm disappointed how many developers treat it as just "making things pretty."
Yes perhaps I should choose my words more carefully.
What I was more getting at is the lost idea of a 'set piece' UI we can use to quickly make things that look nice - rather than re-inventing the wheel with styling.
Like imagine if the default browser styling was good enough, and only people with particular artistic flair needed to bother with CSS or component libraries or what have you.
There’s a documentary that’s in post-production named “Message Not Understood: Profit and Loss in the Age of Computing” that chronicles Xerox PARC’s research on graphical user interfaces and personal computing and how modern personal computing and the Web have deviated from the visions of Xerox PARC researchers. I’m highly interested in watching it once it is released.
In my opinion, what we’ve lost from the early days of personal computing is a sense of empowering users by giving them tools that they can not only use, but extend and even modify. Sure, today’s hardware and software are more capable than ever, and these extra capabilities do empower users in the sense that their tasks are made easier. But what about the ability to shape their environment? All too often software these days is locked behind walled gardens and binary code. Users generally have to accept their software packages as is. If Slack or Zoom or Photoshop changed its interface, too bad; you gotta just cope. Even FOSS can be an impenetrable mass of hundreds of thousands of lines of C and C++ code where even a seasoned software engineer would have to spend a week or two studying the codebase before making modifications, a far cry from the much simpler AT&T Unix from the late 1970s or the Lisp machines of the 1980s.
Even more frustrating than the complexity of modern software is the increased commercialization of personal computing. Ads, subscriptions, and tracking are everywhere. Data is increasingly locked in cloud services that often don’t respect users’ privacy. In short, personal computing back in the 1970s and 1980s was about taking power away from mainframes and giving it to the people, but it seems that the past two decades have been efforts to bring the power back to large corporations who dictate the terms of our computing experience.
What can be done about this? I think the FOSS movement has been a wonderful start, but there has to be an effort made to build a simpler software stack that makes it easier for users to extend and modify their tools. I’m also very interested in the idea of community-driven, non-profit cloud services as alternatives to Big Tech. Computing is too important for society to be controlled by a tiny handful of corporations.
To give an obvious example, it seems every generation re-discovers the concepts of encapsulation and abstraction every 5 years or so. Each generation thinks it’s so smart for having figured this out, and for having discovered how much easier software development becomes when your code has a clear purpose and interface.
UI frameworks, OO, and databases/SQL get reinvented every 5 years or so, usually by younger programmers who seem particularly ignorant of the lessons learned from decades of research and development in these areas.
To be fair, that's the way of young people in every endeavor. When you are full of energy but inexperienced, you have a tendency to present new solutions that you don't realize aren't actually new.
The latter one is more business people driving the operation.
They want to assume the user is an idiot, because idiots are often the desired customer:
* Broader market. For every N people who'd buy AutoCAD, there are probably 10N people who would be able to run Baby's First Drafting Suite.
* Lower expectations and support costs. The professional with 10 years experience knows when a given piece of software isn't meeting expectations. He'll ask pointed questions and consume actual support resources. The idiot can be steered into a chatbot and gaslit into blaming himself.
* More susceptible to dodgy schemes. A professional who has to look at the actual cost/return of his investments and deal with corporate policy might not be as interested in subscription schemes and monetization shenanigans, while they might expand the addressable market for idiots. They might not buy in at $100 one-time but will end up paying $10 per month forever, or "we'll lock your data into a closed ecosystem and it will cost you to liberate it later."
Reading about virtualization and old IBM systems gave me massive respect for what our predecessors achieved. I wonder what other treasures are out there lurking in the literature.
Data diodes are alive and well, just not in a lot of places. Most people don't need them, and organizations that do either have them, have decided that an air gap is easier to deal with for some reason, or are being run (on the IT/infosec side) by people who have no business being in their position.