I'm not as much of an overhead strategist, but I do have a rule that I follow that matches this article: if I hesitate to start working on a problem because it seems too difficult, it's because that problem has not yet been broken into small enough parts.
I tend to hesitate precisely because I know it will take a lot of long, difficult work to break everything down into small enough parts (of which there will be a whole lot), then work through them and integrate them all.
I agree; I follow the same principle. I'd also extend it: "if you slow down when working on a problem, you might have stumbled upon something unexpected. Identify it, and break it down."
I have a similar rule when writing documentation. As soon as I find myself writing something in the passive voice, I know I’ve hit part of the system I don’t really understand. “This event happens” instead of “subsystem A triggers this event”.
Nitpick: “This event happens” is not in the passive voice. “This event is triggered” is in the passive voice — and so is “this event is triggered by subsystem A”. (What you probably mean is “writing something vague or lacking agency”.)
One of the most frustrating things in life is dealing with someone who says one thing but really feels a different way (this is obvious after writing it down). Consider someone asking for advice on a problem that they don't really want to solve, e.g. somebody who is overweight but doesn't really want to change anything about their lifestyle to lose weight. They value the food they eat and their activities more than any consequences those might bring. So any advice given to them on losing weight would be dead on arrival, but the receiver would never admit that.
Working on something like that would drive me absolutely batty. I am happy you were able to find your zen in the middle of that chaos. This post truly speaks to the human condition.
disclaimer: some of these groups hate each other, I have no affiliations and don't know the history of them, I'm just compiling resources in no particular order here.
> I have set out to meet and talk to a small but growing band of luddites, doomsayers, disruptors and other AI-era sceptics who see only the bad in the way our spyware-steeped, infinitely doomscrolling world is tending.
I would say at this point that it's harder to see the good?
not to add to the doomerism, but I often wonder about how much AI-generated content I've consumed without realizing it - especially from times before generative AI became mainstream
I don't get this angle at all. To me that's like "organic" food labels. What do I care if my content is "AI" made? When I watch a CGI animated movie there isn't a little artisan sitting inside the video camera like in a Terry Pratchett novel; it's been all algorithms for like 30 years anyway.
When I use Unity I write ten lines of code and the tool generates probably 50k. Ever looked into the folder of a modern frontend project after typing one command into a terminal? I've been 99% dependent on code generation for ages.
Does it matter to you whether you're interacting with a human on some level when watching a show or movie, specifically on an artistry level?
Maybe some movie you've watched has been spun up by a Sora-like platform based on a prompt that itself was AI-generated from a market research report. Stephen King said that horror is the feeling of walking into your house and finding that all of your furniture has been replaced by identical copies. Finding out that all of the media everybody consumes has actually been generated by non-human entities would give me the same feeling.
>Does it matter to you whether you're interacting with a human on some level when watching a show or movie, specifically on an artistry level?
Yes, it matters to me a great deal. But there's a reason Stephen King made that observation a long time ago. All the actors in a modern Marvel movie look like they've been grown in some petri dish in a Hollywood basement, and for the last fifteen years all the lines have sounded like they came from LLMs. There's been nothing recognizably human in mass media for decades. 90% of modern movies are already asexual Ken-doll-like actors jumping around in front of green screens to the demands of market research reports.
I'm not saying the scenario isn't scary. I'm saying we've been in that hellscape for ages, and the particular implementation details of the technologies used to get us there ("AI" in this case) don't interest me that much. And in the same vein, an authentic artist can surely make something human with AI tools.