Hacker News | matthewsinclair's comments

Congrats! Nice work. I’m a fellow “Phoenix |> Elixir” fan and built something similar here [0]. I’m still ironing the kinks out of mine, but I’ve moved all of my personal sites over to it and it’s been running for almost a year now.

[0]: https://laksa.io


And there’s also the Ash framework for Elixir.


Totally agree with this, too! The books are great and I’ve read all four a few times over, but the audiobooks are something else. The guy who reads them manages to strike a near perfect balance between “reading” and “acting” that is just such a pleasure to listen to. I think we must have listened to it beginning to end about another three or four times as a family during long car trips!


That's the thing. It is acting. All the characters are played by actors, and they simply nail their parts. You're right that it's an almost perfect mix: the acting never changes the tone or gets in the way of the content.

It is legitimately the best audiobook I've ever heard. I think it's because their voices, tone, and cadence match what I hear in my head when reading the book.


100%. One of the genuinely great writers.


Vale Dan Simmons. You brought the world a _lot_ of joy.


Hello from the other side!

I was about 45 before I realised that when people said “in my mind’s eye” it was substantially more than a metaphor.

And it wasn’t until about a year later that I realised that I also didn’t have what ordinary people refer to as an inner monologue.

Realising that I had both aphantasia and anendophasia was quite a shock, but has never felt to me like I was missing anything.

For images I literally have nothing “pictorial” or “graphical” at all, but concepts and relationships are “vivid”. And for the inner monologue, there’s no autonomic voice at all, but if I concentrate in the same way that someone might “consciously breathe” I can kinda sorta trigger something.

Interestingly, in periods where I have meditated for >20min per day for consecutive weeks, I can trigger what I refer to as “flyover mode” which is like a literal landscape flyover that feels like a 4K screensaver. But this is _rare_ and requires a huge amount of effort.

Weird, eh!?


Oh hi! Do you have vivid dreams??

I don’t know if it’s a blessing or a curse, but I generally have extremely vivid dreams. I also have a very active inner-monologue and can do the whole “picture a red apple on a green lawn with 3 yellow dots on it, pick it up and rotate it and track the dots” kind of thing. As it happens I’m a visual learner and a voracious reader.

I’ve never thought about whether any of this was connected before. I should do some research.


I have dreams, but they're almost always non-visual, at least in the traditional sense, or compared to other people I know who recount their dreams to me. I very rarely "remember" (if that's the right word) them in any detail. I sometimes (rarely) have visual dreams, but that almost always happens after I've had a good night's sleep, woken up, and then fallen back asleep. I think it might have something to do with how deeply I sleep. A train could drive past my bed and I wouldn't wake up.


I agree!

“Why LLM-Powered Programming is More Mech Suit Than Artificial Human”

https://matthewsinclair.com/blog/0178-why-llm-powered-progra...


I agree [0]. Well, taxed rather than banned. But we’re in the same postcode.

[0]: https://matthewsinclair.com/blog/0177-what-if-we-taxed-adver...


Here’s a counterpoint that attempts to argue an alternative view:

https://xeiaso.net/blog/2026/markdownlang/


StrongDM is shipping software in "markdownlang": https://github.com/strongdm/attractor


Indeed. There’s definitely a bit of that in there, no doubt. But having read it, I think it’s one of those articles that might be off on some details but is in the ballpark in terms of trajectory.


The trajectory is simple: everyone gets hooked on paid AI subscriptions and loses the skill of thinking and learning. The models are proprietary and cloud-only. Everything you input is also monetized. The end.


If AI takes over everything even remotely in the radius of "knowledge work" and leaves 25% of workers jobless, where is all the demand for this new "productivity" going to come from? Robots buying stuff from each other? Our benevolent billionaire overlords stepping up their consumption to compensate?


We don't know and that's why the next few years are not looking bright. This is going to massively accelerate existing income inequality until it breaks our economy. Some have suggested universal basic income as the relief, but good luck getting the United States Congress to deliver that for its people. We're talking something much bigger than The New Deal and that only happened because the fucking Great Depression happened first.

