A non-standard book list for software developers (mihaiolteanu.me)
450 points by molteanu on March 12, 2022 | hide | past | favorite | 104 comments


On a similar topic of books about system design, someone on HN recommended A Pattern Language and The Timeless Way of Building by Christopher Alexander et al.

They're not about designing distributed software systems... they're a complete design language for the proper design of a city, from the grossest elements to the most granular. You can flip to a random page and understand what makes a front yard really fulfill its purpose, or you can flip to a different page and learn why some public plazas just don't get used, while others flourish.

It's a template system for designing a room, a house, a neighborhood, a city, etc.

Makes you think about what a component is, what makes it "good" or "bad" at what it's supposed to do, and what makes it harmoniously integrate with other components.


Alexander is fun for sure for thinking about how we got where we are (I suppose if you were able to build your own house it could also be directly used to achieve various goals, which would be fun).

Along similar lines I would recommend Donella Meadows' work, for example her essay Leverage Points[1].

1: https://donellameadows.org/archives/leverage-points-places-t...


https://dreamsongs.com/Files/PatternsOfSoftware.pdf is a great read that cites Alexander's work and talks about how that influenced OO design. Very accessible too


25 years ago my (then) wife and I both read Christopher Alexander while designing a house. It was great.

I'd heard of him, and read those books, because I was (am) a computer programmer, and that was before we decided to build a house!


I haven't heard of this book. Is it actually about designing cities? I'm probably way off, but all I can think of is that once we started designing cities we built crap. Take the UK, for example: all its beautiful, amazing places were built before we had planning offices. They seem to have just evolved.


No, houses. Each "chapter" is just a page (well both sides of a leaf), sort of like The Society of Mind.

One example is (paraphrased) "find the nicest spot on the property and don't build on it, because then you'll lose it". Another is "corridors are dead spaces; you move through them, and pretty much ignore them, on your way to do something". Not gospel or absolute rules; rather rules of thumb, or ways to talk about and think about what we wanted to accomplish.


Thanks, that sounds really interesting.


If you're interested in his thoughts on cities, have a look at his "A City is not a Tree".

https://www.patternlanguage.com/archive/cityisnotatree.html


I would suggest Essence of Decision (https://en.wikipedia.org/wiki/Essence_of_Decision)

Because when you're working on a larger software team, or even a small software team within a larger company, understanding how organizations actually understand the world and make decisions is invaluable.

A person can drive themself crazy expecting groups of people to behave like a monolithic rational agent.


Thanks, with the recent HN ruminations on late Von Neumann strategies this is refreshing.


I feel like there must be better mathematics books than those two if what you want is a taste of mathematics. What you get from Euclid is the idea of proofs from a set of axioms, but Euclid in the original is quite a painful way to get introduced to that. And I've never understood the obsession with Spivak. Of all the calculus and analysis books I read, it definitely wasn't up there as anything special.

Not sure what the best books are these days to give people a flavour of university-style mathematics. I quite liked Introductory Algebra and Analysis by Geoff Smith when I was teaching first-year mathematics years ago, but that was published some 25 years ago, so I imagine it has been superseded.


> I've never understood the obsession with Spivak

Maybe one can be so happy about having made it through such a tough subject that it has to be a good book? I mean, do people normally study from more than one text?!

> ...to give people a flavour of university style mathematics

I've just started The Foundations of Mathematics by Ian Stewart & David Tall. So far it's really promising.


On the note of Euclid's Elements, I took a short course on Plato's "Theory of Forms" when I was just learning to really code and it's what helped OOP click in my head. It's fun when totally random subjects help you get an insight into one another.

https://en.wikipedia.org/wiki/Theory_of_forms


I learned to program, and only later read a book on metaphysics, and was rather surprised to find that the book was entirely about software design, although the authors did not seem to be aware of this.


Would you happen to remember which book that was?


I don't, I'm afraid. It was probably the shortest book on metaphysics available in my town library at the time.


Can you ELI5 metaphysics?


"Applied metaphysics

Although metaphysics as a philosophical enterprise is highly hypothetical, it also has practical application in most other branches of philosophy, science, and now also information technology. Such areas generally assume some basic ontology (such as a system of objects, properties, classes, and space-time) as well as other metaphysical stances on topics such as causality and agency, then build their own particular theories upon these.":

https://en.m.wikipedia.org/wiki/Metaphysics


To extend and elaborate on the analogy further: Plato's forms is class-based OOP. The blueprint is categorically different from the objects it generates, it is the sole source of the objects it generates, and it always comes before the objects it generates. It's also a complete description of every single object it generates: there is not a single thing you can do to the generated objects that is not precisely described and predicted by the blueprint*.

There's another way: Nominalism. Abstract blueprints and concepts don't exist; they are just artificial markers we put on sets of concrete objects to express some degree of similarity. That is, there's no such thing as a 'Tree' in the abstract, there's only the green thing with roots and branches under your house and the green thing with roots and branches under mine. We gave them (and countless other green things with roots and branches) the same name merely to express that we view them as equal under some circumstances, but that name doesn't express anything about the concrete objects themselves, just our brains; each object is completely unique and incomparable to any other in any objective way.

Nominalism is similar to prototypical OOP, there is no such thing as a blueprint, concrete objects spring fully-formed into existence directly from the mind of their creator, or they are cloned from other concrete objects. If blueprints exist at all, they are only external descriptions by us observers to denote similarities: that object has x: and y: attributes and that object also has x: and y: attributes (possibly because it's a clone of the first) so they are the same 'type', but those objects don't 'know' they are the same type (well, they do if they're clones, as they point to the same parent, but whatever), and the universe allows any object to grow and shrink and morph independently of any other, similarities are imposed/discovered solely by observation from the outside.

*: This principle might be violated in dynamic object systems with sophisticated meta-object protocols, where the blueprint itself is just an object that you can manipulate and mutate at runtime in ways not covered by the static description.
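A minimal Python sketch of the contrast (purely illustrative, names and values are made up): class-based objects are fully described by their class, prototype-style objects are just copies of other concrete objects, and, as the footnote warns, a dynamic system lets you mutate the "blueprint" itself at runtime.

```python
import copy

# Class-based view: the blueprint exists before, and fully describes, its objects.
class Tree:
    def __init__(self, height):
        self.height = height

    def describe(self):
        return f"a tree {self.height}m tall"

oak = Tree(20)

# The footnote's caveat: in a dynamic object system the blueprint is itself
# an object; mutating it at runtime gives instances behavior the original
# static description never mentioned.
Tree.describe = lambda self: f"a green thing with roots, {self.height}m tall"

# Prototype-style view: no blueprint, only concrete objects cloned from
# other concrete objects, each free to morph independently afterwards.
proto = {"height": 20, "branches": 12}
clone = copy.copy(proto)   # born as a copy of another concrete object
clone["height"] = 5        # diverges from its "parent"; no shared type involved
```

Python is class-based, so the dict-copying here only mimics the prototypal style; JavaScript's `Object.create` is the more literal incarnation.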


Yes, those Forms have had a lasting impact on Western Philosophy for a long time. I wouldn't be surprised if some of it hasn't had some impact on the OOP thinking, in general, even though indirectly. If you read on Hume and the British Empiricists from the 18th Century, for example, you'd be surprised how many "modern ideas" you find in there. After all, programming languages at large have a long history of philosophical ideas before them. Before they could come into being, that is.


Absolutely this! Some Neoplatonist literature distinguishes between God qua engineer of forms and God qua reifier of them. And there's object inheritance implicit in their hierarchy of forms/angels, i.e. Eriugena's "that which creates and is created".

Platonism is like ontological OOP. :D


> Proclus (ca. 335 BC)

Proclus, whose quote opens the section on Euclid, lived more than 700 years after this date, well into the 5th century AD. Euclid himself wasn't born until about a decade after it, around 325 BC.


That is true. I've updated the info, though I couldn't quite find the exact date of the publication of "Proclus' Commentary on the First Book of Euclid's Elements".

Good catch! Thanks for the heads-up!


Just wanted to thank the author for his project of translating Lessons In Electronic Circuits into Romanian. It was so useful to me when I was a student at UT Cluj. I remember recommending that website to all my peers.

Thanks for your amazing effort!


I was a student at UTCN as well, and I found it hard to get a decent book on electronics to learn from. I found Lessons in Electronic Circuits online, used it, and then translated it for my fellow engineers; that's how much I liked it. Unfortunately the original website got hacked, so I did my best and recovered the content in its current format.

You're welcome! Thanks for your input!


A good non-standard Applied Maths text is Methods of Mathematics Applied to Calculus, Probability, and Statistics by Richard Hamming.


Has anyone here read Lisp in Small Pieces? I've never used Lisp but would still like to read it. Would you recommend it? (It's not cheap, otherwise I'd just buy it without the due diligence.)


I've read most of it. It's excellent. You have to pay a lot of attention to get anything out of it, much more than is required for SICP, PAIP, or The Little Schemer. You will spend enough time at it that the monetary outlay will be insignificant...


In my opinion it's verbose and more complicated than necessary. If you just want to understand programming language design with a lispy flavor I would suggest Essentials of Programming Languages by Friedman, Wand (not much theory, scheme implementations of everything) and Design Concepts in Programming Languages by Gifford, Turbak, Sheldon (which uses s-exps to represent ASTs, but otherwise is not lisp centric). If you already know some PL theory and don't care too much about lisp, Bob Harper's Practical Foundations for Programming Languages is great, full of interesting topics (but it doesn't hold your hand as much as the previous two)


The content is of great quality, that's for sure. I don't regret having a copy.


You can also do an HN search for it. Here is one excerpt from an HN comment: "... So for example, a book like "Lisp in Small Pieces", which taught me the concept of Continuations, was considerably hard on the conceptual side. This also took at least a month of conscious connecting the dots between what was in the text, tail-recursion, and those funny callback functions I've been writing in nodejs, before i "Got it".

https://news.ycombinator.com/item?id=2418459


You can find it on the typical pirate sites (pdf and epub), I haven't been able to find sample chapters elsewhere. Of course, if you like it, buy the original to get the physical book :)


I wouldn’t recommend it unless you are already an expert in Lisp.


I can't comment on the compsci books (they do seem interesting), but I can comment on the mathematical ones where I have expertise:

- Using Euclid is manifestly a really bad idea, since his way of formalizing geometry is not the sharpest. Mathematical logic has developed since Euclid published his book thousands of years ago, and Euclidean geometry has been re-formalized a number of times to really flesh out the theory behind it (where the word 'theory' is a precisely defined mathematical term), the most well known being perhaps the one by Hilbert (itself still 100 years old).

- Motivating Spivak with "the most important thing to learn is the method. That is, to develop a method for thinking, based on demonstration following a fixed and known set of starting-points or axioms ...". This can actually be said of any mathematical theory (here I use the word 'theory' in its colloquial meaning). Studying calculus in particular makes little sense for compsci. Rather, graph theory or abstract algebra might be more worthwhile to learn - basically any subject that touches theoretical computer science significantly.


They aren't recommending Euclid as a way to learn geometry; they recommend him because his book has been the foundation of the scientific dialogue for more than two millennia. Regardless of whether his formalization of geometry is the most up-to-date, every subsequent mathematician and logician has been building on The Elements. That fact alone makes it valuable to read, not as a way to learn geometry, but as a way to understand where we came from.


There's a little story told about the philosopher Thomas Hobbes, by John Aubrey:

"He was (vide his life) 40 yeares old before he looked on geometry; which happened accidentally. Being in a gentleman's library in ..., Euclid's Elements lay open, and 'twas the 47 El. libri I[Pythagorean theorem]. He read the proposition. 'By G--,' sayd he, 'this is impossible!' So he reads the demonstration of it, which referred him back to such a proposition; which proposition he read. That referred him back to another, which he also read. Et sic deinceps, that at last he was demonstratively convinced of that trueth. This made him in love with geometry."

Reading all of Euclid's Elements is a big undertaking, though I think you can get a sufficient taste and appreciation of iron-clad logical reasoning and demonstration from going through some subset of it. Perhaps working through a single proposition back through its base propositions, definitions, common notions, and postulates will be enough to alter your thinking and at least show what's possible. Unlike the author, I can't tell from discussion sampling whether people are ignorant of Euclid, but I have been frustrated by 1-on-1 failures to win arguments by mathematical proof (e.g. that 0.999... = 1), and such an unwillingness to accept such things shows a profound disconnect in how we look at the world. Reading Euclid may help that.
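The 0.999... = 1 argument mentioned above can be run as a short chain of steps in the Euclidean spirit (this is the standard informal sketch; a fully rigorous version defines 0.999... as the limit of a geometric series):

```latex
\begin{align*}
x       &= 0.999\ldots \\
10x     &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots = 9 \\
9x      &= 9 \;\Rightarrow\; x = 1
\end{align*}
```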

On the other hand, lots of software developers are exposed to proofs as part of their formal education (those who received one), so how much Euclid can add here is questionable, vs. actually putting proofs into practice by learning things like TLA+, or learning about probabilistic inference, which is more needed outside crisp and clear worlds like Euclidean geometry. Personally, I'd sooner have software developers take a few minutes to learn and reflect on Chesterton's Fence than work through examples of geometry proofs, and maybe some can learn to reduce their bad habit of sloppily "reconstructing things from first principles" (where "first" is frequently "first thought of").


Then you, like Moicanu below, didn't understand my point: you learn less from the Elements than from later, curated sources. Analogously, no one would advocate reading Newton's Principia to "understand where we came from". You would first learn the theory (geometry resp. physics) from up-to-date sources, then read a commentary on the original (which is often longer than the original), and only then would you be able to get something out of reading a translation of the original.

Thus, by reading the original works you will confuse yourself with the obscure notation you encounter, semi-circular arguments, and other problems (both the Elements and the Principia are riddled with them), which have in fact prompted many scholarly works commenting on and fixing these.


I've read the 'original' Elements, indeed. Apart from the geometric figures which were not even available in the original form of the manuscript, but added later as an aid, there is no obscure notation whatsoever. Indeed, there is no notation at all. It's all prose.


If you want to read it more as a historical piece of art I wouldn't object; if you want to actually get some mathematical insight out of it on the other hand, this would be a good starting point:

https://math.stackexchange.com/questions/845288/has-any-erro...


One of the missing pieces in education is surfacing the problems, dead ends, and struggles of different sorts. A shame, because it's such an insightful part; it often allows grasping the underlying reason for why things are the way they are.


Nicely said.


>Studying calculus in particular makes little sens for compsci.

yeah it makes little sense if you don't want to be able to write programs that solve the world's major problems

so if you are content with writing cat picture apps, skip calculus


We were just talking about this over lunch yesterday (software engineers at a medical device company). I won’t dismiss calculus, but it was generally agreed that linear algebra and statistics were most important, at least for what we’re doing.


Must be something common about medical devices. When not doing linear algebra and stats, I need optimizations from the wheelhouses of linear programming and numerical analysis.


Although for statistics (and probability as well), Calculus has a way of sneaking in there.


not sure why anyone downvoted this, because in my stats for engineers class we were doing double integrals


Also I'm not so bearish on letting people choose not to be smarter.

I think society is better off giving people the best means to achieve the intellectual goal they want, rather than forcing a greater number to learn beyond what they are compelled to. The kind of people I enjoy talking to are all the kind of people who would learn this stuff anyway as a matter of habit. My bookshelf is not full because of a state mandate.


You need a lot more than calculus to actually be useful at doing that though, unless you mean just throwing AI at the problem.

If you want to write a CFD solver then you need a lot more than just blindly learnt calculus. If you want to blow up the world using fuzzily applied financial models then you might not.... (you do actually)


I'm learning ML now from an accomplished researcher.

To understand the courses we need stats, linear algebra, and some Calc 3.


Having an understanding of mathematical foundations and proofs can be very beneficial to software developers, but I agree there are much better books than the ones suggested here. "Elements" is a historically important book (arguably the most important in math) but, like you said, is fairly outdated. Real analysis is critical for higher-level maths and theoretical computer science, and does have some value even in software development. But with books like Spivak's Calculus you spend more time memorizing definitions and theorems than on abstract thinking or problem solving.

Some better math books that I would recommend off the top-of-my-head:

* "How to Solve it" by George Polya - A great book on breaking down complex problems.

* "Mathematical Logic" by Stephen Kleene - A much more contemporary math book on building axiomatic theories from scratch.

* "Gödel, Escher, Bach" by Douglas Hofstadter - Also about mathematical foundations but for a much broader audience.


I agree that for anyone wanting to improve "logical thinking" these books are a good start. In particular the last one is recommended, if you want to start having doubts about logic itself and the limits of logical reasoning ;)


If one wanted to read all three of those, what order would you recommend?


I am a big proponent of first-principles learning, but learning a completely adjacent field to be a better developer is like in The Karate Kid, when he learns to block punches by sanding a deck. Even if the skill transfers, it's massively less efficient than just practicing the thing you want to get good at.


> Studying calculus in particular makes little sens for compsci. Rather, graph theory or abstract algebra might be more worthwhile to learn - basically any subject that touches theoretical computer science significantly.

Unless of course you have any interest in machine learning.


I think if one were looking for a more philosophical approach to calculus, Apostol would be a good choice.

I personally have found calculus to be useful in a fair number of domains I find interesting. Anywhere computers process signals from the physical world, it tends to show up, whether it be sound, vision, or other signal types entirely.

Not to say that graph theory and abstract algebra aren't important (especially in things like discrete algorithms or cryptography); rather, if you want to build systems that interface with the physical world, you need the original language of physics, as well as all the linear systems and sampling theory to go with it.


You've missed the whole point.


Could you elaborate? "You've missed the whole point" is completely non-helpful.


Quoting the article's first paragraph is, I think, the best response:

"I view the field of software development as a big logical system with highly interconnected and complex parts. Understanding such a big systems naturally requires having an excellent grasp on the tools used to build them. And the most fundamental one is logic itself. What follows from what, what are the starting-points or the elementary parts, what are the ways of composing these into more complex ones, ways to spot complete non-sense in the system and how to decompose the complex parts back to more fundamental ones to check their consistency and truth. The following book list contain titles that are all playing fields for one to start experimenting with such systems and gain the required confidence that one can master them. "

The recommendations are from that point of view, and they are the author's selection from books he actually read and considers good for that purpose.


Standards of rigor evolve with time and the level of rigor in the Elements is abysmal compared to today.


@throwaquestion5, thank you!


Spivak, 2nd Edition, is how I taught myself calculus. Teaching myself calculus was the deal I made so that the head of the university mathematics department would let me take numerical analysis as a nonmatric. I have Pearl's ("fancy curve fitting") _Heuristics_ on my bookshelf as well.


This looks great - thank you for sharing. Now, if this (getting rapidly older) software engineer just knew how to find a mid-(late?)-career job change where the interview consisted of geeking out about these books instead of whether I know how to reverse a queue with the least amount of memory...


Good books, but still quite standard. Spivak's is standard in initial calculus courses at universities.

If you really wanted non-standard but relevant authors I would add Feyerabend, Kuhn, Kahneman, Dostoyevsky, Borges, Taleb, Montaigne, Popper, Hofstadter, Don Norman, Alexandrescu. Add something about systems of representation by Kierkegaard or Nietzche or some interpretations by later authors.


Hofstadter is fairly near standard, isn't it? Assuming you mean actually reading GEB and not The Mind's I or I Am a Strange Loop. Not generally a course text, but very common reading among CS/AI students. Some of whom even read all the way to the end instead of just pretending to!

The book I would most recommend software developers read is Mindstorms: Children, Computers and Powerful Ideas by Seymour Papert.

It's a lot shorter. And probably more valuable.

And one random side recommendation: The Daughter Of Time by Josephine Tey. It's a narrative exploration of how to challenge consensus belief, explore gut instincts and follow research leads, expressed as a perception-challenging historical review of Richard III by a fictional detective.

The BBC audio book is worth a go. My copy of this book (which was my mum's copy) is one of the few things I'd rescue from the hypothetical fire. It stays with you.


I remember reading Papert's Mindstorms and being reminded of the fun you can have with computers, these powerful modelling machines - as Douglas Adams put it too.


I found Hofstadter's _Le Ton beau de Marot_ fascinating, and also almost nobody has even heard of it!


I had not heard of it either. Very interesting. Thanks!


Hofstadter's amazing, mind-boggling satirical essay/alternate history A Person Paper on Purity in Language, from Metamagical Themas, is also worth a read. Much more fun than the title suggests!

https://www.cs.virginia.edu/~evans/cs655/readings/purity.htm...


I think Spivak is far less common as an initial calc text than Stewart. Spivak's approach is much more rigorous than most calc textbooks, and is in many ways almost an introduction to real analysis. Note, for example, that Spivak treats the definite integral as the proper form of the integral, rather than the typical approach taken by many introductory texts, where the integral is viewed as the anti-derivative and for ∫_a^b f(x) dx they just take F(b) - F(a) and call it a day. I would imagine that most introductory calculus students do close to zero proofs, with most of the time spent memorizing the various rubrics for finding derivatives and anti-derivatives. Spivak covers those, but the exercises are at least 50% proofs rather than 90–100% calculation.
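The two views of the integral can be put side by side in a few lines of Python (a made-up example, not taken from either book): the definite integral computed directly as a limit of sums, versus the F(b) - F(a) shortcut via an antiderivative.

```python
def riemann_sum(f, a, b, n=100_000):
    """Definite integral as a (midpoint) sum - the view Spivak emphasizes."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

f = lambda x: 3 * x**2
F = lambda x: x**3            # an antiderivative of f

by_summation = riemann_sum(f, 0, 1)   # approaches 1 as n grows
by_theorem = F(1) - F(0)              # exactly 1, by the fundamental theorem
```

The fundamental theorem of calculus is exactly the statement that these two computations agree; the antiderivative trick is the payoff, not the definition.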


The Calculus pick is completely fungible. There are at least two other popular Calc books that different people would say the exact same things about.

Euclid as the ideal place to learn logic seems debatable, although I can't think of a better resource off the top of my head.

The Xinu OS and LISP recommendations I've never seen before, so those were potentially valuable for me at least.


What calc books are those? Reading a calculus book is a massive investment because I will be rereading pages. So I’d like to find the best one first


_Calculus Made Easy_ by Thompson (now in a new edition revised by Martin Gardner) was a former classic used by all sorts of people, and Feynman learned calculus from _Calculus for the Practical Man_, which was hardly sophisticated by modern standards. As long as you stick to a book some sizeable number of people think is okay, I don't think it matters that much what you learn from; the important thing is that you work through theorems carefully enough that you really understand them. Whether it's Spivak or Velleman or Thompson or OpenStax or Stewart (only for those who are assigned it, since otherwise there's no reason to spend >=$100 on it—it's not worth the premium) or Ross (his _Elementary Analysis_ book), worry about learning the concepts more than worrying about which book you chose. You can always refer to alternative books or youtube videos if a concept isn't clear in the primary book you chose.


Others have already mentioned the classics, but if you're looking for something more off the beaten path, you can check out my book: No Bullshit Guide to Math & Physics which covers calculus in approx. 150 pages (Chapter 5).

You can see an extended preview of it here: https://minireference.com/static/excerpts/noBSmathphys_v5_pr...

And a standalone concept map here: https://minireference.com/static/conceptmaps/math_and_physic... (bottom of the page)

Check out the reviews on amazon if you're interested. It hasn't been adopted as the main textbook at any university yet, but many teachers recommend it as supplemental reading for their courses.


I, for one, couldn't find a better one at the time I read it. There are a lot of crappy ones, on the other hand. You are right in saying that this is a massive investment.


Depends enormously on your level and willingness and dedication. At that level the best ones are Spivak, Apostol, Fichtenholtz, Kudryavtsev.


Just be careful not to mix up the "best one at teaching" with the "best one in level of detail and correctness". One is good for aha moments, the other for going into research. Depends on your needs. Similar to the question of "what is the best language to learn?" Depends on your needs.


If I remember right, Apostol uses a non-standard sequence (integration first, then differentiation). That probably only matters if you might be using a different book for another part of the sequence, though.


I self-taught from an early edition of Stewart and loved it.


What book would you recommend for learning practical calculus from? I haven't touched the subject since high school.


https://tutorial.math.lamar.edu/ is a good place to find answers to things.

I learnt calculus from a copy of a book called "Calculus: A Complete Course", which I quite liked but have literally never seen anywhere else. Getting that book basically by accident completely changed my future.


Thank you!



What about How To Prove It by Velleman as a substitute?


For a truly non-standard, very good Calculus book, I recommend Elementary Calculus: An Infinitesimal Approach, by Jerome Keisler. It's available for free at https://people.math.wisc.edu/~keisler/calc.html


It's horribly inaccessible but while we're throwing out truly nonstandard recommendations, and apropos to the connection made elsewhere in this thread between the theory of forms and object-oriented programming, Whitehead's Process and Reality is good for the same reasons OOP sucks.


Holy crap, the Comer XINU book is still around. I used that in my undergrad Systems course in the mid ‘80s…


The most recent version targets the BeagleBone Black and the Intel Galileo (now obsolete, although you can still pick them up). I've encountered Raspberry Pi repos on GitHub as well. I've found it to be a great book, since it gets you working with hardware very early on.


In many universities those books are quite standard on a Computer/Informatics Engineering degree.


Anyone got a non-standard Distributed Systems book pick? Also a non-standard Concurrency book pick?


Gonna throw in one of my own picks for concurrency: a few months ago I landed on a book called "The Little Book of Semaphores", on synchronization, with examples in Python: https://greenteapress.com/wp/semaphores/
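For a taste, here's a sketch of the book's first classic exercise, the "rendezvous", using Python's threading.Semaphore (my own minimal rendering, not the book's code): each thread signals its own arrival and then waits for the other, so neither can pass the meeting point alone.

```python
import threading

a_arrived = threading.Semaphore(0)
b_arrived = threading.Semaphore(0)
log = []   # records the interleaving for inspection

def thread_a():
    log.append("a1")
    a_arrived.release()   # signal: A has reached the rendezvous
    b_arrived.acquire()   # wait until B has reached it too
    log.append("a2")

def thread_b():
    log.append("b1")
    b_arrived.release()   # signal: B has reached the rendezvous
    a_arrived.acquire()   # wait until A has reached it too
    log.append("b2")

ta = threading.Thread(target=thread_a)
tb = threading.Thread(target=thread_b)
ta.start(); tb.start()
ta.join(); tb.join()
# Whatever the scheduling, both "1" entries precede the other thread's "2".
```

The semaphores start at 0, so each acquire blocks until the matching release; that's the whole trick, and the book builds barriers, mutexes, and the classic synchronization puzzles from this same primitive.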


Operating Systems: Three Easy Pieces, https://pages.cs.wisc.edu/~remzi/OSTEP/


Foundations of Multithreaded, Parallel, and Distributed Programming by Gregory Andrews. All three paradigms of concurrency are treated clearly within one pair of covers.



These are some good picks but kind of strange they don’t mention concurrency once on the page.


Slightly left field but Specifying Systems by Lamport?


I like the list. But, Lisp?


I don't understand why old programmers are so fascinated by Lisp. If it had been any good, it would have survived to today.


Because we were educated before the "flavor of the month" JavaScript frameworks... and because of that we can appreciate elegant and practical tools and ideas. See also Emacs and Vim.


...posted on a site that was developed in a Lisp dialect.


But it has survived - Clojure, a JVM lisp dialect, flourishes!

I feel part of what makes Lisp fascinating is its simplicity and power, leading to its place as a test bed for ideas such as continuations and Kanren, a logic programming language.


It’s almost like they understand something that you don’t.


1. It's all their feeble minds can understand. (This is the answer you were fishing for, yes?)

2. Too many MIT grads were brainwashed by SICP until MIT sensibly switched to Python in 2009. (Unfortunately they probably still secretly teach SICP somewhere.)

3. Lisp/Scheme readily implements "Maxwell's equations of software" which are believed to be important or useful for some reason.

/s


I guess someone didn't see the /s or maybe OP was not amused.


'(Quality troll thanks)



