
I think when parsing that statement it's important to understand his (and your) definition of "programmer".

We (I) tend to use the term "programmer" in a generic way, encompassing a bunch of tasks waaay beyond "just typing in new code". Whereas I suspect he used it in the narrowest possible definition (literally, code-typer).

My day job (which I call programming) consists of needs analysis, data-modelling, workflow and UI design, coding, documenting, presenting, iterating, debugging, extending, and cycling through this loop multiple times. All while collaborating with customers, managers, co-workers, check-writers and so on.

AI can do -some- of that. And it can do small bits of it really well. It will improve in some of the other bits.

Plus, a new job description will appear: "prompt engineer".

As an aside, I prefer the term "software developer" for what I do; I think it's a better description than "programmer".

Maybe one day there'll be an AI that can do software development. Developers that don't need to eat, sleep, or take a piss. But not today.

(P.S. to companies looking to make money with AI - make them able to replace me in Zoom meetings. I'd pay for that...)



There are almost no programmers today (you need to do malloc and low-level syscalls in C to be considered a programmer).


That's right. We invented programming AI a very long time ago, and called it an "assembler". All you had to do was tell the assembler what kind of program you wanted, and it would do the programming work for you!

Then we invented another AI to tell the assembler what kind of program you wanted, and called it a "compiler". All you had to do was tell the compiler what kind of program you wanted it to tell the assembler you wanted, and it would do all the not-exactly-programming work for you!

And so on...
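To make the layering concrete, here's a minimal sketch in C (the file name and exact flags are illustrative, but `cc -S` and `as` are the standard steps on most Unix toolchains):

    /* add.c -- you tell the "compiler AI" what kind of program you want... */
    #include <stdio.h>

    int add(int a, int b) { return a + b; }

    int main(void) {
        printf("%d\n", add(2, 3));  /* prints 5 */
        return 0;
    }

    /* ...then watch each layer do the programming work for you:
     *   cc -S add.c        # compiler emits assembly (add.s)
     *   as add.s -o add.o  # assembler emits machine code
     *   cc add.o -o add    # linker produces the executable
     */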


P.S. Visual Basic with its GUI designer was a quite effective way to rapidly build apps of questionable quality but great business value. Somebody should bring that paradigm back.


The paradigm never left :)

My day job is programming in an environment which originated in the mid 90s. A contemporary of the Visual Basic era, but somewhat more powerful, and requiring substantially less code.

While I, and a few thousand others, still use it (and it gets updated every couple of years or so), it has never been fashionable. Ironically, because it's perceived as "not real programming".

We routinely build systems with hundreds of thousands of lines of code, much of it started in the 90s and added to for 25 years. Most of it was built, and worked on, by individuals or very small teams. Much of it is still in production today, doing the boring business software that keeps the lights on.

But it's not "mainstream", because programmers pick languages based on popularity, and enterprises pick programmers based on language. A self-fueling cycle of risk aversion.

A lucky few, though, hopped off the treadmill a long time ago and "followed a path less travelled by". And that has made all the difference.


Excellent, as long as you jump ship once the complexity goes over 5 kloc.


I don't think you can be considered a programmer if you can't write your own syscall firmware code in assembly.


I don't think you can be considered a programmer if you can't perforate your own punch cards.




