> But somebody somewhere is reading this and thinking this is a "semantically correct pattern" (as it is introduced) and may just copy-paste it into their program.
I dare say that would be their fault for blindly copying and pasting without taking the time to understand the context. (He even gives an explicit disclaimer!) Robust error handling would just be more noise to filter through for people actually reading the article, and I don't think it's the author's responsibility to childproof things for people who aren't.
I'd agree, except that I've seen too many examples where people, particularly those who still have things to learn, cite blog snippets as authoritative. IMO we have something of a duty to those people to get it right. In this case it's not a lot of effort to get it right. Relying on the side-effects of an assert() is not getting it right.
The fact that I got a reply based on a misunderstanding of how asserts work tells me it's a point that needs to be made.
I'm just a bystander, but I think you may be jumping to unfounded conclusions here. Based on previous comment history, I presume that 'masklinn' understands perfectly well how assert() works. Yes, if you define NDEBUG your error handling will go away. So don't define NDEBUG unless you want your error handling to go away!
By contrast, your assertion that "Some compilers will set that for you in an optimized build" strikes me as unlikely. Some program-specific build systems do this, and if you use one of them you should be aware that your assert() calls may drop out. But I don't think I've ever used a compiler that drops the assert() based on optimization level.
I don't particularly disagree with your conclusion, just your argument. I think 'awda' gets closer to the truth: the default assert() from <assert.h>, with its negative reliance on NDEBUG, is tricky and probably best avoided -- not just for error handling but altogether. Personally, I use two distinct macros: ERROR_ASSERT() and DEBUG_ASSERT(). ERROR_ASSERT() cannot be disabled, and DEBUG_ASSERT() only runs if DEBUG is defined at compile time.
> your assertion that "Some compilers will set that for you in an optimized build" strikes me as unlikely
Uhh, I didn't make it up. I remember now what I was thinking of: the defaults for Visual Studio (not the compiler, the IDE) are to have -DNDEBUG in release mode. So lots of Windows projects end up having it without the authors explicitly asking.
(I thought I also once used a machine, maybe some obscure Unix, where cc would add it if you specified -O. I don't remember the details of that, or if I might be confusing it with what VS did.)
FWIW I don't think it's weird that assert has this quirk; I think some people in this discussion just disagree about what an assert is. If you think of it as an extra debug check that might not be evaluated and should not have side effects, and are fine with that conceptually, there's no problem.