I have a co-worker who maintains that TRUE used to be defined as 0 and all other values were FALSE. I could swear that in every language I've worked with, if you could even get a concrete value for TRUE or FALSE, the value for FALSE was 0.
It's easy to get confused, because bash's true/false exit statuses are the other way around from what you'd expect:
$ false; echo $?
1
$ true; echo $?
0
General rule:
Shells (DOS included) use "0" to mean "no error"... which is not necessarily the same thing as "true".
Programming languages use non-zero to denote true.
That said, if you're in a language that lets you define TRUE or FALSE, define them and always use the constants.
In any language I've ever worked in (going back to BASIC in the late 70s), false has been considered 0 and true has been non-zero.
In languages like C there was no built-in boolean type, so you had to define your own. Could your co-worker have worked with a non-standard BOOL override?