Right out of grad school at Lucent I was given a compiler and interpreter to maintain, written in PL/I. The language being compiled described a complex set of integrity constraints, and the interpreter ran those constraints against a large data set that would later be formed into the initial database controlling the 4ESS switch. The 4ESS was, and still is, a circuit switch for long-distance voice traffic.
The code was a jumble. There was a label to branch to called "NORTH40". I asked the original developer what it meant.
"That's where the range checks are done, you know, the checks to make sure each field has a correct value."
"But why 'NORTH40'?"
"You know, 'Home, home on the range.'"
"Huh?"
Turned out 'NORTH40' meant a farm's north 40 acres, which in his city-bred mind had an obscure connection to a cattle ranch.
Another module had two arrays called TORY and DIREC that were always updated in parallel, an obviously misguided attempt to model a single array of data pairs. I couldn't figure out the names and asked the developer. Turned out they were meant to be read together: "direc-tory". Great.
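For anyone who has not run into the parallel-array pattern, here is a rough sketch in Python; only the array names come from the story, and the contents are invented:

```python
# Parallel arrays, as in the original module: entry i of TORY belongs with
# entry i of DIREC, and every insert or delete must update both arrays in
# lockstep or they silently drift out of sync.
TORY = ["trunk_group_a", "trunk_group_b"]   # invented contents
DIREC = [1001, 1002]

# The structure the two arrays were really trying to model: a single array
# of pairs, i.e. a "direc-tory".
directory = list(zip(TORY, DIREC))          # [("trunk_group_a", 1001), ...]
```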
The poor guys who had to write the integrity constraints had no user manual to guide them, just an oral tradition plus handwritten notes. Worse, the compiler did no syntax checking, so they often wound up specifying a constraint that was quietly mapped into the wrong logic, sometimes with very bad consequences.
Worse was to come. As I delved into the bowels of the interpreter, I discovered it was built around a giant sort. Feeding the sort was an input process that generated the sort records from the raw data and the integrity constraints. So if you had a table with 5,000 trunk definitions, and each trunk record had three field values that had to be unique across the whole input data set, the input process would create 3 * 5,000 = 15,000 sort input records, each a raw data record prefixed by the integrity constraint number and a copy of the field value to be sorted. Instead of doing three 5,000-record sorts, it did one 15,000-record sort. By the time you factored in hundreds of intra- and inter-table integrity constraints, and some very large tables, you had a combinatorial nightmare.
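The mechanism is easier to see in code. Below is a minimal sketch in Python rather than PL/I; the trunk fields and helper names are invented, but the idea matches the description above: one sort record per (data record, constraint) pair, one big combined sort, then a scan for adjacent duplicate keys.

```python
from itertools import groupby
from operator import itemgetter

def build_sort_input(records, unique_fields):
    """Emit one sort record per (data record, uniqueness constraint) pair:
    (constraint number, field value, original record). With R records and
    C constraints this yields R * C sort records."""
    sort_input = []
    for rec in records:
        for constraint_no, field in enumerate(unique_fields):
            sort_input.append((constraint_no, rec[field], rec))
    return sort_input

def find_duplicate_values(sort_input):
    """One combined sort keyed on (constraint, value); any key that appears
    more than once is a violated uniqueness constraint."""
    sort_input.sort(key=itemgetter(0, 1))
    violations = []
    for key, group in groupby(sort_input, key=itemgetter(0, 1)):
        offenders = [rec for _, _, rec in group]
        if len(offenders) > 1:
            violations.append((key, offenders))
    return violations

# 5,000 trunk records, three fields that must each be unique across the
# data set -> 3 * 5,000 = 15,000 records in a single sort.
trunks = [{"trunk_id": i, "circuit": i % 4999, "label": f"T{i}"} for i in range(5000)]
sort_input = build_sort_input(trunks, ["trunk_id", "circuit", "label"])
print(len(sort_input))                     # 15000
print(find_duplicate_values(sort_input))   # the one duplicated 'circuit' value
```

Folding every constraint into one sort key is exactly why the workload blew up: each additional uniqueness constraint added another full copy of its table to the sort input.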
I was able to refactor a bit, document the language, and add syntax checking with comprehensible error messages, but a few months later I jumped at a transfer to a new group.