I know that Lisp and Scheme programmers usually say that eval
should be avoided unless strictly necessary. I’ve seen the same recommendation for several programming languages.
I like Zak's answer very much, and he gets at the essence of the matter: eval is used when you are writing a new language, a script, or a modification of a language. He doesn't really explain further, so I will give an example:
(eval (read))  ; READ parses one expression from input; EVAL evaluates it
In this simple Lisp program, whatever expression the user types is read and then evaluated. For this to work in a compiled program, the entire set of symbol definitions has to be present, because you have no idea which functions the user might call, so you have to include them all. That means that if you compile this simple program, the resulting binary will be gigantic.
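To make that concrete, here is a sketch of a tiny read-eval-print loop (assuming Common Lisp; the name tiny-repl is just for illustration):

(defun tiny-repl ()
  ;; READ parses one s-expression typed by the user; EVAL evaluates it.
  ;; Any function or variable the user names must exist at run time,
  ;; which is why a compiled version has to carry the whole environment.
  (loop
    (format t "~&> ")
    (finish-output)
    (print (eval (read)))))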
For that reason, you can't really treat this as a compilable statement at all. In general, once you use eval you are operating in an interpreted environment, and the code handed to eval can no longer be compiled ahead of time. If you do not use eval, then you can compile a Lisp or Scheme program much like a C program. Therefore, you want to make sure you actually want and need to be in an interpreted environment before committing to eval.
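Often you don't: if you only need a limited command set rather than the full language, you can dispatch over a fixed set of cases instead of calling eval. The sketch below is hypothetical (the command names add and show are made up), but because the compiler can see every operation that might run, the program compiles like any other:

(defun run-command (form)
  ;; Dispatch on a fixed, known set of commands instead of calling EVAL.
  (case (first form)
    (add  (+ (second form) (third form)))
    (show (print (second form)))
    (t    (error "Unknown command: ~S" form))))

Calling (run-command '(add 1 2)) returns 3 without ever invoking the evaluator.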