Side effects are a necessary evil, and one should seek to minimize/localize them.
Other comments on the thread say effect-free programming is sometimes less intuitive, but what people consider "intuitive" is largely a product of their prior experience, and most people's experience has a heavy imperative bias. Mainstream tools are steadily becoming more functional because people are discovering that effect-free programming leads to fewer bugs (though admittedly it sometimes introduces a new and different class of bugs), since it reduces the opportunities for separate components to interact through shared effects.
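To make that concrete, here's a minimal, hypothetical sketch (the names are mine, not anything from the thread): two effectful components coupled through a shared mutable list, next to an effect-free version where components can only interact through what they explicitly pass around.

```python
# Effectful: two components share a mutable list, so calling one
# silently changes what the other sees.
shared_log = []

def record(event):
    shared_log.append(event)      # hidden effect on shared state

def summarize():
    return len(shared_log)        # result depends on who called record() and when

# Effect-free: inputs in, new values out; no hidden channel between components.
def record_pure(log, event):
    return log + [event]          # returns a new list, original untouched

def summarize_pure(log):
    return len(log)

log = record_pure([], "login")
log = record_pure(log, "logout")
print(summarize_pure(log))        # 2, and nothing else could have changed it
```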
Almost no one has mentioned performance. Effect-free programming usually performs worse than effectful programming, since computers are von Neumann machines designed to work well with effects rather than with lambdas. Now that we're in the midst of the multi-core revolution, this may change the game: people are discovering they need to exploit multiple cores to gain performance, and whereas effectful parallelization sometimes takes a rocket scientist to get right, it can be easy to get right when you're effect-free.
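A small sketch of that last point, using Python's standard concurrent.futures (the function itself is just an illustration): because the function below has no effects and touches no shared state, spreading it across cores is essentially a one-line change and needs no locks or coordination.

```python
from concurrent.futures import ProcessPoolExecutor

def slow_square(n):
    # Pure: depends only on its argument and touches no shared state,
    # so any number of worker processes can run it at once safely.
    return n * n

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(slow_square, range(10)))
    print(results)  # [0, 1, 4, ..., 81]
```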