The \"classic\" approach to web development has been for some time a thin client and a thick server: the server generates HTML and spits it out for the browser to render only. B
Your assertion that web developers can now "pretty much assume their Javascript code is working" is a tough one to agree with. In my experience Javascript is almost always a black hole, sucking up all the time and energy you can supply. Frameworks like Prototype and Script.aculo.us have made things MUCH better, but they are not yet as hardened as your question assumes.
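To give a sense of what "MUCH better" means in practice, here is a rough sketch (the URL and element id are made up for illustration) of the same Ajax call written against the raw browser API and then with Prototype's Ajax.Request:

    // Raw XMLHttpRequest: you branch for older IE yourself and wire up the state machine
    var xhr;
    if (window.XMLHttpRequest) {
        xhr = new XMLHttpRequest();
    } else {
        xhr = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4 && xhr.status == 200) {
            document.getElementById("status").innerHTML = xhr.responseText;
        }
    };
    xhr.open("GET", "/orders/recent", true);
    xhr.send(null);

    // The same call with Prototype, which hides the browser differences
    new Ajax.Request("/orders/recent", {
        method: "get",
        onSuccess: function(response) {
            $("status").update(response.responseText);
        }
    });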
The two main issues are browser support and development time. You are relying on an application you cannot control to handle the bulk of your app's workload, and the fact that even a minor browser update can break it would concern me. Generating HTML server-side mitigates that risk to a large extent. Development of a rich Javascript front-end is also time consuming, difficult to debug, and equally difficult to test across the wide array of available browsers.
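To make the browser-support point concrete, this is the kind of defensive branching you end up writing (or leaning on a framework to write for you) just to attach an event handler everywhere; the helper name here is my own:

    // Object detection: never assume the browser exposes the API you want
    function addHandler(element, eventName, handler) {
        if (element.addEventListener) {            // W3C event model
            element.addEventListener(eventName, handler, false);
        } else if (element.attachEvent) {          // older IE
            element.attachEvent("on" + eventName, handler);
        } else {
            element["on" + eventName] = handler;   // last-ditch fallback
        }
    }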
While those concerns are real, the fact that you can achieve some fantastic user experiences via client-side Javascript cannot be ignored. The frameworks I mentioned earlier expose functionality that was not even dreamed of a year or two ago, and as a result the up-front development price is in many cases worth paying (a price that shrinks significantly when the frameworks are used effectively).
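For example, with Script.aculo.us loaded, effects and drag-and-drop that used to take pages of hand-rolled DOM code come down to one-liners (the element ids below are invented):

    // Attention cue, fade-out, and drag-to-reorder, one call each
    new Effect.Highlight("new-message");
    new Effect.Fade("old-notice", { duration: 1.5 });
    Sortable.create("task-list");    // expects a UL/OL of LI elements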
I think there are applications for a Javascript-powered UI, as long as the decision to go that route is well thought out. We would not be discussing this on SO were it not for the fact that the UI potential of this strategy is awesome. Web-based applications using web-based data are perfect candidates (RSS, REST services). Applications hitting a relational database or complex web services repeatedly are, by necessity, going to maintain a tighter coupling with the server side.
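A rough sketch of the kind of candidate I mean (the URL, element id and field names are invented): pull JSON from a REST service and render it entirely on the client with Prototype:

    // Fetch a JSON feed and build the markup in the browser
    new Ajax.Request("/feeds/headlines.json", {
        method: "get",
        onSuccess: function(response) {
            var items = response.responseText.evalJSON();
            var html = items.map(function(item) {
                return "<li>" + item.title.escapeHTML() + "</li>";
            }).join("");
            $("headlines").update("<ul>" + html + "</ul>");
        }
    });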
My 2 cents.