Why are HTML/JavaScript/CSS not becoming compiled languages (or maybe even merging into a single compiled language)? What if browsers were running a "Browser Virtual Machine" and pages were delivered to it as standardized, compiled code?
Ah, but JavaScript IS becoming a compiled language. Check out Firefox 3.5 with TraceMonkey. It's insanely fast compared to, um, you-know-who's browser. It's true that JS will never be C, but it's a much more dynamic language than C is, and in many ways that makes it more expressive and powerful.
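As a small, hypothetical illustration of that last point, this is the kind of runtime dynamism JavaScript offers that C simply doesn't: objects are open for extension while the program runs, and functions are ordinary values.

    // Objects are open: properties and methods can be attached at runtime.
    const point = { x: 3, y: 4 };
    point.length = function () {
      return Math.sqrt(this.x * this.x + this.y * this.y);
    };
    console.log(point.length()); // 5

    // Functions are first-class values, so behavior can be composed on the fly.
    const twice = (f) => (v) => f(f(v));
    const addOne = (v) => v + 1;
    console.log(twice(addOne)(10)); // 12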
As far as HTML goes, I don't think the lack of validity of HTML is a huge detriment to speed. I think the engines that put together the visual representation and manipulate the DOM need to get a lot better (um, IE, I'm looking in your general direction...). CSS compliance needs to get better, and CSS itself needs to get more powerful. (Get on the bus with CSS 3, people!)
But I do think that speed is going to get better in Firefox and Chrome to such an extent that people really ARE going to start using the web for mainstream application development. It's funny: Adobe seems to be selling Flash as its platform for dynamic web content, MSFT is selling Silverlight for dynamic web content, and Google just wants to really improve HTML and JavaScript to display dynamic web content. And Google's doing pretty well at it so far, I must say...
I think your idea is sound; however, there's still no way to enforce a standard. So if a page used a feature a browser didn't support, there's a good chance the entire page would simply display nothing at all. In the current setup, the critical information can still get through.
Your ideas have validity when they are applied to JavaScript. As others have noted, to one degree or another several vendors are trying to apply those principles to JS even now. Another big step in this area will likely be the Chrome OS Google has announced. However, when it comes to (X)HTML and CSS I think your ideas may be missing the point.
The World Wide Web is not a buggy and inconsistent application platform but a massive and unprecedented collection of interconnected documents. The power of the web lies in abstracting the data from the often rigid (and breakable) visual layouts and from the increasingly complex in-page functionality, which is largely provided via JavaScript. Encoding these pages in (X)HTML is ideal for making them accessible to the widest possible audience, both in terms of browsers and in terms of the technical knowledge required to author a page.
More and more the web is being used as an application platform - which is a powerful and exciting use of this technology - but we cannot lose sight of the fact that these Ajax-driven "web 2.0" apps are merely documents with extended functionality. Compilation doesn't make sense for a document and compression is already happening (via gzip and the like).
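To make the compression point concrete, here is a minimal, hypothetical sketch (Node.js, using the built-in http and zlib modules) of gzipping a document on the way out; in practice the web server or a proxy usually does this for you.

    // Serve a page gzip-compressed when the client advertises support for it.
    // Sketch only: real deployments usually enable this in Apache/nginx/IIS
    // or at a CDN rather than in application code.
    const http = require('http');
    const zlib = require('zlib');

    const page = '<!DOCTYPE html><html><body><h1>Hello</h1></body></html>';

    http.createServer((req, res) => {
      const acceptsGzip = /\bgzip\b/.test(req.headers['accept-encoding'] || '');
      if (acceptsGzip) {
        res.writeHead(200, { 'Content-Type': 'text/html', 'Content-Encoding': 'gzip' });
        res.end(zlib.gzipSync(page));
      } else {
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end(page);
      }
    }).listen(8080);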
On a more practical note, the W3C moves at a glacial pace, and browser vendors alternate between jumping the gun on experimental features in unfinished specs and taking their sweet time supporting other specs that have been on the table and in common usage for years. The whole process is like herding cats. I wouldn't hold my breath waiting for them to make the kind of radical changes you're proposing any time soon.
The V8 JavaScript engine (also embedded in Google Chrome, but it's open-source and liberally licensed, so you're welcome to use it in the next browser you write!) does compile JavaScript to native machine code -- of course, it does it "just in time" (like most modern compilers -- Java, C#, etc.!), not "ahead of time" (like Fortran did in 1954, when computers were just too weak to handle compilation in the midst of execution). I'd be surprised if other good JS engines, like those in the very latest Firefox and Safari, didn't do the same.
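A very rough, hypothetical way to glimpse this from plain JavaScript is to watch a hot function speed up after the engine has had a chance to optimize it (illustrative only; the numbers depend entirely on the engine and version).

    // Illustrative only, NOT a rigorous benchmark: in a JIT engine, the same
    // function is often faster on later runs, once the engine has profiled it
    // and generated optimized machine code for it.
    function sum(n) {
      let total = 0;
      for (let i = 0; i < n; i++) total += i;
      return total;
    }

    for (let round = 1; round <= 5; round++) {
      const start = Date.now();
      sum(10000000);
      console.log('round ' + round + ': ' + (Date.now() - start) + ' ms');
    }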
Looks like you're not advocating "JavaScript as a compiled language" (since it obviously already IS compiled, if you're using a good JS engine), but rather "ahead-of-time" compilation for it (just when most modern languages are essentially abandoning ahead-of-time compilation). Pushing machine code rather than compilable source down the wire sounds like a mostly horrible idea (much larger size, difficulties in supporting one CPU vs. another, security nightmares in properly sandboxing it, etc., etc.) with not much in the way of compensating benefits.
That said, if you're really keen on pushing machine code to the client, try out Native Client (as long as the client is an x86 machine -- forget every smartphone on the planet, many netbooks, good old PowerPC Macs, etc.) -- at least it promises a fix for the security nightmares. If and when you're happy with Native Client, transforming a just-in-time compiler into an ahead-of-time one is a far easier technical challenge (if you want to keep using JavaScript for the sources rather than other languages, of course).
See here for a previous discussion on the matter.
Not all of the reasons given there are necessarily valid, but one important one is that, unless you're Google, server-side CPU cycles are a lot more valuable than client-side cycles: it's cheaper to have the client compile and optimize the HTML/JavaScript, which is quite often generated dynamically per request, than to have the server do it.
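A hypothetical sketch (Node.js, names invented for illustration) of why server-side ahead-of-time compilation would be a poor trade: the markup and script are typically assembled fresh for each request, so the server would be paying the compilation cost over and over.

    // Each response is generated on the fly, so there is no single artifact
    // the server could compile once and reuse; compiling per request would
    // burn the very server cycles the operator wants to conserve.
    const http = require('http');

    http.createServer((req, res) => {
      const page =
        '<!DOCTYPE html><html><body>' +
        '<h1>Hello</h1>' +
        '<p>Generated at ' + new Date().toISOString() + '</p>' +
        '<script>console.log("page built at ' + Date.now() + '");</script>' +
        '</body></html>';
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end(page); // different output for every request
    }).listen(8080);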
Ken
"Speed."
You're assuming that it takes significant time to parse HTML. However, it may be that that time is insignificant compared to the time required for something else, e.g. the time required to lay out the text in the end-user's window.
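If you want to sanity-check that assumption, here is a rough, hypothetical in-browser experiment that times parsing separately from insertion and layout (the exact numbers will vary by browser and document):

    // Run in a browser console. Times are illustrative only.
    const chunk = '<div><p>Lorem ipsum dolor sit amet.</p></div>';
    const html = chunk.repeat(2000);

    // 1. Parsing alone.
    let t0 = performance.now();
    const parsed = new DOMParser().parseFromString(html, 'text/html');
    const parseMs = performance.now() - t0;

    // 2. Insertion into the live page (which re-parses) plus a forced layout.
    t0 = performance.now();
    const container = document.createElement('div');
    container.innerHTML = html;
    document.body.appendChild(container);
    void container.offsetHeight; // forces the browser to lay the content out
    const layoutMs = performance.now() - t0;

    console.log('parse: ' + parseMs.toFixed(1) + ' ms, insert + layout: ' +
                layoutMs.toFixed(1) + ' ms');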
"No more 'loose' and 'half-correct' HTML. It is either correct or won't compile."
You already get that, using [X]HTML.
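For example (hypothetical sketch, run in a browser console): parsing markup the strict XML way, as an application/xhtml+xml page is parsed, is all-or-nothing; one missing close tag and you get an error document instead of a page.

    // Well-formed XHTML parses into a normal document...
    const ok = new DOMParser().parseFromString(
      '<p xmlns="http://www.w3.org/1999/xhtml">hello</p>',
      'application/xhtml+xml');
    console.log(ok.documentElement.localName); // "p"

    // ...but a single unclosed tag makes the whole document fail to parse.
    const broken = new DOMParser().parseFromString(
      '<p xmlns="http://www.w3.org/1999/xhtml">hello',
      'application/xhtml+xml');
    // Browsers report the failure as a "parsererror" document (details vary).
    console.log(broken.getElementsByTagName('parsererror').length > 0); // true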
"Looks the same in every (supported) browser."
You seem to be saying that there should only be one browser, or that all browsers would support it equally.
Internet standards don't happen by having a single body (the W3C) implement something and declare it a standard. Instead, internet standards happen by having multiple independent bodies create multiple implementations. A consequence is that the implementations will inevitably differ, so no format, compiled or not, can guarantee identical behavior in every browser.