It will probably help to consider what the world was like twenty years ago.
Back then, it wasn't as expensive to design and build world-class CPUs, and so many more companies had their own. What has happened since is largely explained by the rising cost of CPU design and fabs, which meant that architectures selling in very large quantities survived a lot better than those that didn't.
There were mainframes, mostly from IBM, which specialized in high throughput and reliability. You wouldn't do anything fancy with them, since it was much more cost-effective to use lower-cost machines, but they were, and are, great for high-volume business transactions of the sort programmed in COBOL. Banks use a lot of them. These are specialized systems, and they run programs from way back, so compatibility with the early IBM 360s, in architecture and OS, is much more important to them than compatibility with x86.
Back then, there were also minicomputers, which were smaller than mainframes, generally easier to use, and larger than anything personal. These had their own CPUs and their own, often idiosyncratic, operating systems. I believe they were already dying at the time, and they're mostly dead now. The premier minicomputer company, Digital Equipment Corporation, was eventually bought by Compaq, a PC maker.
There were also workstations, which were primarily intended as personal computers for people who needed a lot of computational power. Their CPUs were generally much more cleanly designed than Intel's, which at the time meant they could run a lot faster. Another form of workstation was the Lisp machine, available at least in the late 80s from Symbolics and Texas Instruments, with CPUs designed to run Lisp efficiently. Some of these architectures remain, but as time went on it became much less cost-effective to keep them up to date. With the exception of the Lisp machines, these workstations tended to run versions of Unix.
The standard IBM-compatible personal computer of the time wasn't all that powerful, and the complexity of the Intel architecture held it back considerably. This has changed. The Macintoshes of the time ran on Motorola's 680x0 architecture, which offered significantly more computational power. Later, they moved to PowerPC, an architecture derived from the POWER design IBM had pioneered in its workstations.
Embedded CPUs, as we know them now, date from the late 1970s. They were characterized by being complete low-end systems with a low chip count, preferably using little power. The Intel 8080, when it came out, was essentially a three-chip CPU, and required additional chips for ROM and RAM. The 8048 (and its ROM-less sibling, the 8035) put the CPU, ROM, and RAM on a single chip; it was correspondingly less powerful, but suitable for a great many applications.
Supercomputers had hand-designed CPUs, and were notable both for making parallel computing as easy as possible and for CPUs optimized for (mostly) floating-point multiplication.
Since then, mainframes have stayed in their niche, very successfully, while minicomputers and workstations have been squeezed badly. Some workstation CPUs stay around, partly for historical reasons. Macintoshes eventually moved from PowerPC to Intel, although IIRC the PowerPC lives on in the Xbox 360 and some IBM machines. The expense of keeping a good OS up to date grew, and modern non-mainframe systems tend to run either Microsoft Windows or Linux.
Embedded computers have also gotten better. There are still small, cheap chips, but the ARM architecture has become increasingly important. It was in some early netbooks, and it's in the iPhone, iPad, and many comparable devices. It has the virtue of being reasonably powerful with low power consumption, which makes it very well suited to portable devices.
The other sort of CPU you'll run into on common systems is the GPU, which is designed to do high-speed, specialized parallel processing. There are software platforms, such as CUDA and OpenCL, that let you program them for other things while taking advantage of their strengths.
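To give a flavor of what that kind of programming looks like, here's a minimal sketch in CUDA (one of the platforms just mentioned) that adds two arrays by handing each GPU thread a single element. The array size and variable names are purely illustrative, and real code would add error checking.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Each GPU thread handles one array element -- the "specialized
    // parallel processing" above: many simple operations run at once.
    __global__ void add(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;                // illustrative size: ~1M floats
        size_t bytes = n * sizeof(float);
        float *a, *b, *c;
        cudaMallocManaged(&a, bytes);         // unified memory keeps the sketch short
        cudaMallocManaged(&b, bytes);
        cudaMallocManaged(&c, bytes);
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        add<<<blocks, threads>>>(a, b, c, n); // launch n parallel additions
        cudaDeviceSynchronize();

        printf("c[0] = %f\n", c[0]);          // expect 3.0
        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }

The point isn't the arithmetic; it's that the work is expressed as a huge number of tiny independent tasks, which is exactly what a GPU is built to chew through.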
The difference between desktop and server versions of operating systems is no longer fundamental. Usually, both will have the same underlying OS, but the interface level will be far different. A desktop or laptop is designed to be easily usable by one user, while a server needs to be administered by one person who's also administering a whole lot of other servers.
I'll take a stab at mixed core, but I might not be accurate (corrections welcome). The Sony PlayStation 3 has an unusual processor, the Cell, with a general-purpose core plus several smaller cores specialized for number crunching. Theoretically, this is very efficient. More practically, it's very hard to program a mixed-core system, and they're rather specialized. I don't think this concept has a particularly bright future, but it's doing nice things for Sony sales in the present.