Recently I heard Steve Jobs cite this as a main reason not to have Flash on iPads.
In the general case, Flash isn't hard on the CPU; Flash content is hard on the CPU. People don't usually use Flash to display static text and bitmaps; they use it for vector animations, video, and RIAs with custom-skinned components, and compositing all those vectors and gradients and alpha channels takes CPU - regardless of whether you use Flash or HTML5 or Silverlight or JavaFX or whatever. Feel free to go check out the demos at this animation comparison and see how the CPU usage of the HTML5 version stacks up against the Flash version. Results vary a lot by OS and browser, but for me (WinXP/Firefox), the Flash version uses roughly the same CPU as the canvas version (~50%) while giving a little more than double the FPS.
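If you want a feel for what that kind of test looks like on the canvas side, here's a rough sketch of the sort of thing the comparison runs: lots of gradient-filled, semi-transparent shapes redrawn every frame, with a crude FPS counter. This isn't the actual demo; the shape count, sizes, and element id are my own placeholders, and you'd watch CPU in the OS task manager while it runs.

```typescript
// Rough sketch of a canvas compositing stress test (not the linked demo).
// The element id "stage" and the SHAPES constant are assumptions.
const canvas = document.getElementById("stage") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;
const SHAPES = 200; // arbitrary load; raise it to stress the CPU harder

let frames = 0;
let lastReport = performance.now();

function drawFrame(now: number): void {
  ctx.clearRect(0, 0, canvas.width, canvas.height);

  for (let i = 0; i < SHAPES; i++) {
    // Each shape is a moving radial gradient with an alpha channel --
    // exactly the kind of per-pixel compositing work that costs CPU
    // whether the runtime is Flash or canvas.
    const x = (Math.sin(now / 1000 + i) * 0.5 + 0.5) * canvas.width;
    const y = (Math.cos(now / 1300 + i) * 0.5 + 0.5) * canvas.height;
    const r = 20 + (i % 30);
    const g = ctx.createRadialGradient(x, y, 0, x, y, r);
    g.addColorStop(0, "rgba(255, 128, 0, 0.8)");
    g.addColorStop(1, "rgba(255, 128, 0, 0)");
    ctx.fillStyle = g;
    ctx.beginPath();
    ctx.arc(x, y, r, 0, Math.PI * 2);
    ctx.fill();
  }

  // Report frames per second roughly once a second; read CPU usage from
  // the task manager at the same time to get both halves of the comparison.
  frames++;
  if (now - lastReport >= 1000) {
    console.log(`FPS: ${frames}`);
    frames = 0;
    lastReport = now;
  }
  requestAnimationFrame(drawFrame);
}

requestAnimationFrame(drawFrame);
```

The point of the sketch is just that the expensive part is the per-frame compositing loop; swap the rendering backend and that work doesn't go away.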
The second answer concerns video specifically, and it comes down to this: no matter what technology you use to display video, only two things really affect CPU usage: the codec, and whether or not decoding is hardware accelerated. You can test H.264 video across browsers, OSs, and Flash/HTML5 (and people have done so), and what you find is that CPU usage is low when the video is hardware accelerated and high when it's not, both inside Flash and out. So the issue isn't whether Flash video is hard on the CPU; the only question is whether Flash video is using hardware acceleration or not. Check the link for full details, but basically, unless you're on OS X/Safari, Flash video uses the same CPU as the other options or less.
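If you'd rather sanity-check playback from script instead of only eyeballing the task manager, dropped frames are a reasonable proxy for a decoder that isn't keeping up. This is just a sketch: getVideoPlaybackQuality() is a newer browser API than the setups discussed here, it only covers the HTML5 side (not the Flash player), and the element id is made up.

```typescript
// Sketch: watch dropped frames on an HTML5 <video> element as a rough
// proxy for whether decoding keeps up. The element id "player" is an
// assumption; actual CPU usage still comes from the OS task manager.
const video = document.getElementById("player") as HTMLVideoElement;

setInterval(() => {
  const q = video.getVideoPlaybackQuality();
  const dropRate = q.totalVideoFrames > 0
    ? q.droppedVideoFrames / q.totalVideoFrames
    : 0;
  // With hardware-accelerated H.264 decoding this tends to stay near zero;
  // a software decode path on a slow machine starts shedding frames.
  console.log(
    `decoded: ${q.totalVideoFrames}, dropped: ${q.droppedVideoFrames} ` +
    `(${(dropRate * 100).toFixed(1)}%)`
  );
}, 1000);
```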
Incidentally, if you were also wondering why Flash has historically used more CPU on Macs than on PCs, even for non-video content, see here for a lot of details - both on why it's been worse in the past and why it's improving with Flash 10.1. The quick version is that Apple has added newer and better ways for plugins to draw into the browser.