Nota bene: I realize this is an immensely complicated question with about a million levels of nuance that I'm trying to reduce to a single number...
Compression Ratio Rules of Thumb
Compression ratios to maintain excellent quality:
– 10:1 for general images using JPEG
– 30:1 for general video using H.263 and MPEG-2
– 50:1 for general video using H.264 / MPEG-4 AVC
from http://www.kanecomputing.co.uk/pdfs/compression_ratio_rules_of_thumb.pdf
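If it helps, here is a rough sketch that turns those ratios into a target bitrate. The baseline raw bitrate assumes an uncompressed 8-bit 4:2:0 source (12 bits per pixel), and the function and dictionary names are just mine; the 10:1 JPEG figure is left out because it applies to still images:

```python
# Rough sketch: turning the rule-of-thumb ratios above into a target video
# bitrate. The raw baseline assumes uncompressed 8-bit 4:2:0 (12 bits/pixel);
# the ratios themselves are only ballpark figures.
RULE_OF_THUMB = {"mpeg2": 30, "h264": 50}   # JPEG's 10:1 is for still images

def target_bitrate_bps(width, height, fps, codec="h264", bits_per_pixel=12):
    raw_bps = width * height * fps * bits_per_pixel
    return raw_bps / RULE_OF_THUMB[codec]

# e.g. a 640x360 @ 24 fps source with H.264's ~50:1 ratio
print(f"{target_bitrate_bps(640, 360, 24) / 1000:.0f} kbit/s")   # ~1327 kbit/s
```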
Don't forget that for normal playback MPEG-family formats only use YUV 4:2:0. At 8-bit color depth that works out to 12 bits per pixel on average (48 bits for each group of 4 pixels). Only RAW files from cameras costing a fortune use 16-bit depth; even mid-to-high-end film DVRs only provide 12-14 bits, and nobody uses H.264 to store RAW anyway. H.264 is designed for the end product.
In 640x360/24p YUV 4:2:0 the raw bitrate works out to:
640 x 360 x 24 x (8+2+2)/8 = 8.3 MB/s (about 66 Mbit/s)
At 500 kbit/s that is a compression ratio of roughly 133:1. It's normal.
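The same arithmetic as a tiny snippet, again assuming 8-bit YUV 4:2:0:

```python
# Sanity check of the numbers above, assuming 8-bit YUV 4:2:0
# (12 bits per pixel on average).
width, height, fps = 640, 360, 24
bits_per_pixel = 8 + 2 + 2             # Y + U + V, averaged per pixel in 4:2:0

raw_bps = width * height * fps * bits_per_pixel
target_bps = 500_000                    # the ~500 kbit/s target from the question

print(f"raw: {raw_bps / 1e6:.1f} Mbit/s ({raw_bps / 8 / 1e6:.1f} MB/s)")
print(f"compression ratio at 500 kbit/s: {raw_bps / target_bps:.0f}:1")
```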
For an explanation of YUV 4:2:0, read:
http://en.wikipedia.org/wiki/Chroma_subsampling
Just sharing my knowledge of encoding in an H.264 environment.
As for the ratio, 450-512 kbit/s works best if you use H.264 High 5.0 or High 7.0. To get a good ratio while keeping the best quality, the key is really to play with the resolution: scale the output to about 3/4 of the native (raw) video's resolution.
H.264 tends to lose more detail if you don't scale the frame down to a somewhat smaller resolution first.
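A small sketch of that rule of thumb (the helper name is mine; dimensions are rounded down to even numbers so 4:2:0 chroma subsampling stays happy):

```python
# "Scale to ~3/4 of the native resolution" as a tiny helper.
# Width and height are rounded down to even values, since 4:2:0
# chroma subsampling needs even dimensions.
def three_quarter_resolution(native_w, native_h):
    w = int(native_w * 3 / 4) // 2 * 2
    h = int(native_h * 3 / 4) // 2 * 2
    return w, h

print(three_quarter_resolution(1280, 720))   # (960, 540)
print(three_quarter_resolution(640, 360))    # (480, 270)
```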
In a quite interesting document called the H.264 Primer, a simple formula is given as a hint for computing the 'ideal' output file bitrate, based on the video's characteristics:
[image width] x [image height] x [framerate] x [motion rank] x 0.07 = [desired bitrate]
where the image width and height are expressed in pixels, and the motion rank is an integer between 1 and 4, 1 being low motion, 2 being medium motion, and 4 being high motion (motion being the amount of image data that changes between frames; see the linked document for more information).
So for instance, if we take a 1280x720 video at 24 FPS, with medium motion (movie with slow camera movements, not many scene changes...), the expected ideal bitrate would be:
1280 x 720 x 24 x 2 x 0.07 = 3,096,576 bps => approximately 3000 kbps
This is purely a hint, and in my opinion, the only way to accurately find the ideal bitrate is trial and error :)
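For convenience, here is the primer's formula wrapped in a small function. The 640x360 line is just the same formula applied to the resolution from the question, not a figure from the primer:

```python
# The primer's bitrate formula ("Kush gauge") as a small function.
def kush_gauge_bps(width, height, fps, motion_rank):
    """motion_rank: 1 = low motion, 2 = medium, 4 = high."""
    return width * height * fps * motion_rank * 0.07

# The 720p medium-motion example from above:
bps = kush_gauge_bps(1280, 720, 24, motion_rank=2)
print(f"{bps:,.0f} bps (~{bps / 1000:.0f} kbps)")            # 3,096,576 bps

# And the question's 640x360 @ 24 fps case, medium motion:
print(f"{kush_gauge_bps(640, 360, 24, 2) / 1000:.0f} kbps")  # ~774 kbps
```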
It will vary dramatically depending on the content of the source videos. I'll get to that in a bit.
640x360 is not that large. 512kbps is very reasonable and arguably standard. Maybe 768kbps if you are really interested in quality.
How is this possible? A simplified answer: a couple of techniques and facts about video compression make this possible. Within a single frame, neighbouring pixels are highly similar, so each frame compresses well on its own (much like a JPEG); and between frames very little usually changes, so the encoder stores an occasional full key frame and, for the frames in between, only the differences.
It is all wildly more complex than that, with zillions of different approaches, techniques and algorithms within specific CODECs and between CODECs to make this happen.
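To make the "only the changes are stored" idea concrete, here is a toy illustration (nothing like the real H.264 toolbox): a detailed frame barely shrinks under lossless compression, while the frame-to-frame difference collapses to almost nothing:

```python
# Toy illustration of temporal redundancy: the full frame (random texture
# standing in for detail) hardly compresses losslessly, but the difference
# between two consecutive frames is mostly zeros and compresses enormously.
import zlib
import numpy as np

rng = np.random.default_rng(0)

# "Frame 1": a busy 640x360 grayscale image
frame1 = rng.integers(0, 256, size=(360, 640), dtype=np.uint8)

# "Frame 2": identical except for a small 20x20 patch that changed
frame2 = frame1.copy()
frame2[100:120, 200:220] = rng.integers(0, 256, size=(20, 20), dtype=np.uint8)

delta = frame2.astype(np.int16) - frame1.astype(np.int16)

print("whole frame:", len(zlib.compress(frame2.tobytes())), "bytes compressed")
print("frame delta:", len(zlib.compress(delta.tobytes())), "bytes compressed")
```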
So, back to the "It will vary dramatically depending on the content of the source videos" comment: the compression ratio you'll see, and the resulting quality, will depend significantly on how much of the image actually changes from frame to frame and how much fine detail and noise each frame contains.
Example: A video of a door in a room (like a security camera) with one key frame every ten minutes is going to have an amazingly high compression ratio. My back-of-the-napkin calculations put that scenario at 15,000:1 compression.
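Just to show how a number in that ballpark can come about, here is one way the napkin math might go; the keyframe and delta-frame sizes are my own guesses, not figures from the original calculation:

```python
# One possible back-of-the-napkin calculation for the static "door" scene.
# The keyframe and delta-frame sizes below are guesses; the point is only
# that a static scene with sparse keyframes lands around this ballpark.
width, height, fps, bits_per_pixel = 640, 360, 24, 12
seconds = 10 * 60                       # one keyframe per ten-minute stretch

raw_bytes = width * height * fps * bits_per_pixel / 8 * seconds

keyframe_bytes = 40_000                 # one full intra-coded frame
delta_bytes = 20                        # "nothing changed" frames are tiny
encoded_bytes = keyframe_bytes + (fps * seconds - 1) * delta_bytes

print(f"compression ratio: about {raw_bytes / encoded_bytes:,.0f}:1")  # ~15,000:1
```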
Since you are starting on a large video encoding project, I would recommend encoding a representative sample of your source videos at a few different bitrates and comparing the results before settling on a number.
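One way to do that, assuming ffmpeg with libx264 is available (the filenames here are placeholders):

```python
# Sketch: encode a representative sample clip at a few bitrates with
# ffmpeg/libx264, then compare file size and visual quality by eye.
import subprocess

SAMPLE = "sample.mov"   # placeholder for one of your real source clips

for kbps in (384, 512, 768, 1024):
    out = f"sample_{kbps}k.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", SAMPLE,
         "-c:v", "libx264", "-b:v", f"{kbps}k",
         "-an",              # drop audio so only the video bitrate is compared
         out],
        check=True,
    )
    print("wrote", out)
```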
Changing the parameters of the encoder to make the videos smaller can have other impacts too, for example on encoding time, the CPU needed for smooth playback, and compatibility with older devices and players.
It's a big complicated subject. Good luck. My experienced "thumb-to-the-wind" test says you'll be more than happy with 512-768kbps for your project.