H.264 on nothing but G5s

A faster GPU doesn't help with DVD MPEG-2 encoding, does it?
It would depend on the card. A hardware encoder, whether a daughter card or a separate chip on the graphics card, should be able to encode/decode (hence "codec") on the fly and pass the data back to the CPU just as easily as it can throw it on your monitor. I read a story about a year and a half ago about a university that installed a bunch of graphics cards in a single computer (or possibly in a network of similar computers) and used them in parallel to solve complicated mathematical problems. They did it because graphics cards are optimized for floating-point calculations and easily outstrip even top-level CPUs at them.

Edit: I found the original story I recalled above. The PDF of the study is here. The authors note a few shortcomings, the most notable being the limited memory available on graphics cards. Very neat idea. Very hackerish. Dare I say it, even MacGyverish.
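For the curious, here's a minimal sketch of the kind of data-parallel floating-point work those researchers were farming out to graphics hardware: one tiny thread per array element, thousands running at once. I've written it in modern CUDA purely to illustrate the idea; the study itself went through the graphics pipeline, not an API like this.

Code:
#include <cstdio>
#include <cuda_runtime.h>

// One thread per element: y[i] = a*x[i] + y[i]. The card runs
// thousands of these simultaneously, which is why graphics
// hardware outruns a CPU on this kind of math.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    // 1M floats -- the whole working set has to fit in the card's
    // memory, which is exactly the limitation the authors noted.
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y); // 256 threads per block
    cudaDeviceSynchronize();

    printf("y[0] = %f (expect 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}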

On the issue of multi-processor PCs and video encoding... I'd love to see whether the QuickTime team (or anyone else) has done anything to take advantage of that. You need a program that has been specifically engineered to use the extra CPUs. It is a rare program indeed that splits a single task (such as "encode video") across multiple processors; if a program supports multiple CPUs at all, it usually throws its most intensive thread at one processor and does everything else it needs to do on the other. I'm sure that having easy access to Pixar staff would help make an encoding program that truly smokes, but I honestly doubt QuickTime will break a sweat with its encoding products. I hope I'm wrong.
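To be concrete about what "specifically engineered" means, here's a rough host-side sketch of the chunked approach such an encoder would take: cut the source into per-processor chunks (a real encoder would align them on keyframes), encode each chunk on its own processor, and stitch the results back in order. The encode_segment call is hypothetical, a stand-in for whatever codec library you'd actually use, and the threading API is modern C++ just for illustration.

Code:
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical stand-in for a real codec call; a real encoder would
// hand each worker a keyframe-aligned slice of the source frames.
void encode_segment(int first_frame, int last_frame) {
    std::printf("encoding frames %d-%d on one CPU\n", first_frame, last_frame);
}

int main() {
    const int total_frames = 9000; // ~5 minutes at 30 fps (assumed)
    const int cpus = 2;            // e.g. a dual-processor G5

    // Split the job into one chunk per processor instead of throwing
    // the whole encode at a single thread.
    std::vector<std::thread> workers;
    const int chunk = total_frames / cpus;
    for (int c = 0; c < cpus; c++) {
        int first = c * chunk;
        int last = (c == cpus - 1) ? total_frames - 1 : first + chunk - 1;
        workers.emplace_back(encode_segment, first, last);
    }
    for (auto &w : workers) w.join(); // then splice the chunks back in order
    return 0;
}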
 
OK, sorry. Maybe I should have been more specific: most consumer/gamer video cards don't do encoding.

Also, Apple seems very much aimed at not putting "special" hardware requirements on authoring, hence its 100% software encoding engine in things like Compressor etc.
 
Pengu said:
OK, sorry. Maybe I should have been more specific: most consumer/gamer video cards don't do encoding.
True, especially of gamer cards. Though I am surprised at the number of purely consumer cards that do; ATI's All-In-Wonder is one that does it out of the box.

The biggest problem with hardware encoders is that you are stuck with whatever codec they implement onboard. You can forget DivX or 3ivX (I am counting the days before the DivX crew is sued out of existence). So if you want a highly compressed encode with minimal quality loss, you're better off with the software tools you've already got, unless you want to re-encode the hardware's output with a better codec, which defeats the purpose of having a hardware encoder in the first place.

I used to work for a Canadian broadcaster, so there were hardware encoders built into everything but the coffee mugs. The entire building was kept in one piece by cables pulsing with MPEG-2. If those streams ever left the building, it was to a cable operator or to a broadcast satellite with identical encoders; it was a closed-shop environment. The limitations of MPEG-2 (fat bitrates and, I think, choppy motion) were simply a fact of life, and no one cared because it was the industry standard. We actually looked at some sort of hardware interface to save the in-house feed to a computer, but gave up: the hardware that could do it was out of our league for our little web site. So we just took an analog signal from the in-house closed-circuit cable system, captured it with consumer cards, did software encodes, and pushed everything into Windows Media encoders. Relatively fast and quite cheap.
 