Andrew Adamson
Got root? Sudoes.
"a faster GPU doesn't help with DVD MPEG2 encoding, does it??"

It would depend on the card. A hardware encoder, whether a daughter card or a separate chip on the graphics card, should be able to encode/decode (hence "codec") on the fly and pass the data back to the CPU just as easily as it can throw it onto your monitor. I read a story about a year and a half ago about a university that installed a bunch of graphics cards in a single computer (or possibly across a network of similar machines) and used them in parallel to solve complicated mathematical problems. They did it because graphics cards are optimized for floating-point calculations and easily outstrip even top-level CPUs.
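(To make the "graphics card as math coprocessor" idea concrete, here's a rough sketch. It's written in CUDA, which didn't exist when that study was done -- they had to go through the ordinary graphics APIs -- so treat it purely as an illustration of the concept, not of their method. All the names in it are made up for the example.)

```cuda
// saxpy.cu -- hypothetical illustration, not from the study mentioned above.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Each GPU thread handles one element: y[i] = a * x[i] + y[i].
// Thousands of these run at once, which is why graphics cards outpace
// CPUs on this kind of regular floating-point work.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        y[i] = a * x[i] + y[i];
    }
}

int main() {
    const int n = 1 << 20;                        // ~1M floats
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f);  // host-side data

    float *dx = nullptr, *dy = nullptr;
    cudaMalloc(&dx, n * sizeof(float));
    cudaMalloc(&dy, n * sizeof(float));

    // Ship the data to the card, run the kernel, copy the answer back --
    // the same round trip a GPU-assisted encoder would make per chunk of video.
    cudaMemcpy(dx, hx.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 3.0f, dx, dy);
    cudaDeviceSynchronize();

    cudaMemcpy(hy.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);
    std::printf("y[0] = %f (expect 5.0)\n", hy[0]);

    cudaFree(dx);
    cudaFree(dy);
    return 0;
}
```

The whole trick is in that round trip: hand the data to the card, let thousands of little threads chew on it in parallel, and copy the result back to the CPU.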
Edit: I found the original story I recalled above. The PDF of the study is here. The authors note a few shortcomings, the most notable being the limited memory available on graphics cards. Very neat idea. Very hackerish. Dare I say it, even MacGyverish.
On the issue of multi-processor PCs and video encoding... I'd love to see whether the QuickTime team (or anyone else) has done anything to take advantage of that. You need a program that has been specifically engineered to use the extra CPUs. It is a rare program indeed that hands the same task (such as "encode video") to multiple processors; if a program supports multiple CPUs at all, it usually throws its most intensive thread at one processor and does everything else it needs to do on the other. I'm sure that having easy access to Pixar staff would help make an encoding program that truly smokes, but I honestly doubt QuickTime will break a sweat with its encoding products. I hope I'm wrong.
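(For the curious, here's roughly what "hand the same task to multiple processors" would look like for an encoder: chop the frames into ranges and give each CPU its own range. This is a purely hypothetical sketch using modern C++ threads -- nothing to do with how QuickTime actually works -- and a real encoder would have to split on keyframe/GOP boundaries rather than arbitrary frame counts.)

```cpp
// split_encode.cpp -- hypothetical sketch of splitting one encode job across CPUs.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for the real per-frame compression work.
static void encode_frame(int frame) {
    // ... MPEG2 (or whatever) encoding for one frame would go here ...
    (void)frame;
}

// Each worker encodes its own contiguous range of frames, so every CPU gets a
// slice of the *same* task instead of one CPU doing all the encoding while the
// others only handle housekeeping threads.
static void encode_range(int first, int last) {
    for (int f = first; f < last; ++f) {
        encode_frame(f);
    }
}

int main() {
    const int total_frames = 12000;
    const unsigned cpus = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> workers;
    const int chunk = (total_frames + cpus - 1) / cpus;
    for (unsigned c = 0; c < cpus; ++c) {
        const int first = c * chunk;
        const int last = std::min(total_frames, first + chunk);
        if (first < last) {
            workers.emplace_back(encode_range, first, last);
        }
    }
    for (auto& w : workers) {
        w.join();
    }
    std::printf("encoded %d frames on %u CPUs\n", total_frames, cpus);
    return 0;
}
```

Even this naive split shows why it's rare: the hard part isn't starting the threads, it's carving the job into independent pieces and stitching the output back together.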