G5 vs AMD Opteron

THAT isn't a gaming test! That is DirectX vs. OpenGL BS, not to mention that the Opteron was actually using a better Radeon card! :p

PLUS, what would happen if both machines ran with 4GB of RAM, or if we compared a Dual 2GHz G5 against a Dual 2.2GHz Opteron?
 
Those benchmarks weren't run under the same conditions. The G5 is better, and even if the two were evenly matched, Apple's software is FAR superior to Microsoft's.
 
Those benchmarks are VERY MUCH run under the same conditions. And the Mac looks good, too! Why don't you read the tests' text before getting all angry and spreading MacMishMush like "We better than you!"?
 
The UT2003 results show that the Opteron is actually faster. In the botmatch, the graphics card doesn't make much of a difference, as the test is essentially CPU-bound. If you're familiar with UT2003, you'll know that the physics and AI take a lot of processing power, which is why scaling back the AI difficulty on a slower machine results in a speed increase. Also look at the graphs comparing a G5 with a Radeon 9600 against a G5 with a Radeon 9800: there really isn't much difference in the botmatch, because the CPU is the same.
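To make the point concrete, here's a toy back-of-the-envelope sketch in C. Every number in it is invented for illustration; the idea is just that the CPU and GPU work in parallel, so frame time is set by whichever is slower, and once per-bot AI/physics cost dominates, swapping in a faster card barely moves the needle:

```c
/* Toy model (nothing to do with Epic's code): why a botmatch is CPU-bound.
 * The GPU renders in parallel with the CPU's AI/physics work, so frame
 * time is roughly max(cpu_ms, render_ms). All numbers are made up. */
#include <stdio.h>

int main(void) {
    double ai_ms_per_bot  = 1.2;  /* assumed AI+physics cost per bot */
    double render_ms_fast = 4.0;  /* assumed frame time, faster card */
    double render_ms_slow = 6.5;  /* assumed frame time, slower card */
    int bots = 16;

    double cpu_ms = bots * ai_ms_per_bot;  /* 19.2 ms on the CPU */
    double frame_fast = cpu_ms > render_ms_fast ? cpu_ms : render_ms_fast;
    double frame_slow = cpu_ms > render_ms_slow ? cpu_ms : render_ms_slow;

    printf("faster card: %.1f fps\n", 1000.0 / frame_fast);
    printf("slower card: %.1f fps\n", 1000.0 / frame_slow);
    return 0;
}
```

With 16 bots the CPU side takes ~19 ms per frame either way, so both "cards" land at the same ~52 fps, which is exactly the 9600-vs-9800 pattern in the graphs.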

But it's interesting to see the G5 doing so well.
 
I find it interesting that the 64-Bit hype is still prevalent even though neither machine is helped one single bit by being 64-bit (yet).

My understanding is that UT2004 is supposed to be 64-bit optimized (or will be). Did anyone else hear that? My understanding is also that games will be one of the biggest (and earliest) beneficiaries of 64-bit computing. Great, another reason for PCs to plow ahead.

I no longer take these reports on their own, but rather balance them against the whole library of tests from all sources. As we know, one source will tweak its tests toward one result, and another toward the opposite. Barefeats, IMO, has shown itself to be slightly Mac-biased, which is fine by me in one sense and just more koolaid in another. I no longer drink the koolaid, but I am happy to see that, on balance, the G5s are holding their own at 2/3 the clock speed. Pretty impressive.
 
mindbend said:
I find it interesting that the 64-Bit hype is still prevalent even though neither machine is helped one single bit by being 64-bit (yet).

My take on this is that the "64-bit" touting gives manufacturers a more impressive-sounding way to say "next-generation processor". It's true that both platforms have very serious improvements, but it's much easier to market bitness than "improved pipelining".

Marketing this way is technically misleading. For 90% of the users who DO buy the platform, the 64-bitness buys very little, if anything. However, I think the overall "impression" of a much more powerful processor that the marketing creates is generally accurate. Personally I've just accepted the inaccurate marketing as mostly harmless and, in the end, creating the impression it's intended to: the processor is "better" than the older processors. (I say "mostly harmless" because all Joe User is going to care about is that the processor is better; he's never going to form any meaningful interpretation of what "bitness" actually means.) I don't think it's really any worse than other recent processor marketing techniques...
 
It's also a way for Apple to show the direction they're heading in. It's the chicken-and-egg problem: if nobody has a 64-bit computer, why should software makers write 64-bit software? Now that Apple is mass-producing and mass-marketing a 64-bit computer, 64-bit software will follow.
 
http://www.firingsquad.com/features/sweeney_interview/default.asp

That's an interview with Tim Sweeney (of Epic Games) on porting Unreal Tournament to 64-bit. Note that the main performance gains come from the additional registers (16 instead of 8) and the on-die memory controller.

64 bits really doesn't have anything to do with that, but as Ripcord said, "64-bit" is easier to sell than "more registers, faster memory controller" :)
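If anyone wants to see the register half of that for themselves, here's a small illustrative C file (call it pressure.c; it's a made-up example, not anything from Epic). x86 gives the compiler 8 general-purpose registers, x86-64 gives it 16, so keep ten sums live at once and the 32-bit build has to spill some of them to the stack:

```c
/* Register-pressure demo. Compare the generated assembly:
 *   gcc -O2 -m32 -S pressure.c   # look for spills to the stack
 *   gcc -O2 -m64 -S pressure.c   # spills should mostly disappear
 */
#include <stdio.h>

long pressure(const long *a, int n) {
    /* Ten accumulators live at once, plus the pointer and counters:
     * more values than the 8 registers a 32-bit x86 build has. */
    long s0 = 0, s1 = 0, s2 = 0, s3 = 0, s4 = 0,
         s5 = 0, s6 = 0, s7 = 0, s8 = 0, s9 = 0;
    for (int i = 0; i + 10 <= n; i += 10) {
        s0 += a[i];     s1 += a[i + 1]; s2 += a[i + 2];
        s3 += a[i + 3]; s4 += a[i + 4]; s5 += a[i + 5];
        s6 += a[i + 6]; s7 += a[i + 7]; s8 += a[i + 8];
        s9 += a[i + 9];
    }
    return s0 + s1 + s2 + s3 + s4 + s5 + s6 + s7 + s8 + s9;
}

int main(void) {
    long data[100];
    for (int i = 0; i < 100; i++) data[i] = i;
    printf("%ld\n", pressure(data, 100));  /* 4950 */
    return 0;
}
```

None of that is "64-bit-ness" per se; it's just that AMD took the opportunity to widen the register file at the same time.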
 
The scary part about the 64-bit thing isn't so much what the respective marketing engines do with it (an obvious and easy mechanism), but that a shocking number of "reviews" tout 64-bit as something currently meaningful. I don't have an example right in front of me, but a month ago I read one right off Apple's website headlines (it might even still be there FAIK) discussing FCP 4. The review claimed that FCP has been optimized for the G5 (true) and especially takes advantage of its 64-bit horsepower (FALSE!). That is just flat out wrong. Some of these reviewers, pundits, "experts" and others need to get their facts straight.

I wouldn't mind some sort of extended truth-in-advertising legislation that covers these technical situations. For example, Apple's and AMD's current ads would have to say "64-Bit* Computing Power. [bottom] *Currently 64-bit doesn't give you jack-diddly in performance, but we're hoping someday to make use of it. We appreciate the reviewers of our products for propagating the myth that 64-bit really means something."
 
I don't know for sure, but what I expect they mean by FCP being updated for 64-bit is probably just that it can use more than 4GB of RAM when editing movies. Having more of the movie in memory should make editing a lot faster.
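For the curious, here's a minimal C sketch of that 4GB ceiling (nothing to do with FCP's actual code; the "frame cache" is just a stand-in). A 32-bit process can't even express an allocation bigger than its 4GB address space, while a 64-bit one can at least ask for it:

```c
/* Address-space ceiling demo: hypothetical 6GB "frame cache". */
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

int main(void) {
    printf("pointer width: %d bits\n", (int)(sizeof(void *) * 8));

    unsigned long long want = 6ULL << 30;  /* 6GB */
    if (want > SIZE_MAX) {
        /* On a 32-bit build SIZE_MAX is ~4.29 billion, so we can't
         * even represent the request, let alone satisfy it. */
        printf("6GB cache impossible in this address space\n");
        return 1;
    }
    void *cache = malloc((size_t)want);
    printf(cache ? "6GB reserved\n" : "allocation refused by the OS\n");
    free(cache);
    return 0;
}
```

(In practice a 32-bit process gets even less than 4GB, since the OS reserves part of the address space for itself.)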
 
mindbend said:
I wouldn't mind some sort of extended truth-in-advertising legislation that covers these technical situations. For example, Apple's and AMD's current ads would have to say "64-Bit* Computing Power. [bottom] *Currently 64-bit doesn't give you jack-diddly in performance, but we're hoping someday to make use of it. We appreciate the reviewers of our products for propagating the myth that 64-bit really means something."

Hey, marketing-type people need to make a living, you know? Stuff like that would get them sacked instantly :)
 