Originally posted by vanguard
It's my understanding that those very high frame rates (115, 135, whatever) are achieved at low resolutions. If you prefer to play at 1024, 1152, etc., then the frame rates will drop to levels where you can actually see the difference between Mac and PC performance.
That's true -- lower resolutions usually equate to higher frame rates... but my point is that if you're running your monitor at, say, 640x480 or 800x600 at 85Hz, then the highest frame rate you'll be able to "see" is pretty much 85fps, because your monitor, running at 85Hz, is only showing you 85 frames per second. Likewise, if you run your monitor at a refresh rate of 100Hz, then 100fps is just about all you're gonna see. The game may report that it's banging out 200fps, but you're only "seeing" 100fps because your monitor is only refreshing the screen 100 times a second.
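(If it helps, here's a quick Python sketch of that cap -- just my own illustration, not anything the game or monitor actually computes, and the numbers and the visible_fps name are made up for the example:)

def visible_fps(rendered_fps, refresh_hz):
    # What you actually get to see per second is roughly the smaller of
    # what the game renders and what the monitor refreshes.
    return min(rendered_fps, refresh_hz)

print(visible_fps(200, 85))   # 85  -- game reports 200fps, but an 85Hz monitor shows ~85
print(visible_fps(100, 100))  # 100 -- render rate and refresh rate match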
For example, let's say I've got a monitor running 1024x768 @ 85Hz and I'm playing Quake III. Quake III tells me I'm pulling, say, 100fps. Now let me swap my card out for a billy-bad-ass card, run my monitor at 1024x768 @ 85Hz, and load up Quake III again. Quake III now tells me I'm pulling, say, 200fps. Well, my eyes can't tell the difference between 100fps and 200fps in that setup, because my monitor is only showing me 85Hz -- basically 85fps.
So, now let's say I've got a PC and a Mac sitting side by side, both running Quake III. The PC monitor is set to 1024x768 @ 85Hz, and so is the Mac monitor (that's a common resolution/refresh combination). The PC reports that Quake III is doing 150fps. The Mac reports that Quake III is doing 100fps. Either way, I'm only REALLY "seeing" 85fps due to the refresh rate of the monitor (85Hz = 85 refreshes per second). Barely a perceptible difference at all between the machines.
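(Same Python sketch as above, applied to this side-by-side example -- hypothetical numbers, same min() simplification, ignoring tearing and frame pacing:)

refresh_hz = 85
pc_rendered, mac_rendered = 150, 100

pc_visible = min(pc_rendered, refresh_hz)    # 85 -- PC is capped by the monitor
mac_visible = min(mac_rendered, refresh_hz)  # 85 -- so is the Mac

print(pc_visible, mac_visible)  # 85 85 -- both screens show you the same thing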