So How Many Frames Per Second Is Enough, Anyway?

SoulCollector

Registered
I mean, look at the Quake benchmarks for the G5 and the Xeons: something like 240 to 325 fps. That's ridiculous. Up to what point do you even stop noticing the lag?
That's what I want to know: do we really need that many FPS for games like that? Man, I'm lucky to get 40, or maybe even 30, on my dual 1 GHz
with 1.4 GB of RAM and an old GeForce 4 MX...

So, an average of like 70 to 90 is nice, right?
How much smoother do we need it?
Basically, I just wanted to know: how many FPS do we really need before we can't tell that one computer is pushing more frames than another?

Just thinking...
G5Pimp. Also, can you imagine if the G4 didn't have its bus bottleneck? I think it would go toe to toe with a P4. Well, then there's the megahertz gap, so who knows. Damn Motorola for that bottleneck; I feel it every time I play a demanding 3D game.
 
I, for one, will not be satisfied until my computer pre-cognitively renders my games' scenes before they happen.
 
Originally posted by Jet
I think the human eye can't tell the difference above 60 fps.
On an LCD screen you wouldn't notice, but on a CRT monitor, if you change the refresh rate from 60 Hz to something higher, there's a definite difference: your eyes won't hurt as much :)

FPS is basically a measurement of performance that the marketing departments like to throw around to impress people. It's an easy enough concept that all their customers can understand, and companies use that as leverage to sell a product.
 
Any FPS rating higher than the screen's refresh rate is pointless - the card is just going to drop frames, since only one frame goes on the screen per screen refresh.

However, a stupidly high frame rate (i.e. faster than your screen can possibly redraw itself) on today's games does mean there's a good chance you'll still be getting pleasantly playable frame rates (i.e. somewhat over 60 fps) on games a couple of years from now.
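To put the dropped-frames point in numbers, here's a minimal sketch. The figures are hypothetical examples, not from any real benchmark:

```python
def effective_fps(render_fps: float, refresh_hz: float) -> float:
    """The display shows at most one frame per refresh, so the on-screen
    frame rate is capped at the refresh rate; anything the card renders
    beyond that is simply dropped."""
    return min(render_fps, refresh_hz)

# A card pushing 300 fps on an 85 Hz CRT still shows only 85 fps;
# the other 215 frames per second never reach the screen.
shown = effective_fps(300, 85)   # 85
dropped = 300 - shown            # 215
```

The headroom isn't wasted, though: it's the margin that keeps you above the refresh rate when scenes get heavier.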
 
If you can achieve 300 fps, then you can crank up all the graphic options and set the resolution to the max your monitor will support and the game will still be very playable. That's what those high numbers achieve for you.
 
It also means complex scenes will have less impact on your gameplay experience. This matters most in multiplayer games, where you don't want skipped frames: you need to see what's happening on screen at all times, and you want your system to handle your input consistently. Being able to run the game well at higher resolutions helps too: if you're going for a headshot at 640x480, the head might only be a couple of pixels to aim at, but at 1600x1200 there are more pixels to aim at, so you can aim with a bit more precision. Add smooth gameplay on top of that and your frag count could go up :)
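The resolution point can be made concrete with some back-of-the-envelope math. The 0.5% head size here is an invented illustration, not a measurement from any actual game:

```python
def head_size_px(res_w: int, res_h: int, frac: float = 0.005):
    """Pixels a head-sized target spans on screen, assuming it covers
    a fixed fraction (an invented 0.5%) of each screen dimension."""
    return round(res_w * frac), round(res_h * frac)

# Same in-game target, two resolutions:
head_size_px(640, 480)     # (3, 2):  roughly 6 pixels to aim at
head_size_px(1600, 1200)   # (8, 6):  roughly 48 pixels to aim at
```

Same target on screen, but several times more pixels to land the crosshair on at the higher resolution.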
 
It's like the top speed of your car, or its horsepower: it's always more than enough, but even so, more is better!
 
Thanks for the replies, guys. So, that being said, does that mean the G5 will be a screaming gaming machine? I mean, look at the bandwidth and the front-side bus; it's just crazy. Will this architecture do us gamers justice? Also, one more question: I have a dual 1 GHz G4.
I want to get the G5, in due time.
But I'd also like to get the 9800 Pro. How much more power does the dual 1.42 have over my dual 1 GHz, gaming-wise? And would I see a drastic performance jump from my crappy GeForce 4 MX? Because man, I'm really wanting to get it. And please break down "drastic" for me, if anyone already has it.

Payceeeee..G5pimp
 
Most games now depend more on the graphics card than on the CPU, and every card is old once it's six months old... there is no ultimate game machine.
 
And keep in mind that movies (as in Hollywood, big screen) are only 24 fps, and NTSC videotaped programs are only 30fps.

That said, I often find it distracting in a theater when there's a lot of side-to-side camera movement: I see the flicker. Bleah.
 
Originally posted by SoulCollector
Thanks for the replies, guys. So, that being said, does that mean the G5 will be a screaming gaming machine? I mean, look at the bandwidth and the front-side bus; it's just crazy. Will this architecture do us gamers justice?
Trick a G5, any G5, out with a top-of-the-line graphics card and lots of RAM, and it will shred any game you throw at it without hesitation.
Also, one more question: I have a dual 1 GHz G4.
I want to get the G5, in due time.
But I'd also like to get the 9800 Pro. How much more power does the dual 1.42 have over my dual 1 GHz, gaming-wise? And would I see a drastic performance jump from my crappy GeForce 4 MX? Because man, I'm really wanting to get it. And please break down "drastic" for me, if anyone already has it.
Okay, I'm confused... what are you asking? Could you try to be a little clearer, like you were writing an essay or something? I'd like to help you, but I don't understand what you're trying to say. :confused:
Payceeeee..G5pimp
Ha, don't you wish. ;) :rolleyes: ::alien::
 
Originally posted by brianleahy
And keep in mind that movies (as in Hollywood, big screen) are only 24 fps, and NTSC videotaped programs are only 30fps.

That said, I often find it distracting in a theater when there's a lot of side-to-side camera movement: I see the flicker. Bleah.

Movies also have motion blur in the frames which ends up helping with the illusion of fluid movement at lower framerates.
The interlaced nature of TV, along with that motion blur, helps hide the 30 fps.
 
The human eye can't accurately perceive a difference above roughly 60-80 fps. If you ran a truly blind test of the same game on the same computer, with the switch between 60 fps and 120 fps made from outside the room, I doubt anyone could actually tell the difference. There may be special circumstances for certain effects, but in general it's a moot point. More likely it's a mental thing, similar to a doctor giving a placebo: your mind plays tricks on you because you already know, or think, there's a difference.

In other words, if you get 60fps or so at the resolution you want with no problems, then that's all you need.

Someone else mentioned this, and it's similar. Why would you need a Corvette Z06 that can do 0-60 mph in 4.3 seconds with a 170 mph top speed? The fastest you can go on the majority of US roads is between 50 and 70. I can attest to having gone 150 in a buddy's '01 Pontiac Firehawk, and I will tell you, it's downright scary, especially if you understand the mechanics involved in a wreck and the forces exerted. It's something I'll never do again in a car that isn't a professionally designed race car on a track.

Interesting Tidbits:
------------------------
* National Television System Committee (NTSC) format: used in the United States, Canada, Japan and elsewhere. (30 fps)
* Phase Alternating Line (PAL) format: used in European countries and other parts of the world. (25 fps)
* Séquentiel couleur à mémoire (SECAM) format: mostly used in Eastern European countries. (25 fps)
* HDTV is approximately 60 fps

There's never been a truly independent scientific test; most are done by manufacturers, for whom the idea that it matters is in their best interest. NVIDIA and ATI want you to buy their latest card, after all.
 
Believe me, I can tell the difference between 60 fps and 120 fps: the 120 fps is the one that makes my eyes hurt less. Or is that the monitor's refresh rate? In any case, the more frames you can push, the better suited your computer is to rendering complex scenes with lots of particle effects, objects, etc. at high resolution with all the details and graphic goodies turned on.

And Chevy brought up the car thing. In that case, more is not always better.
 
Well thanks for all the replies guys,
I was just wondering whether I should stay with my dual 1 GHz and buy a Radeon 9800 Pro. I do a lot of Photoshop and graphic design, but at the end of the day I want to play some games. Would this setup give me a really good gaming experience, or should I just save up and fork out for the dual G5? I mean, no more bottleneck, and we'd really see Apple shine on games, right? Let me know; argh, this GeForce 4 MX is not cutting it for me...
G5pimpage...
 
You'll probably notice a lot more responsiveness in a G5 than your current system, and this is probably what Apple is hoping for, but unless you need more speed and less waiting time, then you'll be fine with your current setup. If you want to upgrade your graphics card, go for it. If later you decide that you'd like the G5 after all, you can always drop the 9800 into it and go back to your old card on the G4.

Currently, the fastest machine I have for games is a 400 MHz G3 iMac. So there.
 
Hmm, Arden, are you sure I can take the Radeon from my dual system and put it in the G5? Because isn't mine only a 4x AGP slot, while the G5's is 8x AGP?
Hmmm.
 
SoulCollector, the 9800 is an 8x card, but it will work fine at 4x, just at the slower rate <G>. Arden was trying to say that if you decide to go with a G5 after that, you'll already have the right card. The BTO option for the 9800 is a $300 increase on the dual 2 GHz G5, and $350 on the 1.6 and 1.8 GHz G5s. In other words, it might save you money. www.owcomputing.com has the 4x/2x version of the 9800 for $387; I didn't see an 8x version listed. So what Arden said may or may not work; it depends on finding an 8x version of the 9800. It may just be a typo on their site, or...?

I would personally look at it this way: what do you use the Mac for, first, second, and third? Do you need the speed increase the G5 will bring for any or all of those uses? What is your financial situation, and does it warrant buying a new computer? You could maybe get between $1400 and $1800 (rough estimate) for your current system. A G5 1.8 GHz with an ATI 9600 (a $50 upgrade) is $2450. If you sold yours for $1500 and bought the G5 for $2450, you'd need to find $950.

The ATI card alone was $390, so you'd be spending $560 more for the G5 than for upgrading your current system, but you'd be getting a lot more, too. I'd personally buy the G5; it would make better sense in my situation, considering those factors, if I had the same computer you currently do.
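For what it's worth, the dollar figures above work out like this (all prices as quoted in the post):

```python
g5_price   = 2450  # G5 1.8 GHz with the ATI 9600 BTO option
resale     = 1500  # rough resale value of the current dual 1 GHz G4
card_alone = 390   # Radeon 9800 Pro for the existing machine

out_of_pocket_g5 = g5_price - resale              # 950: cash needed for the G5
extra_over_card  = out_of_pocket_g5 - card_alone  # 560: premium over just upgrading
```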
 