
some facts about frame rates!



06-18-2001, 10:34 PM
#1 the human eye can only see about 32 frames per second
#2 for high frame rates you must have a high refresh rate. For example, most TVs have between 50 Hz and 60 Hz, and expensive ones have 100 Hz, so they can display a VGA desktop more stably than normal televisions
(legible and readable)
#3 turn OFF VERTICAL SYNC in display properties
#4 or get the POWERSTRIP utility to do it for you
and force up your refresh rate
WARNING
(this can reduce a monitor's life and create flickering. That's why bad monitors give you sore eyes.)
#5 it's not just the video card that helps frame rate; the CPU does most of the geometry for the structural frame so the video card can paint in the textures.
That's why if you put a GeForce 3 with a Celeron 366 it won't make much difference, except it will look prettier. But with modern video cards taking strain off the processor
(ie T&L from NVIDIA) this may increase,
although support for these features can be scarce.
#6 in the end, a high frame rate just shows the video card's potential for rendering more graphically intense scenes
(ie higher res,32 bit colour, anti aliasing)
#7 finally, DON'T BE A CHEAPSKATE ON YOUR MONITOR. IT IS THE LEAST UPGRADED PART OF YOUR SYSTEM, YOU LOOK AT IT CONSTANTLY, IT CAN SAVE YOU FROM SORE EYES, AND IT CAN SHOW YOU YOUR HIGHER FRAME RATES.
THINGS TO LOOK FOR ARE:
size (at least 17 inch), refresh rate (at least 75 Hz), built-in on-screen options (the more the merrier), flatness (the flatter the better),
and ask yourself: is it worth buying second-rate to save $100?

I'VE GOT A 21 INCH FLAT SCREEN AT 100 HZ AND I'M NEVER GOING BACK!!! :)

Chromebender
06-19-2001, 05:35 AM
This probably isn't necessary, but lest anyone else become as confused as this person:

'fact' 1 is wrong: Most people can perceive flicker between 70 and 75 Hz. The Video Electronics Standards Association (VESA) recommends 85 Hz for a satisfactory image.

'fact' 2 is wrong: Most monitors have a default refresh rate of 60 Hz (not 50, and not TVs).

'facts' 3 and 4 are wrong: turning off vsync does not increase your monitor's refresh rate above its capacity. It simply allows the graphics card to draw to the monitor regardless of whether or not the monitor is currently in the process of refresh. That is why you get a 'jagged' screen. Part of the screen has been refreshed to the new image, part has not.
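The tearing described above is easy to model: the monitor scans rows out top to bottom, and if the card swaps the displayed image mid-scan, the top of the screen shows the old frame and the bottom shows the new one. A toy sketch of that (Python, purely illustrative; all names are made up):

```python
# Toy model of tearing: a "monitor" scans out rows top to bottom,
# while the "card" may swap the displayed image mid-scan.
def scan_out(rows, swap_at, old_pixel="A", new_pixel="B"):
    """Return the rows the viewer sees if the buffer swap lands
    after `swap_at` rows have already been scanned out."""
    screen = []
    for row in range(rows):
        # Rows scanned before the swap show the old frame,
        # rows scanned after it show the new frame -> a visible tear.
        screen.append(old_pixel if row < swap_at else new_pixel)
    return screen

# Vsync off: the swap can land anywhere, e.g. 40% down the screen.
torn = scan_out(rows=10, swap_at=4)
# Vsync on: the swap waits until the scan finishes, so every row matches.
clean = scan_out(rows=10, swap_at=0)

print("".join(torn))   # AAAABBBBBB  <- tear between rows 4 and 5
print("".join(clean))  # BBBBBBBBBB  <- whole frame from one image
```

The tear line sits wherever the swap happened to land, which is why it jumps around from frame to frame on a real screen.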

'fact' 6 is wrong: I think it was probably supposed to read 'high end resolution', not 'high end frame rate'.

P.S. Buy a monitor that can refresh at 120 Hz at high resolution so you can use 3D glasses at 60 Hz if you ever want to.

06-19-2001, 09:47 PM
fact 1
refers to the eye only being able to see 32 frames
per second, and has nothing to do with monitor frequency;
it is just a fact that anything more than 32 frames a second is negligible to the eye

fact 2
refers to televisions (not monitors), if you read it; most televisions do not go above 60 Hz,
so when displaying VGA with TV-out they look crap
(I should have been more precise)
facts 3 & 4

I do not say that turning off vertical sync increases your frame rate
(I should have clarified that, sorry), although it is a good thing.
Windows 9x is very ****ty at handling monitor refresh rates and installing
monitors; that is why I suggested PowerStrip to do the above options

fact 6
reads: "in the end, high frame rates"
not
"high end frame rate"

I've never had experience with owning 3D glasses and was unaware that
that was the case.
I checked my monitor again and it does support above 100 Hz

P.S.
I'm sorry that my facts may be a bit confusing,
but I didn't really check the wording properly like I should have.
You seem to know a hell of a lot about it all, and I hope you can make a revised fact list;
as there seem to be a lot of postings about frame rates and refresh rates which all
seem to have people confused, it would probably be of great help.
Thank you for your corrections, as I do not want to lead anyone astray :)

harsman
06-20-2001, 03:13 AM
Actually, fact 1 _is_ wrong. It's common to see this claim though; it comes from the fact that that's roughly the "refresh rate" of the rods and cones in the eye. The problem with the conclusion is that, unlike a monitor, the rods and cones don't submit info serially, but in parallel. It's perfectly possible for a normal person to discern between 30 and 60 fps. The difference isn't as great as most hardcore gamers think, though.

Chromebender
06-20-2001, 11:27 AM
Yes, I should have been clearer as well.

When gamers say that they want framerates above 60 fps, they're probably responding to the fact that many games key animation to framerate, so as framerate increases, the game can feel more responsive. They can't really tell the difference between the refresh rates, but they can observe the smoother animation.
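The point about games keying animation to framerate can be sketched like this: a game that moves things a fixed step every frame speeds up and slows down with fps, while a delta-time game covers the same distance per second regardless (Python sketch, illustrative only; the numbers are hypothetical):

```python
# Two ways a game might animate movement over one second of play.
def per_frame_position(frames, step_per_frame=1.0):
    # Old-style: move a fixed step every frame, so speed scales with fps.
    return frames * step_per_frame

def per_second_position(frames, fps, units_per_second=60.0):
    # Delta-time style: the per-frame step shrinks as fps grows,
    # so speed is independent of framerate.
    step = units_per_second / fps
    return frames * step

# One second of game time at two different framerates:
print(per_frame_position(30), per_frame_position(60))        # 30.0 60.0 - twice as fast at 60 fps
print(per_second_position(30, 30), per_second_position(60, 60))  # 60.0 60.0 - identical either way
```

In the first style, higher fps literally makes the game run faster, which is why it can feel more responsive even past the point where the refresh itself is indistinguishable.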

06-21-2001, 01:42 AM
Originally posted by Chromebender:
They can't really tell the difference between the refresh rates, but they can observe the smoother animation.
That is a contradiction.
I think the gamers just want to test how fast the hardware is.

06-21-2001, 02:58 AM
this all comes down to the question of whether benchmarking hardware using frame rate comparisons is really as applicable as it seems. I know it can show the potential throughput of the video card's power, but it is not really representative of the quality of the image displayed (ie: 3dfx's implementation of FSAA, or supersampling, compared to Nvidia's first attempt at anti-aliasing showed a fair bit of difference in picture quality, although with Nvidia's purchase of 3dfx and the coinciding improvements in the GeForce 3 (coincidence or not), the implementation of this process is much improved over the previous models.
ATI being more interested in innovation rather than raw speed is another example of something that benchmarking comparisons do not often show.)
so GENERALLY speaking it seems to be a bit of a trade-off between raw speed, which helps in frame rate,
OR
picture quality

I think I've gone off track myself :-/

HighCv2
06-21-2001, 04:06 AM
LOL, yes u did go off track :P

However if you're a gamer the framerate difference does count. A simple example of this is CS. When u run around u get up to 100 fps; my monitor allows for 75 Hz at 1024x768, so I am limited there to 75 fps. However, the second I run into or through the smoke grenade, the framerate cuts in half. U start cursing at the lamer who threw it in the first place. But what a difference response-wise: u can hardly turn in smoke, and that's happening at around 30 fps. I agree that framerate isn't everything, however my eye/hand coordination is completely off balance when running and shooting in smoke :*)

Chromebender
06-21-2001, 10:36 AM
The smoke grenade thing is a good example of why a lot of people turn vsync off:

Suppose I am able to run a game at exactly 60 fps with vsync disabled. Then some #@$@& llama throws a smoke grenade that, for the sake of argument, increases the scene complexity by a factor of 2.001. With vsync turned off, I get something like 29.999 fps. However, with vsync turned ON, the graphics card often finds that it can't quite finish rendering the scene before the monitor needs to refresh, causing an unnecessary delay of an additional refresh. The FPS might then be as low as 20 fps, a loss of 33%.

Complex factors such as rapidly fluctuating polycount, whether or not vsync is enabled, and whether or not stereo hardware is used are some reasons why people want framerates above the 'perceivable' 60 fps.
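The arithmetic behind that smoke-grenade drop: with vsync on and classic double buffering, each frame occupies a whole number of refresh intervals, so the effective rate is quantized to the refresh rate divided by an integer. A quick sketch (Python, illustrative; a simplified model that ignores triple buffering):

```python
import math

def vsync_fps(refresh_hz, raw_fps):
    """Effective framerate with vsync on, assuming a classic
    double-buffered swap that waits for the next refresh.
    Each frame then occupies a whole number of refresh intervals."""
    intervals = math.ceil(refresh_hz / raw_fps)  # refreshes per frame
    return refresh_hz / intervals

print(vsync_fps(60, 60.0))    # 60.0 - exactly keeping up
print(vsync_fps(60, 59.0))    # 30.0 - just missing a refresh halves the rate
print(vsync_fps(60, 29.999))  # 20.0 - the smoke-grenade case: 3 refreshes per frame
```

So a card that renders at just under a divisor of the refresh rate pays a disproportionate penalty, which is one more reason raw headroom above 60 fps matters.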

HighCv2
06-22-2001, 04:11 AM
Very nice post Chrome :)

However I do experience some tearing with VSync off, therefore I leave it on.

06-29-2001, 07:30 PM
if you've got a fast card, turn on vertical sync;
it looks a lot better