There’s an interesting editorial at Mikhailtech regarding FPS, refresh rates and what the brain really distinguishes. Although he has no scientific background, he has assembled some interesting facts. You can get more information on how the brain interprets visual data here.
My Opinion: It is important to clearly understand the difference between refresh rates (screen updates, related to screen flickering) and FPS (the number of pictures per second, related to animation smoothness) in order to understand how the brain works. For example, the movies you normally see in a cinema run at 24 FPS with a 48 Hz flicker rate. Having a higher FPS than your refresh rate is, however, total nonsense. Regarding games on a computer display, certain experts have told me that 40 FPS (due to sharper, less transitional images) at 85 Hz (due to the lower persistence of phosphor) could be considered ideal.
The most common “frame rate” on a television set is 24, with ranges that can go from 18-30 (these are approximations). I use quotes because TVs don’t work the same way as computer screens. A television set doesn’t render individual frames; instead it provides a range.
I stopped reading here. TVs work at a fixed frame rate, not a ‘range’: 29.97 fps (not 30, due to the NTSC color subcarrier) in the US/Japan and 25 fps in Europe (PAL). For each ‘frame’ there are two fields (odd and even) that are displayed interlaced, giving each standard an effective ‘image frequency’ of ~60 Hz and 50 Hz. This 0v3r-cl0ck3r needs to spend more time studying and less time benchmarking his Q3 rig.
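For anyone who wants to check those numbers, here is the back-of-the-envelope arithmetic (my own illustration in Python, not something from the post):

# Rough arithmetic for the interlaced TV standards mentioned above.
ntsc_frame_rate = 30000 / 1001   # ~29.97 frames/s (NTSC colour timing)
pal_frame_rate = 25.0            # 25 frames/s (PAL)
fields_per_frame = 2             # odd + even interlaced fields

print(f"NTSC field rate: {ntsc_frame_rate * fields_per_frame:.2f} Hz")  # ~59.94 Hz
print(f"PAL  field rate: {pal_frame_rate * fields_per_frame:.2f} Hz")   # 50.00 Hz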
The technical “limit” would be anything from the speed of light, to how long it takes the light to reach your eyes, to how fast your brain can interpret it.
The human eye definitely does have an upper limit bound by the chemicals that it uses for photoprocessing. About 10 times a second the human eye flushes with these chemicals that allow the organ to detect photons at all. This isn’t to say that you ‘see’ at 10Hz, but that your eyes can’t sustain an infinite amount of information.
On top of all that, your eye doesn’t ‘see’ per frame. It interpolates between multiple frames to enhance whatever detail you are focusing on. This can take a fraction of a second, but it is real and can be exploited to ‘hide’ subliminal messages in movies, tv, etc. (currently illegal here in the US).
This article is a joke.
Had you continued, your conscience would have prevented you from such a cavalier dismissal of it.
That error has little relevance to his larger point about how the human eye (or, really, the visual cortex) perceives motion. I suggest you go back and read what you missed, and then fashion your rebuttal.
I can’t wait to see how BEOS gets introduced into this discussion.
“BEOS is so great I can run 14 simultaneous mpeg2 streams each at 100FPS on my p200 while compiling source and still never have a single frame skip.”
“Yeah, well my 14mhz Amiga could do that 15 years ago!”
“Nooooo! You guys are all evil. Those are all proprietary solutions. Had they been covered by the GPL they would still be alive today. BTW Jesus was pro-GPL. I recall him telling me this in a dream”
I’d say that 50-60 fps is smooth… but generally anything above 30 fps (minimum) is acceptable…
Anything above 100 fps just seems a waste of resources, especially when they could be used for things like sound and AI. I would rather have 50 fps and a good AI/soundtrack than 100 fps and a crap AI and soundtrack…
> “BEOS is so great I can run 14 simultaneous mpeg2
> streams each at 100FPS on my p200 while compiling source
> and still never have a single frame skip.”
> “Yeah, well my 14mhz Amiga could do that 15 years ago!”
A 14 MHz Amiga can only run standard MPEG-1 movies (VideoCD) at full speed and quality (24-bit genlockable graphics) with a Full Motion Video module installed.
Without such a module, decoding movies takes far too much processor horsepower. You can only watch MPEG-2 fully in software on high-end classic Amiga models with fast enough CPUs.
Boy you sure fell for that one didn’t you?
What most people forget about framerates is this:
Simple question:
Your Monitor runs with a vertical refresh rate of, let’s say 85Hz. What does this mean?
Answer:
It means that the whole picture on your screen is redrawn 85 times per second. That’s a technical fact.
Fact:
The big error is to think that you can see a framerate higher than 85 Hz on that screen.
You say:
Hey, wait, but the FPS display in my game says I’m running at 230 FPS. I bought an expensive GeForce4 Ti to get these numbers.
Fact:
Yes, that’s right! The GPU of your GeForce and your water-cooled, overclocked Athlon XP 😉 running at 2.5 GHz are able to recalculate the 3D environment of your game 230 times per second in the memory of your GeForce. But then something unfair happens:
The analog part of your GeForce, the digital-to-analog converter (RAMDAC), grabs the contents of your GeForce’s memory only 85 times per second and sends the analog display signal to your screen at that rate.
So people, as long as you don’t get a 230 Hz refresh on your screen, you are not even able to see 230 FPS.
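A minimal sketch of that point in Python (my own illustration with made-up numbers, and ignoring tearing): the screen never shows more unique frames per second than its refresh rate.

# The RAMDAC scans out one frame per refresh cycle, so the number of
# frames that actually reach the screen is capped by the refresh rate.
def visible_fps(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

print(visible_fps(230, 85))  # 85 -> the other 145 rendered frames never reach the screen
print(visible_fps(60, 85))   # 60 -> here the GPU, not the monitor, is the limit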
The benefit of higher framerates is mostly a marketing issue, invented by the marketing guys at the graphics-card manufacturers. They want you to buy a new card so you too can get the 1850 FPS that your neighbor’s card gets.
That’s the real fact about framerates.
Ralf.
Funny, rajan r was the first to call me “Bourma” instead of “Bouma”… Is it meant to be insulting or something? If English is your native language can anyone please enlighten me?
> Boy you sure fell for that one didn’t you?
I just like a healthy discussion. Please contribute instead of trolling.
I can’t wait to see how BEOS gets introduced into this discussion. He he, but I wouldn’t mind seeing some teapot benchmarks.
No subliminal messages hidden in movies… this is aaaaage-old nonsense. Go help yourselves, Google is your friend; don’t carry on this fairy tale in the third millennium.
> I can’t wait to see how BEOS gets introduced into this
> discussion.
OK, BeOS is a great operating system. IMO it would have been a great foundation for Mac OS X instead of a slow Mach kernel.
> “BEOS is so great I can run 14 simultaneous mpeg2
> streams each at 100FPS on my p200 while compiling source
> and still never have a single frame skip.”
The fact is that ordinary Hollywood movies are recorded at only 24 frames per second. That means that if you watch one on BeOS or AmigaOS at normal speed you will only get 24 FPS. If you got 200 FPS you would be watching the movie at a very fast “fast forward” speed.
However, almost nobody complains after going to a cinema or watching a Hollywood movie on an ordinary television, saying things like “that movie was playing really jerky”.
> However, almost nobody complains after going to a cinema or watching a Hollywood movie on an ordinary television, saying things like “that movie was playing really jerky”.
I guess then I’m almost nobody…
If there’s a moderately fast camera pan of e.g. a bright wall of a house against a darker background, I see it jerk like hell. And it does annoy me. Especially in the cinema, where there is no phosphor afterglow.
Anyone else seeing this?
This guy’s comments about TVs (seems he has not heard of progressive scan), LCDs (‘they’re progressive’… err, OK… with that analog tuner input, eh?), brightness vs darkness (a TV set with 100 Hz refresh, no clue about super-black), motion blur vs sharpness (what an idiotic comparison to make, love the stupidity of the ‘still shots’ comment)… are some of the most absurd I have ever seen. There should be a law against articles such as this.
Because your NTSC set is doing 59.94 Hz @ 29.97 fps, but your source content is 23.976 fps. To go between the two, you need an extra frame every 4 frames. If you have a decent TV set, it may be able to handle the 3:2 pulldown (aka televised cinema, aka telecine) in hardware, removing most of the visible effect (best if you have a progressive-scan TV). If not, you will see it in all its glory (jerk, jerk, jerk).
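For anyone curious, here is a small sketch of the 3:2 cadence in Python (my own illustration, not from the post): each film frame is held for alternately three and two video fields, so four film frames become ten fields, which is how 23.976 fps maps onto 59.94 fields/s.

# 3:2 pulldown: hold film frames for 3, 2, 3, 2 ... video fields.
film_frames = ["A", "B", "C", "D"]
cadence = [3, 2, 3, 2]

fields = []
for frame, held_for in zip(film_frames, cadence):
    fields.extend([frame] * held_for)

print(fields)                    # ['A','A','A','B','B','C','C','C','D','D']
field_rate = 60000 / 1001        # ~59.94 fields/s (NTSC)
print(field_rate * len(film_frames) / len(fields))  # ~23.976 film frames/s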
No, this article was no joke. He included some interesting facts, even though he doesn’t have a sufficient scientific background to fully interpret the data. IMO he should be applauded for his efforts!
If you read hardware reviews or marketing benchmarks you will notice they always ignore the limitations of the human brain and visual input sensors.
IMO scientists should debate with graphics-card manufacturers about an optimal frame rate/refresh rate for games. Then, instead of trying to get 1000 FPS at 1000 Hz, they could concentrate mainly on image quality and more features.
FPS higher than the refresh rate doesn’t matter?
Technically it’s true because you can’t see it.
But anyone except the most stupid uses this number as an indication of the performance of the graphics card, which will be useful when newer games arrive that can use the increased power.
Before the R300, no video card could run at 1600×1024 with 4x FSAA and anisotropic filtering at a solid 60 FPS, so more video-card performance was necessary.
Now the R300 is mostly there, but I don’t think it will be able to render Doom 3 at that speed…
For the performance measurement you mention, the number of triangles per second is much more important than FPS. The number of triangles per second that your card can render is what makes Doom 3 fly or makes it choppy. From this value you can estimate quite precisely how fast your game will run. But if you have two cards which run, let’s say, Unreal at 65 and 90 FPS, you have no real guarantee that either can run Doom 3 at more than 30 FPS: neither the 65 FPS card nor the 90 FPS card. The game engines of new and old games are too different to compare.
Ralf.
(I think before I post!)
> But anyone except the most stupid use this number as an
> indication of the performance of the graphic card
No, there are many people who claim to see a difference. But when you put them in front of two monitors, for example the first one showing a game running at 50 FPS at 100 Hz and the other showing the same game at 200 FPS at 200 Hz, the proportion of correct picks of the one with the higher FPS is around 50%. That 50% is simply the rate you would get by chance.
> which will be usefull when you will have newer games
> which can use the increased power.
I partially agree. However, there was recently a Matrox Parhelia review, for example, that used the FPS produced in a mode without FSAA enabled and compared it to other cards not supporting FSAA. As the Parhelia chip performs very well with FSAA enabled, and future games are likely to support FSAA, benchmarks of these modes are very misleading with regard to future gaming performance/quality.
These mode benchmarks could still be considered relevant for current and older games that don’t support FSAA. However, the Parhelia already clearly produces excellent benchmark results for current and past game titles.
> No, there are many people who are claiming to see a
> difference.
I know some people who claimed to see the difference between 100 FPS and 200 FPS. When I told them that the display they were using was only refreshing at 60 Hz, they fell silent and accepted that they were wrong.
Such an effect is what we call a placebo effect in the medical world. Sometimes doctors give patients a fake but harmless medicine to cure them. Studies have shown incredible medical results from the use of placebo medicine. In the end, only the result is what matters.
Here’s the reason I think hardware/game makers have been obsessed with driving FPS very high. This happens in Windows, but might happen elsewhere (dunno). Since you cannot take over the machine and/or OS entirely (at best you only use high-priority threads at full screen), the OS still does other things in the background. So you sometimes see hiccups in the framerate, especially, say, right after loading a level from the hard disk while the OS is still managing caches and such. After a while with no disk access the hiccups become less and less frequent and eventually they disappear (on powerful enough machines). I am not referring here to the case where a camera angle results in rendering more polys than the framerate can handle, although that could be used as another example. Thus, framerate obsession comes from wanting to eliminate frame hiccups. I do not claim this is the truth, it is just my belief… Think about it: if there were no hiccups, then running your game at 30 fps would be OK. You would get used to it and not really notice in the long run.
All of this brings up an interesting point: if I could do hiccup-free 30 fps, do you have any idea how much more free *CPU* time I would get? Because the CPU takes part in rendering: it tells the graphics card (“GPU”) exactly what to draw and where.
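To put some rough numbers on that (my own arithmetic in Python, with a made-up per-frame cost, purely illustrative):

# Per-frame time budget at a few target frame rates.
for fps in (30, 60, 100, 230):
    print(f"{fps:3d} fps -> {1000.0 / fps:5.1f} ms per frame")

# If rendering a frame took, say, ~8 ms (hypothetical figure), capping at
# 30 fps would leave roughly 33.3 - 8 = 25 ms of every frame for AI, sound,
# caching and whatever the OS is doing in the background.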
Funny, rajan r was the first to call me “Bourma” instead of “Bouma”… Is it meant to be insulting or something? If English is your native language can anyone please enlighten me?
I misspelled it a lot of times (sorry :-P). I have no idea what either Bouma or Bourma means. Trust me on that one.
“However, almost nobody complains after going to a cinema or watching a Hollywood movie
on an ordinary television, saying things like “that movie was playing really jerky”.”
You can get away with a lower frame rate in a cinema because the light
level is much lower than on a TV screen. (That’s why the cinema has to
be darkened, while a TV can be watched in normal room lighting.)
A film projector actually opens the shutter twice for every frame, so
the frame rate is 24 per sec, but the flicker rate is 48 per sec.
Early or toy projectors without this trick were obviously flickery.
Likewise on TV, the frame rate is 25 or 30 per sec, but with interlace
the field rate is 50 or 60 per sec, so flicker is not so bad. However,
because the screen is so bright, many people can see flicker on a TV,
especially in the bright parts of the scene.
My guess is that 100 images per sec would be flicker free for
everybody, but motion blur in the images (which is automatic in film
and video) gives noticeably smoother movement. The question is, can it
be calculated in time? In 3D rendering (Lightwave etc), motion blur is
very CPU-intensive.
>>In 3D rendering (Lightwave etc), motion blur is
>>very CPU-intensive.
Don’t confuse rendering (which 3D games do) with raytracing (which Lightwave does).
The two are completely different processes with different goals. Rendering is about speed, raytracing is about photo-realism.
Ralf.
Hey, I still have an A1200 sitting around here someplace. Sold the 060 accelerator card on eBay for $450 a couple of years ago. Nice little machine – I didn’t much like the idea of having to pay for a TCP/IP stack though, or for the oodles of other software which I only had in demo mode. That’s probably why I quit using it.
As for my previous comments, I was only interested in stirring the pot a bit.
> I didn’t much like the idea of having to pay for a
> tcp/ip stack though
Well, at least AmigaOS 4 will include a TCP/IP stack as standard, just like AmigaOS 3.5/3.9 did. When AmigaOS 3.0 (1992) and 3.1 (1993) were released it was a very different market, and far fewer people were using the internet.
It must have been somewhere around 1991, at the age of 14, that I first used the internet.
> It must have been somewhere around 1991 at an age of 14
> when I first used the internet.
I just read some old threads from around 1990 on comp.sys.amiga.misc and it is just hilarious, especially if you take into account modern computing!
People complaining about AmigaOS 2.0 needing *1 MB* of RAM to run, or people thinking that the A2000 is inelegant because of the “enormous” size of the box (roughly the same size as any PC nowadays). Good reading!
I’ll go with e1 on his/her thoughts. If I have a graphics card that can push 200 fps in an ’empty’ area, and I move into an area that’s ‘heavily populated’, even if it drops to 100 or 90 fps, as long as it’s still smooth that would be good.
Spoken like a man whose graphics card is inadequate. Sure, you may get 230 fps at 640×480 in 16-bit color in Quake III, but that same graphics card will only get 40 fps at 1024×768 in Doom III. That’s the real reason gamers buy fast graphics cards, not some marketing-induced “my d**k’s bigger than yours” stereotype. And try running 3D Studio on your ATI Rage Pro. Even my GF4 MX only gets a few fps on complex scenes.
No, there are many people who claim to see a difference. But when you put them in front of two monitors, for example the first one showing a game running at 50 FPS at 100 Hz and the other showing the same game at 200 FPS at 200 Hz, the proportion of correct picks of the one with the higher FPS is around 50%. That 50% is simply the rate you would get by chance.
>>>>>>>>>
70% of all statistics are made up on the spot. What I’d like to know is where you got a monitor that refreshes at 200 Hz.
> 70% of all statistics are made up on the spot
No. Most of the values found in the article and in my comments are not made up.
> What I’d like to know is where you got that monitor that
> refreshes at 200 Hz.
The absolute upper limit on the monitor’s frame rate is set by its maximum vertical refresh frequency. And yes, there are video cards and monitors available (for example LG and NEC monitors, or professional Philips projectors) that support 200 Hz refresh rates.
> Then instead of trying to get 1000 FPS at 1000 Hz they
> could then concentrate mainly on image quality instead
> and more features.
Only those figures were made up by me, as I just wanted to make a point. But I thought that was obvious, or not?
Funny, rajan r was the first to call me “Bourma” instead of “Bouma”… Is it meant to be insulting or something? If English is your native language can anyone please enlighten me?
What, you’ve never heard of a typo? If someone misspelling your name is the worst thing to happen to you online, then I envy you.
I just like a healthy discussion. Please contribute instead of trolling.
Contribute to what? The discussion you were hoping to start with your article, or the flamebait about Amigas that you took? And I’m the troll, huh?
After seeing everybody bashing the author, I went there and actually read the article (ooh, horror! shock!). Don’t assume, though, that I’ll change my saner style of commenting without reading… 🙂
Now, folks, when the guy says TVs have a range of refresh rates, I guess he’s talking about different TV standards across countries, OK? Not the same TV with varying vertical refresh rates, methinks… %^P
All in all, he looks mostly right, even when he says that there is “no proven limit to human…”. Of course, the speed of light is vastly faster than the reaction time of the eye’s cone cells or the electrochemical transmission of nerve signals. And, indeed, some biologist or M.D. has probably already done a study of human vision, not to mention other professionals (like those measuring worker productivity in plants). But the point is, I suppose, that human perception has not been maxed out.
And he even mentions that people with 100+ Hz refresh rates will not perceive degradation in adverse conditions (complex picture, increased resolution, etc.).
Now, the lesson I learn when I read such articles is to form a well-thought-out *opinion* about how the technical, cost-related, human-related and usage-related factors work together.
So, in this case, my choice would be to increase the resolution up to the point where a scene full of players shooting, with particles, bullets, dynamic lighting, etc., still doesn’t cause the fps to fall below, say, 30.
If that means I’ll get 200 fps in a 72 Hz still picture of a dark corner — ok, it’ll be overkill and I’ll live with that.
BTW, IMHO, one of the coolest things is doing better than I thought would be possible on old or limited hardware.
“And it does annoy me. Especially in the cinema, where there is no phosphor afterglow. Anyone else seeing this?”
Of course, it’s annoying as hell. Any time the camera pans past something like a guy in a black suit against a bright background, it’s gratingly obvious. Even in moderately slow-moving shots of scenery you can tell.
I think you guys are talking about something different than I am. I meant the animation smoothness. Yesterday I went to the cinema and everything looked like it was moving very smoothly.
I was not talking about screen flickering; in fact I stated earlier, in one of the comments on an OSNews Parhelia article, that cinema screens aren’t entirely flicker-free.
It is very important to understand the difference between refresh rates (screen flickering) and FPS (motion smoothness) in order to understand this issue (although the refresh rate can also limit the FPS).
So IMO 24 FPS in Hollywood movies is perfectly smooth for me.
> It is very important to understand the difference between refresh rates (screen flickering) and FPS (motion smoothness) in order to understand this issue (although the refresh rate can also limit the FPS).
pherthyl and I are definitely talking about FPS and not flickering. Camera pan: an object moves from position A to position B on the screen in one second. At 24 FPS this gives you 24 steps between A and B. These steps could be 10 cm apart, which we think is too much. 48 FPS would decrease this to 5 cm, 96 FPS to 2.5 cm…
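Putting those numbers in one place (my own arithmetic in Python, assuming the object crosses about 2.4 m of the projected image in one second, the figure implied by the 24-steps-of-10-cm example):

# Step size of a panning object at different frame rates.
distance_m = 2.4
for fps in (24, 48, 96):
    print(f"{fps:2d} fps -> {distance_m / fps * 100:.1f} cm per frame")
# 24 fps -> 10.0 cm, 48 fps -> 5.0 cm, 96 fps -> 2.5 cm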
Am I missing something?
OK, then you see something that I don’t. Personally I have never heard people complain while watching a Hollywood movie at 24 FPS (e.g. on DVD) on a good television or computer.
Also interesting, for instance: if I move a flashlight incredibly fast in a dark room, I see bright lines instead of a dot of light moving very fast. I believe this must be caused by the limitations of human visual perception.
So IMO there are clear limits to our human perception.
A moving flashlight in a dark room will appear as a streak because the
visual system takes time to see.
The best source I know for information on these things is Cornsweet’s
book on visual perception. It’s out of print but you should be able to
find it in a library.
> A moving flashlight in a dark room will appear as a
> streak because the visual system takes time to see.
If that were the only reason, the interpretation of the visual data would simply be delayed.
However, it is IMO very interesting how easily our brain can be misled by motion. Take a look, for instance, at the following spinning wheel. Just stare at the center for half a minute and then look at something else.
http://www.grand-illusions.com/Optical.exe
Notice #1:
Do you like “unfocused”/blurred text and details?
If so – let’s go.
The problem is simple: the frequency-response characteristics of the analog path from the GPU to the CRT.
With the resolutions mentioned above, your hardware should allow undistorted throughput of square pulses at a frequency of about 273 MHz. That means that, in order to avoid blur in small details, the throughput range should be 5-10 times those 270 MHz, in ALL parts along the way. The better cards (in that sense) from Matrox had very good throughput, up to 500-600 MHz, if I recall correctly. Monitors are usually worse, especially those intended for the broad consumer market. It also means a big rise in power consumption and heat, which causes a lot of problems for that hardware.
Notice #2: Interference with indoor lighting, especially fluorescent lighting, and with the 50/60 Hz (Europe/USA) mains radiation and its harmonics (100/120 Hz respectively).
This interference produces “jerking” at the difference frequency: e.g. with a 90 Hz refresh rate in Europe you will be affected by “blinking” at 40 Hz (tolerable) and 10 Hz (VERY bad).
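A small sketch of that beat-frequency arithmetic (my own illustration in Python): the visible “blinking” appears at the difference between the refresh rate and the lighting frequency or its first harmonic.

# Beat frequencies between a CRT refresh rate and mains-powered lighting.
refresh_hz = 90          # example refresh rate from the post
mains_hz = 50            # European mains frequency

for light_hz in (mains_hz, 2 * mains_hz):   # 50 Hz and 100 Hz components
    print(f"{refresh_hz} Hz refresh vs {light_hz} Hz light -> {abs(refresh_hz - light_hz)} Hz beat")
# 90 vs 50 -> 40 Hz (tolerable), 90 vs 100 -> 10 Hz (VERY visible)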
So, don’t always believe the marketing/advertising from hardware manufacturers; believe your brain, and look around your workplace.
PS: LCD monitors are different, so these notices cover the CRT case.