jcorsico wrote: The visual cortex can only process ~10 frames a second, so we may have an upper limit on the update rate. I cannot imagine anything that needs to update that fast, as decision making probably takes longer than 1/3 of a second.
Hm? Not sure I understand this. If this statement were true, every movie, TV show, video game, etc., would have a frame rate of only 10 frames per second. But that's clearly not the case. Movies are shot at 24 frames per second, and look a bit jerky as a result. Video is typically 30 frames per second and looks smoother.
Talk to the video game guys: people spend tons of money on their computers to push the frame rate above 50 frames per second, because that's where the game looks smoothest to the eye.
People want to record HD video at 1080p at 60 frames per second, because that's what looks the best.
The frame rate matters enormously to how something looks. Three frames per second looks slow and jerky.
Regards,
Jon
If you turn the video rate down while looking at a still picture, you will see flashing at around 10 Hz.
"Update rate" and "video rate" are not the same.
With a movie, then yes, you will see jerking, and wagon wheels appearing to spin the wrong way, differently at different video rates.
What is changing that fast? Not the fuel level or oil pressure or water temp.
Maybe the RPM and that looks OK.
When I fill up at the bowser I cannot make sense of the cents digits, nor the 10c digits. Yes, I can see the blur, but it is only the dollars that I can really comprehend, and that is the only way I know whether the blur is increasing or decreasing. I recall it being easier when I was younger, so maybe I am getting old.
Again "Update rate" and "video rate" are not the same.
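The bowser observation above can be sketched numerically: a digit is only readable if it changes no more than about once per display update. This is a hypothetical back-of-the-envelope illustration, not from the thread; the $2.00/second price rate and the 10 Hz update rate are assumptions.

```python
# Hypothetical sketch: which digits of a fast-changing value are readable
# at a given display update rate. A digit place is treated as readable if
# it changes at most once per update; faster than that, it reads as blur.

UPDATE_HZ = 10        # assumed display update rate (updates per second)
PRICE_PER_SEC = 2.00  # assumed dollars added per second at the bowser

results = {}
for place, step in [("cents", 0.01), ("10-cent", 0.10), ("dollars", 1.00)]:
    changes_per_update = (PRICE_PER_SEC / step) / UPDATE_HZ
    results[place] = changes_per_update
    verdict = "readable" if changes_per_update <= 1.0 else "blur"
    print(f"{place:8s}: {changes_per_update:5.1f} changes/update -> {verdict}")
```

Under these assumptions the cents and 10c digits change several times per update (blur), while the dollars digit changes only every few updates, which matches the experience of only being able to track the dollars.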
I doubt that a higher update rate would result in any real gain. Does the (LCD?) display even flicker at the video rate? I doubt it is anything like a CRT, and it probably has little in common with a gaming computer, whose display actually is flashing.