As for framerate... technically, the human eye has a persistence of vision cap at around 20fps. Movies are all 24fps, and they look great. The reason behind 60fps (fields per second), is because it can use AC power as a regulator, which cycles at 60 times per second. Early CRTs couldn't keep a pixel lit for a whole cycle without looking flickery, which is why they invented interlacing. So 60fps is kind of unnecessary, just one of those standards like QWERTY keyboards, that is there for no other reason than issues with outdated hardware.
I was doing my best to just ignore this comment because I knew responding to it could not possibly be brief, but then somebody actually went and took it seriously. There is so much wrong in these few sentences that I don't even know where to begin. Heck, sometimes there are multiple factual errors in one sentence!
But I'll give it a shot.
"technically, the human eye has a persistence of vision cap at around 20fps."
Utter nonsense. The human eye does not have a 'persistence of vision cap' at all. It doesn't even see in terms of frames per second (not 'fields per second'... whatever the heck that is) in the first place. However, there does come a point where the brain can no longer distinguish between framerates. That point is nowhere near 20fps; it's roughly 60fps, and for some people it's even higher than that.
"Movies are all 24fps, and they look great."
Movies are 24fps, I'll give you that. That's the result of a standard developed in the 1920s, and frankly there is room for improvement. The only reason movies look good at 24fps is motion blur. Each image is not a sharp still spliced together with other stills; when you film something in motion you get a series of blurry frames, and your brain fills in the detail to understand what it's looking at. This happens naturally on film because of the exposure time.
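To put a rough number on it (my own back-of-the-envelope illustration, not something from the original post, and assuming the common 180-degree shutter convention): at 24fps each frame integrates about 21 milliseconds of motion, which is where the smear comes from.

```python
# Rough illustration: how much real-world motion gets smeared into one film
# frame. Assumes the common 180-degree shutter convention; real productions vary.
def exposure_time(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Seconds of light gathered per frame."""
    return (shutter_angle_deg / 360.0) / fps

blur_window = exposure_time(24)  # ~0.0208 seconds
print(f"Each 24fps frame integrates ~{blur_window * 1000:.1f} ms of motion")
```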
However, motion blur doesn't come for free. You are literally looking at blurry images, and there is an associated loss of detail. For instance, there are times when it is impossible to read text on something moving in a movie even though, from the corresponding vantage point in real life, you would have been able to. But these effects are subtle and most people don't notice them. There have been actual studies showing people motion-heavy footage recorded at higher framerates, and yes, they absolutely can tell the difference in a side-by-side comparison. An interesting point, though, is that people have become so used to looking at 24fps movies that when shown something at a higher framerate they will often comment that it looks 'fake' or 'cheap', despite the fact that they're seeing more detail. 24fps is simply how people think movies are supposed to look.
BTW, it is possible to do motion blur in video games, but the usual shortcut, blending or interpolating between rendered frames, is extra work and doesn't look as good as just running at a higher framerate in the first place. That's why nobody bothers.
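For what it's worth, here's a minimal sketch of the crudest version of that shortcut: mixing a fraction of the previous rendered frame into the current one (an accumulation-style blur). The frame shapes and the blend_factor value are my own assumptions for illustration, not anything out of a real engine.

```python
import numpy as np

def blend_motion_blur(prev_frame: np.ndarray, cur_frame: np.ndarray,
                      blend_factor: float = 0.3) -> np.ndarray:
    """Naive accumulation-style blur: mix a fraction of the previous frame
    into the current one. Cheap, but it smears everything uniformly instead
    of blurring along each object's actual motion."""
    return (1.0 - blend_factor) * cur_frame + blend_factor * prev_frame

# Toy usage: two 1080p RGB frames of noise standing in for real renders.
prev = np.random.rand(1080, 1920, 3)
cur = np.random.rand(1080, 1920, 3)
blurred = blend_motion_blur(prev, cur)
```

Doing it properly means tracking per-pixel motion and blurring along it, which is exactly the extra work most games skip.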
"The reason behind 60fps (fields per second), is because it can use AC power as a regulator, which cycles at 60 times per second."
Complete and utter nonsense. AC power has nothing to do with framerates. (Early analog TV did pick its field rate to sit near the local mains frequency so that power-line interference wouldn't crawl across the picture, but that's a refresh-rate concern, not a framerate one.) Most electronics convert to DC internally anyway, and even without taking that into account, framerate != refresh rate.
"Early CRTs couldn't keep a pixel lit for a whole cycle without looking flickery, which is why they invented interlacing."
First of all, interlacing is a total red herring here. Second of all, you are again confusing framerate and refresh rate. A CRT with a refresh rate of 60Hz absolutely does look flickery. How much it bothers you depends on the person, but I can't even look at a 60Hz screen; it hurts my eyes. That's why most (if not all...) CRTs run at more than 60Hz. The refresh rate refers only to how often the electron beam rescans the screen. A CRT running at 120Hz while displaying 60fps content simply refreshes twice for each frame, and that kind of thing is completely normal for a CRT.
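To make the distinction concrete, here's a toy calculation (my own illustration, not something from the thread): how many times the screen redraws per source frame for a few refresh-rate/framerate combinations.

```python
def refreshes_per_frame(refresh_hz: float, framerate_fps: float) -> float:
    """How many times the display redraws for each new source frame."""
    return refresh_hz / framerate_fps

# A 120Hz CRT showing 60fps content redraws each frame twice; the content is
# still 60fps even though the screen refreshes 120 times per second.
for refresh_hz, fps in [(60, 60), (120, 60), (85, 24)]:
    print(f"{refresh_hz}Hz display, {fps}fps content -> "
          f"{refreshes_per_frame(refresh_hz, fps):.2f} refreshes per frame")
```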
"So 60fps is kind of unnecessary, just one of those standards like like QWERTY keyboards, that is there for no other reason than issues with outdated hardware."
Given that everything you said to support this statement is nonsense, I guess it shouldn't be a surprise that your conclusion is nonsense.

Some more reading for you: