Originally posted by codiac2600: What you guys are misreading is Hz and FPS.
Actually there's a much greater connection than you think. Back when television was still analog, the systems used the mains electricity frequency to synchronise. That frequency is 60 Hz (110 V) in the US and 50 Hz (220 V) in Europe, so we get 60 FIELDS (not FRAMES) per second for NTSC (strictly 59.94 once color was added) and 50 fields per second for PAL/SECAM. A FIELD is half a frame, containing only the even or only the odd lines of that frame. Two consecutive fields make up a FRAME, although an interlaced system never projects a complete frame at once. The technique was effective and is still in use, but it produces many artifacts, most famously the "combing" (what some call the "zebra") artifact and of course flickering.
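To make the field/frame relationship concrete, here's a minimal sketch in Python. The array names and the tiny frame size are just my own illustration, not anything from an actual TV standard, but the weaving logic is exactly what "two fields make a frame" means:

```python
# Minimal sketch: weaving two interlaced fields back into one full frame.
# Sizes are illustrative (a real PAL frame has 576 visible lines, NTSC 480);
# the interleaving logic is the same regardless.

WIDTH, HEIGHT = 8, 6  # a tiny "frame" so the output is easy to eyeball

# Each field holds half the lines of the frame: the even lines in one field,
# the odd lines in the other. Here a "line" is just a list of pixel labels.
even_field = [[f"E{y}" for _ in range(WIDTH)] for y in range(0, HEIGHT, 2)]
odd_field  = [[f"O{y}" for _ in range(WIDTH)] for y in range(1, HEIGHT, 2)]

def weave(even, odd, height):
    """Interleave even and odd fields into one complete progressive frame."""
    frame = [None] * height
    frame[0::2] = even   # even field fills lines 0, 2, 4, ...
    frame[1::2] = odd    # odd field fills lines 1, 3, 5, ...
    return frame

frame = weave(even_field, odd_field, HEIGHT)
for line in frame:
    print(line[0])  # prints E0, O1, E2, O3, E4, O5 -- one full frame
```

This weave is also why the combing artifact happens: the two fields are captured 1/60 (or 1/50) of a second apart, so anything that moved between them ends up torn into alternating lines when they're stitched together.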
Progressive systems, by contrast, show the entire frame at once, hence the "p" in formats like 1080p, where the quoted rate really is full frames per second.
The flicker frequency of your monitor, or refresh rate as some call it, is rather obsolete with modern display technology (pixels in a TFT do not have to flicker) and really applies only to displays that still have analog inputs (all TVs, most PC monitors). What it has come to mean is simply how many frames your monitor can display every second, with no flickering involved.
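As a quick worked example of what the number actually means (the figures are just the common NTSC/PAL rates from above):

```python
# Refresh rate -> how long each field or frame stays on screen.
for hz in (50, 60, 59.94):
    period_ms = 1000.0 / hz
    print(f"{hz} Hz -> one field/frame every {period_ms:.2f} ms")

# Interlaced systems deliver fields, so the full-frame rate is half the field rate:
print(f"NTSC: 60 fields/s = {60 / 2} full frames/s")
print(f"PAL/SECAM: 50 fields/s = {50 / 2} full frames/s")
```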
When all video becomes digital, monitor refresh rates will eventually be abandoned and replaced by actual fps.