My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.

  • Max-P@lemmy.max-p.me · 17 points · 1 year ago

    Lots of good answers already, but I’d also add: if you have the opportunity to visit a computer store like a Micro Center, or anywhere with gaming monitors on demo, try one out for a few minutes. Run a first-person game if you can (there are plenty of basic WebGL demos online), or run testufo in a browser.

    It’s really hard to imagine the smoothness without experiencing it, which is why a lot of people say that once they’ve experienced it, they can’t unsee it.

    24 fps is all you need to create the illusion of motion, and the brain fills in the gaps, but when you control the motion, especially with a high-precision mouse, it really breaks the illusion. Your brain can’t fill the gaps anymore, because the motion can go anywhere at any time, extremely fast. Even just dragging windows around on the desktop, you can feel the difference. I instantly know when my 144Hz monitor isn’t running at 144. It also becomes a matter of responsiveness, as others said.
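    To put rough numbers on that, here’s a quick back-of-the-envelope sketch (the mouse sweep speed is an illustrative assumption, not a measurement) showing how long each frame sits on screen and how far a fast-moving cursor jumps between one frame and the next:

    ```python
    # How long each frame stays on screen, and how far a hand-controlled
    # cursor can move in that time at different frame rates.

    def frame_time_ms(fps: float) -> float:
        """Time each frame is displayed, in milliseconds."""
        return 1000.0 / fps

    # Assumed mouse sweep speed (illustrative): 2000 px/s across the screen.
    MOUSE_SPEED_PX_PER_S = 2000

    for fps in (24, 30, 60, 144, 240):
        ft = frame_time_ms(fps)
        jump = MOUSE_SPEED_PX_PER_S * ft / 1000.0
        print(f"{fps:>3} fps: {ft:5.1f} ms per frame, cursor jumps ~{jump:5.1f} px between frames")
    ```

    At 24 fps each frame sits on screen for over 40 ms and a quick mouse sweep jumps dozens of pixels between frames; at 144 or 240 fps those gaps shrink to a few milliseconds and a handful of pixels, which is the smoothness people describe.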

    High refresh rates are also more effective at higher resolutions, because at 30fps maybe the object will need to travel 100px per frame, but at 240fps that same object moves just 12.5px per frame, eight times as often. It’s probably fine when you’re watching a TV somewhat far away, but when the monitor is 32 inches and 3 feet in front of you, you notice a lot more.
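    As a concrete illustration of the resolution point, here’s a small sketch (the one-second crossing time is an assumption for the example) computing the per-frame jump for an object that crosses the full screen width in one second:

    ```python
    # Per-frame step for an object crossing the full screen width in a
    # fixed time, at different horizontal resolutions and refresh rates.

    CROSS_TIME_S = 1.0  # assumed time to cross the screen (illustrative)

    resolutions = {"1080p": 1920, "1440p": 2560, "4K": 3840}
    refresh_rates = (30, 60, 144, 240)

    for name, width_px in resolutions.items():
        for hz in refresh_rates:
            step = width_px / (CROSS_TIME_S * hz)  # px moved between consecutive frames
            print(f"{name:>6} @ {hz:>3} Hz: ~{step:6.1f} px per frame")
    ```

    The bigger that per-frame step, the more the motion reads as discrete jumps rather than smooth movement, and a large monitor close to your face makes those jumps easier to see.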

    • Kale@lemmy.zip · 3 points · 1 year ago

      A decade ago I had a little extra money and chose to buy a 144 Hz gaming monitor and video card. I don’t have great eyesight, nor do I play games that require twitch reflexes, but even then 144 Hz (with the game configured to run at >100 fps) was very noticeable. I’d much rather play at 1080p and >100 fps than at 4K and 60 fps or below.

      This may differ between people. I don’t believe I have great eyesight, depth perception, color perception, etc., but I am really sensitive to motion. I built my second computer (an AMD Athlon 64, I think?) and spent a significant sum on a CRT with higher refresh rates. I can’t use a CRT at 60 Hz: I perceive the flicker and get a headache after about 20 minutes. I couldn’t use Linux on that computer (I was stuck at 60 Hz on that kernel/video driver) until I saved up even more to buy an LCD monitor. I can’t perceive a 60 Hz flicker on an LCD, and 60 Hz is fine for work.

      But for gaming, a high refresh rate is noticeable, even for someone who normally doesn’t notice visual stuff, like me.