Like I’m not one of THOSE. I know higher = better with framerates.

BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.

The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!

… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.

Yet like.

I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the gamefeel. But checking the FPS counter it was at… 45.

And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?

  • dontbelievethis@sh.itjust.works · 16 hours ago

    Probably consistency.

    Zelda was 20 fps, but it was 20 fps all the time so your brain adjusted to it. You could get lost in the world and story and forget you were playing a game.

    Modern games fluctuate so much that it pulls you right out.

  • dukeofdummies@lemmy.world · 18 hours ago

    Well a good chunk of it is that older games only had ONE way they were played. ONE framerate, ONE resolution. They optimized for that.

    Nowadays they plan for 60, 120, and if you have less, too bad. Upgrade for better results.

  • AdrianTheFrog@lemmy.world · 20 hours ago

    Game design is a big part of this too. In particular, first-person games or anything else with fine camera control feel very bad when mouse movement is lagging.

    I agree with what the other commenters are saying too: if it feels awful at 45 fps, your 0.1% low frame rate is probably more like 10 fps.
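
    For anyone wondering how those “lows” are computed, here’s a rough sketch. The frame-time log is made up, not measured from Fallout 4, and different overlays define the percentile slightly differently; the point is that a handful of long frames barely move the average but dominate the 0.1% low.

    ```python
    # Rough sketch of "1% low" / "0.1% low" FPS from a list of frame times.
    # Hypothetical data: ~22 ms frames (about 45 fps) with occasional long hitches.
    import random

    random.seed(1)
    frame_times_ms = [22.0 + random.uniform(-2, 2) for _ in range(2000)]
    for i in range(0, 2000, 150):
        frame_times_ms[i] = random.uniform(80, 100)   # the stutters you feel

    def low_fps(times_ms, fraction):
        """Average FPS over the slowest `fraction` of frames (0.01 -> "1% low")."""
        worst = sorted(times_ms, reverse=True)
        n = max(1, int(len(worst) * fraction))
        return 1000 * n / sum(worst[:n])

    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    print(f"average : {avg_fps:.0f} fps")                        # looks fine on a counter
    print(f"1% low  : {low_fps(frame_times_ms, 0.01):.0f} fps")
    print(f"0.1% low: {low_fps(frame_times_ms, 0.001):.0f} fps")  # this is what you feel
    ```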

  • Ironfacebuster@lemmy.world · 23 hours ago

    I think it has something to do with frame times.

    I’m not an expert, but I feel like I remember hearing that a low framerate with spiky frame times feels worse than a low framerate with steady frame times. Something to do with the delay before each frame actually gets displayed?

    As some of the other comments mentioned, it’s probably also the framerate dropping in general.
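
    For what it’s worth, frame time is just the inverse of frame rate (1000 ms / fps), and the reason people talk about it separately is that averages hide hitches: one long frame barely moves the average FPS but is exactly the stutter you feel. A quick illustration with made-up numbers:

    ```python
    # Frame time vs. frame rate, plus how one hitch hides inside an average.
    for fps in (20, 30, 45, 60):
        print(f"{fps:3d} fps -> {1000 / fps:5.1f} ms per frame")

    # Roughly one second of "45 fps": 44 smooth ~20.5 ms frames plus one 100 ms hitch.
    times_ms = [20.5] * 44 + [100.0]
    avg_fps = 1000 * len(times_ms) / sum(times_ms)
    print(f"average: {avg_fps:.0f} fps, worst frame: {max(times_ms):.0f} ms "
          f"(momentarily {1000 / max(times_ms):.0f} fps)")
    ```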

  • Yermaw@lemm.ee · 1 day ago

    The display being at a higher resolution doesn’t help either. Running retro games on my fancy flatscreen hi-def massive TV makes them look and feel so much worse than on the smaller, fuzzier CRT screens of the time.

    I can’t stand modern games with lower frame rates. I had to give up on Avowed and a few other recent titles on the Series S because it makes me feel sick when turning the camera. I assume most of the later titles on Xbox will be doing this, as they’re starting to push what the systems are capable of and the Series S can’t really cope as well.

    • Toes♀@ani.social · 21 hours ago

      Are you using an OLED screen?

      I had to tinker with mine a fair bit before my PS1 looked good on it.

  • nanook@friendica.eskimo.com · 1 day ago

    @VinesNFluff Most people can’t honestly perceive any change in their visual field in less than 1/60th of a second except perhaps at the very periphery (for some reason rods are faster than cones and there are more rods in your peripheral vision) and even then not in much detail. So honestly, frame rates above 60 fps don’t really buy you anything except bragging rights.

        • binom@lemmy.world · 1 day ago

          led lights almost always run off rectified, smoothed supplies, so no flicker. i have one cheap old crappy bulb that doesn’t, and i can definitely perceive the flicker. it’s almost nauseating.

          • nanook@friendica.eskimo.com · edited · 1 day ago

            @binom If you film with a camera at the NTSC vertical reference rate of 59.94 Hz, you will see a beat note between the camera and the LED lighting, indicating the supply is not well filtered, if at all. Newer HiDef cameras mostly work at 24 Hz, and that IS a slow enough rate that you see jitter in movement; they will also show a beat note when recording under most LED lights. Many cheap LED lights just have a capacitive current limiter and that’s it. If you power them off 50 Hz mains you will see the flicker, and if you get dimmable LED lights they will NOT have a filter. But I don’t want to interfere with anyone’s bragging rights.

  • overload@sopuli.xyz · 1 day ago

    I think player expectations play a big role here. It’s because you grew up with 20 fps on Ocarina of Time that you accept how it looks.

    I’m pretty sure that game is not a locked 20 FPS and can jump around a bit between 15-20, so the argument that it is locked 20 and so feels smooth doesn’t really convince me.

    If that game came out today as an indie game it would be getting trashed for its performance.

    • Count Regal Inkwell@pawb.social (OP) · edited · 13 hours ago

      Funny

      I played a lot of Lunistice some time back. It’s a retro 3D platformer that has an option to cap the framerate at 20 for a “more authentic retro feel”. Fun lil’ game, even if I eventually uncapped the framerate because it’s also a high-speed and precision platformer and doing that at 20FPS is dizzying.

      And yes absolutely Zelda 64 chokes on its 20 frames from time to time. I played it enough (again, yearly tradition, which started when I first finished the duology in the mid-aughts) to know that.

      But that doesn’t change the fact that its absolute maximum is 20 and it still doesn’t feel bad to play.

  • Raltoid@lemmy.world · edited · 1 day ago

    Stuttering, but mostly it’s the FPS changing.

    Lock the FPS just below the lowest point it drops to, and suddenly it won’t feel as bad, since it’s consistent.
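
    In case it helps picture why that works: a cap is basically just padding every frame out to the same length, so pacing stays even. A generic sketch, not any particular game’s limiter (the cap value and workload here are hypothetical):

    ```python
    # Minimal frame limiter sketch: pad each frame so it takes at least 1/CAP_FPS
    # seconds. Capping just below the worst dips keeps frame pacing consistent.
    import random
    import time

    CAP_FPS = 40                      # hypothetical cap, just below the dips
    FRAME_BUDGET = 1.0 / CAP_FPS

    def run(frames, update_and_render):
        for _ in range(frames):
            start = time.perf_counter()
            update_and_render()                      # the actual game work
            elapsed = time.perf_counter() - start
            if elapsed < FRAME_BUDGET:
                time.sleep(FRAME_BUDGET - elapsed)   # pad fast frames to the cap

    # usage: a fake workload whose cost swings between 5 ms and 22 ms
    run(200, lambda: time.sleep(random.uniform(0.005, 0.022)))
    ```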

  • FreedomAdvocate@lemmy.net.au · 1 day ago

    If you’re not playing on a VRR display, 45fps *will* be horrible and stuttery. 30fps locked would feel significantly better.
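
    The reason it’s so rough without VRR: on a fixed 60 Hz panel every frame has to sit on screen for a whole number of refreshes, so a steady 45 fps comes out as an uneven 16.7 ms / 33.3 ms hold pattern. A small sketch of that (idealized, constant render times assumed):

    ```python
    # 45 fps on a fixed 60 Hz display with vsync: each frame is held for a whole
    # number of 16.7 ms refreshes, so hold times alternate instead of being even.
    # A VRR display would instead refresh whenever a frame is ready.
    REFRESH = 1000 / 60          # 16.7 ms per refresh
    FRAME = 1000 / 45            # 22.2 ms to render each frame (assumed constant)

    ready, vsync, shown_at = 0.0, 0.0, []
    for _ in range(10):
        while vsync < ready:     # wait for the first refresh after the frame is done
            vsync += REFRESH
        shown_at.append(vsync)
        ready += FRAME

    holds = [round(b - a, 1) for a, b in zip(shown_at, shown_at[1:])]
    print(holds)   # a repeating pattern like [33.3, 16.7, 16.7, 33.3, ...] ms
    ```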

  • Baggie@lemmy.zip · 1 day ago

    It depends on what you’re running, but often if the frame rate is rock solid and consistent it feels a lot less stuttery. Fallout games are not known for their stability or smooth performance, unfortunately.

    For comparison, Deltarune came out a few days ago, and that’s locked to 30 fps. Sure, it’s not a full 3D game or anything, but there’s a lot of complex motion in the battles and it’s not an issue at all. Compare that to something like Bloodborne or the recent Zeldas: even after getting used to the frame rate they feel awful, because they’re stuttering all the damn time.

  • Crozekiel@lemmy.zip · 1 day ago

    Part of it is about how close you are to the target FPS. The old N64 games were made to run at around 20-30 FPS, rates that divide evenly into the ~60 Hz refresh of the CRT TVs common at the time. Therefore, the animations of, well, basically everything that moves in the game could be tuned to that frame rate. It would probably look like jank crap if they made the animations have 120 frames for 1 second of animation, but they didn’t.

    On to Fallout 4… Oh boy. Bethesda jank. Creation Engine game speed is tied to frame rate. They had several problems with the launch of Fallout 76 because if you had a really powerful computer and unlocked your frame rate, you would be moving 2-3 times faster than you should have been. It’s a funny little thing to do in a single-player game, but absolutely devastating in a multiplayer game. So, if your machine is chugging a bit and the frame rate slows down, it isn’t just the rate of new images appearing that slows down, it’s the speed at which the entire game does stuff. It feels bad.

    And, as others have said, frame time, dropped frames, and how stable the frame rate is make a huge difference in how it “feels”, too.
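
    To make the “game speed tied to frame rate” point concrete (a generic sketch, not actual Creation Engine code): if movement is applied as a fixed amount per frame, the world literally runs faster or slower with the framerate, whereas scaling by the measured frame time keeps it constant.

    ```python
    # Frame-rate-dependent vs. frame-rate-independent movement (generic sketch).
    def simulate(fps, seconds, per_frame_step=None, speed_per_sec=None):
        """Advance a position for `seconds` of wall time at a steady fps."""
        dt = 1.0 / fps
        pos = 0.0
        for _ in range(int(seconds * fps)):
            if per_frame_step is not None:
                pos += per_frame_step          # tuned for one framerate only
            else:
                pos += speed_per_sec * dt      # scaled by frame time (delta time)
        return pos

    # Step size tuned assuming 60 fps (0.1 units/frame = 6 units/sec at 60 fps):
    print(simulate(60, 10, per_frame_step=0.1))   # ~60 units in 10 s, as intended
    print(simulate(144, 10, per_frame_step=0.1))  # ~144 units: 2.4x too fast
    print(simulate(45, 10, per_frame_step=0.1))   # ~45 units: too slow

    # Delta-time version: same distance no matter the framerate.
    print(simulate(60, 10, speed_per_sec=6.0))    # ~60
    print(simulate(144, 10, speed_per_sec=6.0))   # ~60
    ```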

  • palordrolap@fedia.io · 1 day ago

    Two things that haven’t quite been mentioned yet:

    1) Real life has effectively infinite FPS, so you might expect that the closer a game is to reality, the higher your brain wants the FPS to be in order for it to make sense. This might not be true for everyone, but I imagine it could be for some people.

    More likely: 2) If you’ve played other things at high FPS you might be used to it on a computer screen, so when something is below that performance, it just doesn’t look right.

    Neither of these might be entirely accurate on its own; some combination of them and the other things mentioned elsewhere is probably at play.

    Source: kind of an inversion of the above. I can’t focus properly if games are set higher than 30 FPS; it feels like my eyes are being torn in different directions. But then, the games I play are old or deliberately blocky, so they’re not particularly “real” looking, and I don’t have much trouble with real life’s “infinite” FPS.

  • Ghostalmedia@lemmy.world · 2 days ago

    Some old games are still pretty rough at their original frame rate. I recently played four-player GoldenEye on an N64, and that frame rate was pretty tough to deal with. I had to retrain my brain to process it.

    • Crozekiel@lemmy.zip · 1 day ago

      Out of curiosity, did you have an actual N64 hooked up to a modern TV? A lot of those old games meant to be played on a CRT will look like absolute dog shit on a modern LCD panel. Text is harder to read, it’s harder to tell what a shape is supposed to be… it’s really bad.

  • lurch (he/him)@sh.itjust.works · 2 days ago

    old games’ animations were sometimes made frame by frame. like the guy who drew the character pixel by pixel was like “and in the next frame of this attack the sword will be here”

  • tomkatt@lemmy.world · 2 days ago

    Couple of things. First, frame timing is critical, and modern games aren’t programmed as close to the hardware as older games were.

    Second is the shift from CRT to modern displays. LCDs have inherent latency that is exacerbated by lower frame rates (again, related to frame timing).

    Lastly, with the newest displays like OLED, because of the way the screen updates, lower frame rates can look really jerky. It’s why TVs have all that post-processing and why there are no “dumb” TVs anymore. Removing the post-processing improves input delay, but it also removes everything that makes the image smoother, so higher frame rates are your only option there.

    • Lojcs@lemm.ee · edited · 1 day ago

      First time hearing that about OLEDs, can you elaborate? Is it that the lack of inherent motion blur makes it look choppy? As far as I can tell that’s a selling point, one that even some non-OLED displays emulate with backlight strobing, not something displays try to get rid of.

      Also, the inherent LCD latency thing is a myth: modern gaming monitors have little to no added latency even at 60 Hz, and at high refresh rates they are faster than 60 Hz CRTs.

      Edit: to be clear, this is about the screen’s refresh rate; the game doesn’t need to run at a high frame rate to benefit.

      • tomkatt@lemmy.world · edited · 1 day ago

        I don’t understand all the technicals myself, but it has to do with the way every pixel in an OLED is individually self-lit. Pixel transitions can be essentially instant, but due to the lack of any ghosting whatsoever, it can make low-framerate motion look very stilted.

        Also, the inherent LCD latency thing is a myth: modern gaming monitors have little to no added latency even at 60 Hz, and at high refresh rates they are faster than 60 Hz CRTs.

        That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.

        The LCD latency has to do with input polling and timing based on display latency and polling rates. Also, there’s added latency from things like wireless controllers as well.

        The actual frame rate of the game isn’t necessarily relevant: if you have a game running at 60 fps on a 120 Hz display and enable black frame insertion, you will have reduced input latency at 60 fps due to doubling the refresh rate of the display, which increases the polling rate since it’s tied to frame timing. Black frame insertion or frame doubling doubles the frame, cutting input delay roughly in half (not quite that, because of overhead, but hopefully you get the idea).

        This is why, for example, the Steam Deck OLED has lower input latency than the original Steam Deck. It can run at up to 90 Hz instead of 60, and even at lowered Hz it has reduced input latency.

        Also, regarding LCDs, I was referring more to TVs, since we’re talking about old games (I assumed consoles). Modern TVs have a lot of post-processing compared to monitors, and in a lot of cases there’s going to be some delay because it’s not always possible to turn it all off. The lowest-latency TVs I know of are LGs, as low as 8 or 9 ms, while Sony tends to be awful: between 20 and 40 ms even in “game mode” with processing disabled.

        • moody@lemmings.world · 1 day ago

          That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.

          Essentially, the speed of the beam determined how many lines you could display, and the more lines you tried to display, the slower the screen was able to refresh. So higher resolutions would have lower max refresh rates. Sure, a monitor could do 120 Hz at 800x600, but at 1600x1200, you could probably only do 60 Hz.
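
          The limiting spec is the horizontal scan rate (lines drawn per second), roughly vertical resolution times refresh rate, which is why those two example modes land on about the same beam speed. A back-of-the-envelope check (ignoring blanking lines, so these are slight underestimates):

          ```python
          # Rough CRT check: horizontal scan rate ~= lines per frame * refresh rate.
          # Blanking lines are ignored here, so real numbers run a bit higher.
          modes = [(800, 600, 120), (1600, 1200, 60), (640, 480, 60)]
          for w, h, hz in modes:
              print(f"{w}x{h} @ {hz} Hz -> ~{h * hz / 1000:.0f} kHz horizontal scan")
          # 800x600 @ 120 Hz and 1600x1200 @ 60 Hz both need ~72 kHz: the same beam
          # speed, which is why a CRT trades resolution against refresh rate.
          ```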

        • Lojcs@lemm.ee · 1 day ago

          Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.

          That’s why I specified 60 Hz :)

          I see that you meant TVs specifically, but I think it’s misleading to call processing delays “inherent”, especially since the LG TV you mentioned (which I assume runs at 60 Hz) is close to the minimum possible latency of 8.3 ms.

          • tomkatt@lemmy.world · 1 day ago

            True, but even that is higher than the latency was on the original systems on CRT. My previous comments were specific to display tech, but there’s more to it.

            Bear in mind I can’t pinpoint the specific issue for any given game, but there are many.

            Modern displays, even the fastest ones, have frame buffers for displaying color channels. That’s one link in the latency chain. Even if the output were otherwise as fast as a CRT’s, this would cause more latency in 100% of cases, as CRT was an analogue technology with no buffers.

            Your GPU has a frame buffer that’s essentially never less than one frame, and often more.

            I mentioned TVs above re: post processing.

            Sometimes delays are added for synchronizing data between the CPU and GPU in modern games.

            Older consoles were simpler and didn’t have shaders, frame buffers, or anything of that nature. In some cases the game’s display output would literally race the beam, altering display output mid-“frame.”

            Modern hardware is much more complex and despite the hardware being faster, the complexity in communication on the board (CPU, GPU, RAM) and with storage can contribute to perceived latency.

            Those are some examples I can think of. None of them alone would be that much latency, but in aggregate, it can add up.

            • Lojcs@lemm.ee · edited · 1 day ago

              The latency numbers of displays, i.e. the 8-9 or 40 ms, include any frame buffer the display may or may not have. If the number is less than the frame time, it is safe to assume the display isn’t buffering whole frames before showing them.

              Your GPU has a frame buffer that’s essentially never less than one frame, and often more.

              And sometimes less, like when vsync is disabled.

              That’s not to say the game is rendered from top left to bottom right as it is displayed, but since the render time has to fit within the frame time, you can be certain that the render started no more than one frame time before it finished, and the frame is displayed on the next vsync (if vsync is enabled). That’s 22 ms for 45 fps, plus another 16 ms for a worst-case vsync miss and 10 ms of display latency, making it 48 ms. Majora’s Mask at 20 fps would have 50 ms render + 8 ms display = 58 ms of latency, assuming it too doesn’t miss vsync.
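
              Same arithmetic laid out, with the assumed numbers from above (render-bound, vsync on, input polling not counted):

              ```python
              # The latency chain sketched above: render time (one frame), an optional
              # worst-case vsync miss, then display latency. Numbers are the assumptions
              # from the comment, not measurements.
              def latency_ms(fps, display_ms, vsync_hz=60, miss_vsync=True):
                  render = 1000 / fps                            # frame finishes within this
                  vsync_penalty = 1000 / vsync_hz if miss_vsync else 0
                  return render + vsync_penalty + display_ms

              print(latency_ms(45, 10))                   # ~49 ms (Fallout 4 case; prose rounds to 48)
              print(latency_ms(20, 8, miss_vsync=False))  # 58 ms  (Majora's Mask case)
              ```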