Why do low framerates *feel* so much worse on modern video games?
from VinesNFluff@pawb.social to nostupidquestions@lemmy.world on 06 Jun 20:18
https://pawb.social/post/25920892

Like I’m not one of THOSE. I know higher = better with framerates.

BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.

The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!

… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.

Yet like.

I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the gamefeel. But checking the FPS counter it was at… 45.

And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?

#nostupidquestions


BombOmOm@lemmy.world on 06 Jun 20:26 next collapse

It’s a few things, but a big one is the framerate jumping around (inconsistent frame time). A consistent 30fps feels better than 30, 50, 30, 60, 45, etc. Many games have a frame cap feature, which helps here. Cap the game at whatever framerate you can consistently hit in that game and your monitor can display. If you have a 60hz monitor, start with the cap at 60.
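For what it’s worth, here’s a minimal sketch of what a frame limiter does under the hood, assuming a simple single-threaded render loop (`render_frame` is a stand-in, not any real engine API):

```python
import time

TARGET_FPS = 60                      # whatever rate the machine can hold consistently
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.7 ms per frame at 60

def run_capped(render_frame, num_frames=600):
    """Render frames, sleeping away leftover budget so frames arrive at an even cadence."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()                                   # stand-in for drawing one frame
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                         # fast frames wait instead of racing ahead
        # slow frames simply run late; if that happens often, pick a lower cap
```

Real limiters busy-wait or use driver-level pacing for better precision, but the idea is the same: fast frames wait their turn so delivery stays even.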

Also, many games add motion blur, AI generated frames, TAA, and other things that really just fuck up everything. You can normally turn those off, but you have to know to go do it.


If you are on console, good fucking luck. Developers rarely include such options on console releases.

IceFoxX@lemmy.world on 07 Jun 12:01 collapse

30 50 30 60 30… That’s FPS. Frame time means the time between one frame and the next, not the count per second.
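Put as a formula, frame time is just the reciprocal of the instantaneous frame rate; a quick sketch of the conversion:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on a single frame at a given instantaneous frame rate."""
    return 1000.0 / fps

# A steady 30 fps is ~33.3 ms every single frame.
# Bouncing between 30 and 60 fps means frames alternate between ~33.3 ms and ~16.7 ms,
# and that variation in frame time is what reads as stutter, not the average fps.
print(frame_time_ms(30))   # ~33.3
print(frame_time_ms(60))   # ~16.7
print(frame_time_ms(45))   # ~22.2
```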

Lembot_0003@lemmy.zip on 06 Jun 20:33 next collapse

Stability of those fps is even more important. Modern games have problems with that.

[deleted] on 07 Jun 07:08 next collapse
.
prole@lemmy.blahaj.zone on 07 Jun 12:27 collapse

Because they all fuck with the frame timing in order to try to make the fps higher (on paper)

SolidShake@lemmy.world on 06 Jun 20:34 next collapse

Bro when Majora’s mask came out nothing was 60fps lol. We weren’t used to it like how we are today. I’m used to 80fps so 60 to me feels like trash sometimes.

VinesNFluff@pawb.social on 06 Jun 20:50 next collapse

Ackshuli – By late 2000 there were a couple games on PC that could get there.

… If you were playing on high-end hardware. Which most PC gamers were not. (despite what Reddit PCMR weirdos will tell you, PC gaming has always been the home for janky hand-built shitboxes that are pushed to their crying limits trying to run games they were never meant to)

Regardless that’s beside the point – The original MM still doesn’t feel bad to go back to (it’s an annual tradition for me, and I alternate which port I play) even though it never changed from its 20FPSy roots.

TheFogan@programming.dev on 06 Jun 20:51 next collapse

Yeah but even now you can go back and play Majora’s mask, and it not feel bad.

But as mentioned, the real thing is consistency, as well as the scale of action, pace of the game, etc. Zelda games weren’t sharp pinpoint-control games like, say, a modern FPS. Gameplay was fairly slow. And the second factor is simply that games that were 20 FPS were made to run at a 100% consistent 20 FPS. A game locked at 20 will feel way smoother than one that alternates between 60 and 45.

IceFoxX@lemmy.world on 06 Jun 21:02 collapse

Games aren’t optimized the way they used to be; the shortfall has to be compensated for with computing power, i.e. by the end user. It comes down to cost. On top of that, the scope of games has become much larger, making optimization more time-consuming and therefore more expensive. With consoles there’s also the fact that optimization has to target one specific hardware configuration, unlike PCs, where the range of available components keeps growing. Either way, the aim is to cut costs while maximizing profits.

otp@sh.itjust.works on 06 Jun 21:04 next collapse

Bro when Majora’s mask came out nothing was 60fps lol

Huh? 60fps was the standard, at least in Japan and North America, because TVs were at 60Hz/fps.

Actually, 60.0988fps according to speed runners.

bleistift2@sopuli.xyz on 06 Jun 21:18 next collapse

The TV might refresh the screen 60 times per second (or actually refresh half the screen 60 times per second, or actually 50 times per second in Europe), but that’s irrelevant if the game only throws 20 new frames per second at the TV. The effective refresh rate will still be 20Hz.

That’s just a possible explanation. I don’t know what the refresh rate of Majora’s Mask was.

otp@sh.itjust.works on 06 Jun 21:23 collapse

I’m pretty sure 16-bit era games were generally 60FPS

VinesNFluff@pawb.social on 06 Jun 23:00 collapse

Framerates weren’t really a

Thing.

Before consoles had frame-buffers – Because Framebuffers are what allow the machine to build a frame of animation over several VBlank Intervals before presenting to the viewer.

The first console with a framebuffer was the 3DO. The first console people cared about with a framebuffer was the PSX.

Before that, you were in beam-racing town.

If your processing wasn’t enough to keep up with the TV’s refresh rate (60i/30p in NTSC territories, 50i/25p in PAL) – Things didn’t get stuttery or drop frames like modern games. They’d either literally run in slow-motion, or not display stuff (often both, as anyone who’s ever played a Shmup on NES can tell you)

You had the brief window of the HBlank and VBlank intervals of the television to calc stuff and get the next frame ready.

Buuuut, as of the PSX/N64/Saturn, most games were running anywhere between 15 and 60 FPS, with most sitting at the 20s.

PC is a whole different beast, as usual.

_NetNomad@fedia.io on 06 Jun 23:24 next collapse

i think you're mixing up a few different things here. beam-racing was really only a thing with the 2600 and stopped once consoles had VRAM, which is essentially a frame-buffer. but even then, many games would build the frame in a buffer in regular RAM and then copy everything into VRAM at the vblank. in other cases you had two frames in VRAM and would just swap between them with a pointer every other frame.

if it took longer than one frame to build the image, you could write your interrupt handler to just skip every other or every third vblank interrupt, which is how a game like super hang-on on the megadrive runs at 15 FPS even though the VDP is chucking out 60 frames a second. you could also disable interrupts while the buffer was still being filled, which is how you end up with slowdown in certain games when too many objects were on the screen. too many objects could also push past the limit of how many sprites you can have on a scanline, which is why things would vanish, but that is its own separate issue. if you don't touch VRAM between interrupts, then the image shown last frame will show this frame as well
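a toy model of the "present only every Nth vblank" idea described above (plain Python standing in for what would really be an interrupt handler on the console's CPU; the names and numbers are illustrative):

```python
# toy model: the display fires a vblank 60 times a second no matter what,
# and the "game" only swaps buffers on every Nth vblank.

VBLANKS_PER_SECOND = 60
DIVIDER = 4                 # act on every 4th vblank -> 15 presented fps (the 15 fps case above)

frame_ready = False         # module-level state, mirroring how a real handler can't take arguments
vblank_count = 0
presented = 0

def build_frame():
    """stand-in for the game logic filling the back buffer / work-RAM copy."""
    global frame_ready
    frame_ready = True

def on_vblank():
    """interrupt handler: most vblanks just re-show whatever is already in VRAM."""
    global vblank_count, frame_ready, presented
    vblank_count += 1
    if vblank_count % DIVIDER == 0 and frame_ready:
        presented += 1      # this is where the buffer swap / copy into VRAM would happen
        frame_ready = False

for _ in range(VBLANKS_PER_SECOND):   # one simulated second
    build_frame()
    on_vblank()

print(presented)            # 15 presented frames out of 60 vblanks
```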

otp@sh.itjust.works on 08 Jun 00:40 collapse

The first console with a framebuffer was the 3DO. The first console people cared about with a framebuffer was the PSX.

I cared about the 3DO…

Thanks for the info though!

MichaelScotch@lemmy.world on 06 Jun 21:31 next collapse

FPS and alternating current frequency are not at all the same thing

otp@sh.itjust.works on 06 Jun 22:20 collapse

I was looking it up, and games like Super Mario World are allegedly at 60fps according to some random things on the internet

aubeynarf@lemmynsfw.com on 07 Jun 01:11 collapse

Because CRTs (and maybe other displays) are slaved to the input and insensitive to exact timing, and console chipset designers used convenient line counts/clock frequencies, consoles often have frame rates slightly different from standard NTSC (which is 60000/1001 or ~59.94 fields per second).

The 262 AND A HALF lines per field that NTSC uses to get the dumb oscillator in a CRT to produce interlacing is not convenient. “240p” moves the VSYNC pulse, shortening the frame duration.

So NESes run at ~60.1 FPS.

VinesNFluff@pawb.social on 06 Jun 23:01 collapse

Wrong but also not completely wrong.

tomkatt@lemmy.world on 06 Jun 23:29 collapse

F-Zero X ran at 60 fps. Also Yoshi’s Story, Mischief Makers, and probably a few others.

Also the PS1 has many games that ran at 60 fps, too many to list here in a comment.

over_clox@lemmy.world on 06 Jun 20:36 next collapse

My favorite game of all time is Descent, PC version to be specific, I didn’t have a PlayStation when I first played it.

The first time I tried it, I had a 386sx 20MHz, and Descent, with the graphics configured at absolute lowest size and quality, would run at a whopping 3 frames per second!

I knew it was basically unplayable on my home PC, but did that stop me? Fuck no, I took the 3 floppy disks installer to school and installed it on their 486dx 66MHz computers!

I knew it would just be a matter of time before I got a chance to upgrade my own computer at home.

I still enjoy playing the game even to this day, and have even successfully cross compiled the source code to run natively on Linux.

But yeah I feel you on a variety of levels regarding the framerate thing. Descent at 3 frames per second is absolutely unplayable, but 20 frames per second is acceptable. But in the world of Descent, especially with modern upgraded ports, the more frames the better 👍

VinesNFluff@pawb.social on 06 Jun 20:53 next collapse

Descent is pretty fun. Not as big of a fan as you are, but I definitely dig it.

Kvoth@lemmy.world on 06 Jun 21:05 next collapse

Great games. FreeSpace was almost mind-blowing when I first played it as well

over_clox@lemmy.world on 06 Jun 21:39 collapse

I haven’t actually played Free Space before, but I did manage to get a copy and archive it a few years ago.

I also got a copy of Overload and briefly tried that, but on my current hardware it only runs at about 3 frames per second…

The Descent developers were really ahead of their time and pushing gaming to the extreme!

Kvoth@lemmy.world on 07 Jun 06:58 collapse

Definitely give it a shot. It’s obviously different, but I loved it. My mom actually banned me from playing descent 3: vertigo, because she had vertigo and it made her sick

over_clox@lemmy.world on 07 Jun 07:09 collapse

Vertigo was actually an expansion on Descent 2, I made the NoCD patch for it via a carefully hex edited mod based on another NoCD patch for the original Descent 2.

Any which way, yeah, anyone with vertigo wouldn’t be comfortable or oriented in any way if they’re watching or playing the game, no matter what version.

Kvoth@lemmy.world on 08 Jun 01:29 collapse

Shit you’re right. It’s been too long

madjo@feddit.nl on 06 Jun 23:33 collapse

Descent broke my brain. I’m pretty good at navigating in FPSes, but Descent’s 4 axes of movement just didn’t click for me. I kept getting lost. I recently tried it again after many years, and I still just can’t wrap my head around it.

Same with space sims. I’m dog awful in dog fights.

over_clox@lemmy.world on 07 Jun 01:37 collapse

Indeed, it’s not quite a game for everyone, especially if you’re prone to motion sickness. Initially it only took me about a half hour to get a feel for the game, but configuring the controls can still be a headache.

Every time I set the game up on a new or different system, I usually go for loosely the same sort of controls, but with each new setup I might change them up a bit, like an endless guessing and testing game to see what controls might be ideal, at least for me.

By the way, Descent is considered a 6 Degrees Of Freedom game, not 4. But hey, at least they have a map feature, I’d go insane without the map sometimes…

madjo@feddit.nl on 07 Jun 09:54 collapse

I meant 6, not sure why I typed 4.

RobotZap10000@feddit.nl on 06 Jun 20:59 next collapse

FPS counters in games usually display an average across multiple frames. That makes the number legible when the FPS fluctuates, but if it fluctuates really hard on a frame-by-frame basis, the number can seem inaccurate. If I have a few frames here that were output at 20 FPS, and a few there that were at 70 FPS, the average of those would be 45 FPS. However, you could still very much tell that the framerate was either very low or very high, which would be perceived as stutter. Your aforementioned old games were probably frame-capped at 20 while still having lots of processing headroom to spare for more intensive scenes.
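A quick sketch of why an averaged counter can read 45 while the underlying frames alternate between fast and slow (the frame times below are made up for illustration):

```python
# Half the frames take 50 ms (20 fps territory), half take ~14.3 ms (70 fps territory).
frame_times_ms = [50.0, 14.3] * 30

# Averaging the instantaneous per-frame rates, as a simple counter might:
per_frame_fps = [1000.0 / t for t in frame_times_ms]
print(sum(per_frame_fps) / len(per_frame_fps))    # ~45 "average fps"

# Averaging the frame times instead tells a different story:
mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
print(1000.0 / mean_frame_time)                   # ~31 fps of actual throughput
```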

tomkatt@lemmy.world on 06 Jun 21:18 next collapse

Couple things. Frame timing is critical and modern games aren’t programmed as close to the hardware as older games were.

Second is the shift from CRT to modern displays. LCDs have inherent latency that is exacerbated by lower frame rates (again, related to frame timing).

Lastly with the newest displays like OLED, because of the way the screen updates, lower frame rates can look really jerky. It’s why TVs have all that post processing and why there’s no “dumb” TVs anymore. Removing the post process improves input delay, but also removes everything that makes the image smoother, so higher frame rates are your only option there.

Lojcs@lemm.ee on 06 Jun 22:44 collapse

First time hearing that about OLEDs, can you elaborate? Is it that the lack of inherent motion blur makes it look choppy? As far as I can tell that’s a selling point that even some non-oled displays emulate with backlight strobing, not something displays try to get rid of.

Also the inherent LCD latency thing is a myth, modern gaming monitors have little to no added latency even at 60hz, and at high refresh rates they are faster than 60hz crts

Edit: to be clear, this is the screen’s refresh rate, the game doesn’t need to run at hfr to benefit.

tomkatt@lemmy.world on 06 Jun 23:05 collapse

I don’t understand all the technicals myself but it has to do with the way every pixel in an OLED is individually self-lit. Pixel transitions can be essentially instant, but due to the lack of any ghosting whatsoever, it can make low frame motion look very stilted.

Also the inherent LCD latency thing is a myth, modern gaming monitors have little to no added latency even at 60hz, and at high refresh rates they are faster than 60hz crts

That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.

The LCD latency has to do with input polling and timing based on display latency and polling rates. Also, there’s added latency from things like wireless controllers as well.

The actual frame rate of the game isn’t necessarily relevant. If you run a 60 fps game on a 120 Hz display and enable black frame insertion, you will have reduced input latency at 60 fps, because doubling the refresh rate of the display increases the polling rate, which is tied to frame timing. Black frame insertion or frame doubling doubles the frames, cutting input delay roughly in half (not quite that, because of overhead, but hopefully you get the idea).

This is why, for example, the Steam deck OLED has lower input latency than the original Steam Deck. It can run up to 90Hz instead of 60, and even at lowered Hz has reduced input latency.

Also, regarding LCD, I was referring more to TVs, since we’re talking about old games (I assumed consoles). Modern TVs have a lot of post-processing compared to monitors, and in a lot of cases there’s going to be some delay because it’s not always possible to turn it all off. The lowest-latency TVs I know of are LG, as low as 8 or 9 ms, while Sony tends to be awful, between 20 and 40 ms even in “game mode” with processing disabled.

Lojcs@lemm.ee on 06 Jun 23:25 next collapse

Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.

That’s why I specified 60hz :)

I see that you meant TVs specifically but I think it is misleading to call processing delays ‘inherent’ especially since the LG TV you mentioned (which I assume runs at 60hz) is close to the minimum possible latency of 8.3ms.

tomkatt@lemmy.world on 06 Jun 23:40 collapse

True, but even that is higher than the latency was on the original systems on CRT. My previous comments were specific to display tech, but there’s more to it.

Bear in mind I can’t pinpoint the specific issue for any given game but there are many.

Modern displays, even the fastest ones, have frame buffers for displaying color channels. That’s one link in the latency chain. Even if the output were otherwise just as fast as a CRT, this would cause more latency in 100% of cases, as CRT was an analogue technology with no buffers.

Your GPU has a frame buffer that’s essentially never less than one frame, and often more.

I mentioned TVs above re: post processing.

Modern games sometimes add synchronization points between the CPU and GPU, which can add delays.

Older consoles were simpler and didn’t have shaders, frame buffers, or anything of that nature. In some cases the game’s display output would literally race the beam, altering display output mid-“frame.”

Modern hardware is much more complex and despite the hardware being faster, the complexity in communication on the board (CPU, GPU, RAM) and with storage can contribute to perceived latency.

Those are some examples I can think of. None of them alone would be that much latency, but in aggregate, it can add up.

Lojcs@lemm.ee on 07 Jun 00:31 collapse

The latency numbers for displays, i.e. the 8-9 or 40 ms, include any framebuffer the display might or might not have. If the number is less than the frame time, it’s safe to assume the display isn’t buffering whole frames before displaying them.

Your GPU has a frame buffer that’s essentially never less than one frame, and often more.

And sometimes less, like when vsync is disabled.

That’s not to say the game is rendered from top left to bottom right as it is displayed, but since the render time has to fit within the frame time, one can be certain that the render started one frame time before it finished, and it is displayed on the next vsync (if vsync is enabled). That’s 22 ms for 45 fps; another 16 ms for a worst-case vsync miss and 10 ms for the display latency makes it 48 ms. Majora’s Mask at 20 fps would have 50 ms render + 8 ms display = 58 ms of latency, assuming it too doesn’t miss vsync.
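Redoing that arithmetic explicitly, under the same assumptions (render time equal to one frame time, an optional worst-case vsync miss, plus the quoted display latency):

```python
def worst_case_latency_ms(fps, display_ms, vsync_miss_ms=0.0):
    render_ms = 1000.0 / fps            # render assumed to take one full frame time
    return render_ms + vsync_miss_ms + display_ms

# Fallout 4 dipping to 45 fps, ~10 ms display, one missed 60 Hz vsync:
print(worst_case_latency_ms(45, display_ms=10, vsync_miss_ms=1000 / 60))   # ~48.9 ms

# Majora's Mask at a steady 20 fps, ~8 ms display, no vsync miss:
print(worst_case_latency_ms(20, display_ms=8))                             # 58.0 ms
```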

moody@lemmings.world on 07 Jun 01:29 collapse

That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.

Essentially, the speed of the beam determined how many lines you could display, and the more lines you tried to display, the slower the screen was able to refresh. So higher resolutions would have lower max refresh rates. Sure, a monitor could do 120 Hz at 800x600, but at 1600x1200, you could probably only do 60 Hz.
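The trade-off falls out of the roughly fixed horizontal scan rate: max vertical refresh ≈ horizontal scan rate / total lines per frame. A sketch with hypothetical but plausible numbers for a late-90s monitor (the 75 kHz figure and the blanking-line count are assumptions, not specs of any particular model):

```python
# The beam can only sweep so many scanlines per second (the horizontal scan rate).
# Max vertical refresh ~= horizontal scan rate / total lines per frame (visible + blanking).

H_SCAN_HZ = 75_000            # hypothetical monitor rated for ~75 kHz horizontal scan

def max_refresh_hz(visible_lines, blanking_lines=28):
    total_lines = visible_lines + blanking_lines
    return H_SCAN_HZ / total_lines

print(max_refresh_hz(600))    # 800x600   -> ~119 Hz
print(max_refresh_hz(1200))   # 1600x1200 -> ~61 Hz
```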

lurch@sh.itjust.works on 06 Jun 21:49 next collapse

old games’ animations were sometimes made frame by frame. like the guy who drew the character pixel by pixel was like “and in the next frame of this attack the sword will be here”

Ghostalmedia@lemmy.world on 06 Jun 22:41 next collapse

Some old games are still pretty rough with their original frame rate. I recently played 4-player GoldenEye on an N64, and that frame rate was pretty tough to deal with. I had to retrain my brain to process it.

Crozekiel@lemmy.zip on 06 Jun 23:55 collapse

Out of curiosity, did you have an actual N64 hooked up to a modern TV? A lot of those old games meant to be played on a CRT will look like absolute dog shit on a modern LCD panel. Text is harder to read, it is harder to tell what a shape is supposed to be, it’s really bad.

Ghostalmedia@lemmy.world on 07 Jun 00:24 collapse

Trinitron baby

palordrolap@fedia.io on 06 Jun 23:15 next collapse

Two things that haven't quite been mentioned yet:

1) Real life has effectively infinite FPS, so you might expect that the closer a game is to reality, the higher your brain wants the FPS to be in order for it to make sense. This might not be true for everyone, but I imagine it could be for some people.

More likely: 2) If you've played other things at high FPS you might be used to it on a computer screen, so when something is below that performance, it just doesn't look right.

These might not be entirely accurate on their own and factors of these and other things mentioned elsewhere might be at play.

Source: Kind of an inversion of the above: I can't focus properly if games are set higher than 30FPS; it feels like my eyes are being torn in different directions. But then, the games I play are old or deliberately blocky, so they're not particularly "real" looking, and I don't have much trouble with real life's "infinite" FPS.

Crozekiel@lemmy.zip on 06 Jun 23:53 next collapse

Part of it is about how close you are to the target FPS. They likely made the old N64 games to run somewhere around 24 FPS, since that was an extremely common “frame rate” for the CRT TVs of the time. Therefore, the animations of, well, basically everything that moves in the game could be tuned to that frame rate. It would probably look like jank crap if they made the animations have 120 frames for 1 second of animation, but they didn’t.

On to Fallout 4… Oh boy. Bethesda jank. Creation Engine game speed is tied to frame rate. They had several problems at the launch of Fallout 76 because if you had a really powerful computer and unlocked your frame rate, you would be moving 2-3 times faster than you should have been. It’s a funny little thing to do in a single-player game, but absolutely devastating in a multiplayer game. So if your machine is chugging a bit and the frame rate slows down, it isn’t just the rate at which new images appear that slows down, it’s the speed at which the entire game does stuff. It feels bad.
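A toy illustration of the difference between frame-tied game speed and delta-time game speed (generic Python, not actual Creation Engine code):

```python
UNITS_PER_SECOND = 5.0

def advance_frame_tied(position, units_per_frame=UNITS_PER_SECOND / 60):
    """Fixed step per frame: at 120 fps you cover twice the intended distance per second."""
    return position + units_per_frame

def advance_delta_time(position, dt_seconds):
    """Scale movement by elapsed time: speed stays the same whatever the frame rate does."""
    return position + UNITS_PER_SECOND * dt_seconds

pos = 0.0
for _ in range(120):                       # one simulated second at 120 fps
    pos = advance_frame_tied(pos)
print(pos)                                 # ~10.0 -> double the intended speed

pos = 0.0
for _ in range(120):
    pos = advance_delta_time(pos, 1 / 120)
print(pos)                                 # ~5.0 -> the intended speed
```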

And also, as others have said, frame time, dropped frames, and how stable the frame rate is all make a huge difference in how it “feels”.

LunarLoony@lemmy.sdf.org on 07 Jun 17:24 collapse

I have never come across a CRT whose native “frame rate” was 24

VinesNFluff@pawb.social on 07 Jun 21:34 next collapse

Yeah it’s actually around the 30s (or 60s, depending on whether you consider interlaced frames to be ‘true’ or just ‘halves’)

A CRT television refreshes 60 times per second (a rate originally chosen to match the 60Hz mains frequency), but on each pass it only draws half of the image’s lines. “60i”, as they call it.

So you can say it’s 60 interlaced frames a second, which is about comparable to 30 progressive frames.

Crozekiel@lemmy.zip on 09 Jun 16:27 collapse

That’s true, but I didn’t say that was the native “frame rate” of the TVs, just a very commonly used frame rate at the time. And, at least in my experience, desync’ed frame rate and refresh rate is less of a problem on CRTs than LCDs - you don’t “feel” it as much generally.

Baggie@lemmy.zip on 07 Jun 01:45 next collapse

It depends on what you’re running, but often, if the frame rate is rock solid and consistent, it feels a lot less stuttery. Fallout games are not known for their stability and smooth running, unfortunately.

For comparison, Deltarune came out a few days ago, and that’s locked to 30 fps. Sure, it’s not a full 3D game or anything, but there’s a lot of complex motion in the battles and it’s not an issue at all. Compared to something like Bloodborne or the recent Zeldas, even after getting used to the frame rate they feel awful because they’re stuttering all the damn time.

FreedomAdvocate@lemmy.net.au on 07 Jun 01:57 next collapse

If you’re not playing on a VRR display, 45fps *will* be horrible and stuttery. 30fps locked would feel significantly better.

raltoid@lemmy.world on 07 Jun 05:58 next collapse

Stuttering, but mostly it’s the FPS changing.

Lock the FPS to below the lowest point where it lags, and suddenly it won’t feel as bad, since it’s consistent.


EDIT: I completely skipped over that you used Fallout 4 as an example. That engine tied game speed and physics to fps last time I played. So unless you mod the game, things will literally move “slower” as the fps drops.

overload@sopuli.xyz on 07 Jun 06:30 next collapse

I think player expectations play a big role here. It’s because you grew up with 20fps on Ocarina of Time that you accept how it looks.

I’m pretty sure that game is not a locked 20 FPS and can jump around a bit between 15-20, so the argument that it is locked 20 and so feels smooth doesn’t really convince me.

If that game came out today as an indie game it would be getting trashed for its performance.

VinesNFluff@pawb.social on 07 Jun 21:32 collapse

Funny

I played a lot of Lunistice some time back. It’s a retro 3D platformer that has an option to cap the framerate at 20 for a “more authentic retro feel”. Fun lil’ game, even if I eventually uncapped the framerate because it’s also a high-speed and precision platformer and doing that at 20FPS is dizzying.

And yes absolutely Zelda 64 chokes on its 20 frames from time to time. I played it enough (again, yearly tradition, which started when I first finished the duology in the mid-aughts) to know that.

But it wouldn’t change the fact that its absolute maximum is 20 and it still doesn’t feel bad to play.

overload@sopuli.xyz on 08 Jun 00:35 collapse

Haha that’s an interesting 20fps cap option.

I want to give an example of Final Fantasy VII for the PS1. The battles in that game have very low frame rate, about 18 FPS. I modded the game on steam a couple of years ago and unlocked the frame rate, so it was running at 60fps.

I remember it was transformative to the point where it was unsettling to look at, because I had become so accustomed to 18 FPS for that game.

Absolutely, after a few battles I preferred it, but it did strike me that some aspect of the game’s identity was tied to that low FPS. Nostalgia is a powerful thing for me.

nanook@friendica.eskimo.com on 07 Jun 07:45 next collapse
@VinesNFluff Most people can't honestly perceive any change in their visual field in less than 1/60th of a second except perhaps at the very periphery (for some reason rods are faster than cones and there are more rods in your peripheral vision) and even then not in much detail. So honestly, frame rates above 60 fps don't really buy you anything except bragging rights.
FlembleFabber@sh.itjust.works on 07 Jun 08:09 next collapse

This is such bullshit, even 60 vs 90 fps is such a noticeable difference. Maybe 144 vs 165 is hard to distinguish though…

nanook@friendica.eskimo.com on 07 Jun 08:46 collapse
@FlembleFabber Do you have LED lights in your house? Can you see 60Hz flicker?
binom@lemmy.world on 07 Jun 09:19 collapse

led lights almost always run off rectified, smoothed supplies, so no flicker. i have one cheap old crappy bulb that doesn’t, and i can definitely perceive the flicker. it’s almost nauseating.

nanook@friendica.eskimo.com on 07 Jun 09:28 collapse
@binom If you film with a camera with an NTSC vertical reference rate of 59.94 Hz, you will see a beat note from the LED lighting, indicating it is not well filtered, if at all. If you have a newer HiDef camera, most of them work at a 24Hz refresh rate; that IS a slow enough rate that you see jitter in the movement, and they will also show a beat note when recording under most LED lights. Many cheap LED lights just have a capacitive current limiter and that's it. If you power them off 50Hz you will see the flicker, and if you get dimmable LED lights they will NOT have a filter. But I don't want to interfere with anyone's bragging rights.
VinesNFluff@pawb.social on 07 Jun 21:29 collapse

I wouldn’t know. Could never afford a posh high-refresh-rate monitor, and it’s very low on my priority list, so even if/when I do get a windfall of money, it probably won’t be what I buy next.

Yermaw@lemm.ee on 07 Jun 08:15 next collapse

The display being at a higher resolution doesn’t help either. Running retro games on my fancy flatscreen hi-def massive TV makes them look and feel so much worse than on the smaller, fuzzy CRT screens of the time.

I can’t stand modern games with lower frame rates. I had to give up on Avowed and a few other late titles on the Series S because it makes me feel sick when turning the camera. I assume most of the later titles on Xbox will be doing this as they’re starting to push what the systems are capable of, and the Series S can’t really cope as well.

Toes@ani.social on 07 Jun 13:47 collapse

Are you using an OLED screen?

I had to tinker with mine a fair bit before my PS1 looked good on it.

Ironfacebuster@lemmy.world on 07 Jun 12:14 next collapse

I think it has something to do with frame times.

I’m not an expert but I feel like I remember hearing that low framerate high frametime feels worse than low framerate low frametime. Something to do with the delay it takes to actually display the low framerate?

As some of the other comments mentioned, it’s probably also the framerate dropping in general too.

AdrianTheFrog@lemmy.world on 07 Jun 14:23 next collapse

Game design is a big part of this too. Particularly first person or other fine camera control feels very bad when mouse movement is lagging.

I agree with what the other commenters are saying too, if it feels awful at 45 fps your 0.1% low frame rate is probably like 10 fps

dukeofdummies@lemmy.world on 07 Jun 16:55 next collapse

Well a good chunk of it is that older games only had ONE way they were played. ONE framerate, ONE resolution. They optimized for that.

Nowadays they plan for 60 or 120, and if you have less, too bad. Upgrade for better results.

dontbelievethis@sh.itjust.works on 07 Jun 18:23 collapse

Probably consistency.

Zelda was 20 fps, but it was 20 fps all the time so your brain adjusted to it. You could get lost in the world and story and forget you were playing a game.

Modern games fluctuate so much that it pulls you right out.