144Hz with less than 144fps?
I've been curious for some time: for those of you with a 144Hz monitor, when your fps drops below 144, does it feel laggy, like when you're on a 60Hz monitor and the frames drop below 60?
- Anonymous · 4 days ago
I don't think anyone on here has a 144Hz monitor. My daughter has a 1440p 144Hz monitor, and I can see the difference it makes when the screen is in motion. So I can see why a 144Hz monitor is worth buying: it takes out that motion blur when things are moving on the screen.
Whether a drop feels laggy would depend on what scene the game is in when the framerate drops. You might notice it when the screen is in motion or when the action is happening. The action won't be as fluid when the framerate drops to 90fps or something like that. About the most you can do to mitigate this is to have a fast CPU.
- Anonymous · 1 month ago
If the monitor has Adaptive Sync, FreeSync, or G-Sync (closely related variable-refresh technologies), then it can adjust its refresh rate within a range, typically somewhere around 30 Hz to 144 Hz. So if the graphics card can only manage 57fps, for example, it signals the monitor to adjust to a 57 Hz refresh rate, or possibly some lower step below the target fps, like 50 or 55 Hz.
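The range-matching idea above can be sketched in a few lines of Python. This is only an illustration of the clamping behavior; the 30–144 window and the function name are made up for the example, not any vendor's actual API.

```python
# Toy sketch of a variable-refresh (Adaptive Sync / FreeSync / G-Sync) monitor
# matching its refresh rate to the GPU's frame rate. The 30-144 Hz window is
# an illustrative example range, not a spec.
def vrr_refresh(fps, low=30, high=144):
    """Clamp the GPU's frame rate into the monitor's supported VRR window."""
    return max(low, min(high, fps))

print(vrr_refresh(57))   # 57  -> monitor refreshes at 57 Hz to match
print(vrr_refresh(200))  # 144 -> capped at the panel's maximum
print(vrr_refresh(20))   # 30  -> below the window; can't go lower
```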
- A.J. · Lv 7 · 1 month ago
I understand how the hardware works and what the specifications mean, but I'm not a gamer at all.
The Hz number of a monitor is a refresh rate in cycles per second. Gaming, and video, is a series of images. Gaming fps (frames per second) is how fast the computer sends images to the monitor. It's like a machine gun firing bullets: the monitor catches each bullet and shows it. If no new bullet arrives, it keeps the one it has.
60Hz is 60 refresh cycles every second. Each cycle, it clears the screen and puts up the next image if there is one, or keeps the same image if none is ready. 144Hz does it 144 times a second.
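The keep-the-last-image behavior can be checked with a toy timeline. This is just a sketch assuming perfectly even frame timing, not how a monitor actually works internally:

```python
# Toy model: a 60 Hz monitor receiving 40 fps for one second. Each refresh
# shows the newest completed frame; if none is new, it repeats the last one.
refresh_hz, fps = 60, 40
shown = []
for r in range(refresh_hz):        # one second of refresh cycles
    t = r / refresh_hz             # time at this refresh
    shown.append(int(t * fps))     # index of the latest finished frame
print(len(set(shown)))             # unique frames displayed: 40
```

So at 40fps, some frames are simply shown twice in a row, which is why the motion looks the same on any monitor fast enough to keep up.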
The graphics card (or integrated graphics processor) creates the image, and the CPU handles the overall instructions. There are other factors, but these are the top two in computer gaming ability. The graphics card has a graphics RAM buffer holding the image(s).
The process in gaming is the person (the user, the gamer) controlling what the images should do, seeing the display, and reacting. Human to controller, controller to computer, computer processes and sends to monitor, monitor processes and displays.
There is the monitor's pre-processing time, the refresh rate, and the GTG (grey-to-grey), which is the pixel color change time in milliseconds. These are all tiny delays, but we can detect the time differences. A monitor with a high refresh rate is usually also designed for fast pre-processing and fast GTG. It all adds up to total input lag, and there are databases that measure it.
The specifications tell you the GTG and Hz, and they also list brightness, measures of black, backlight cd/m², and more, but they rarely state total input lag.
That's a lot of information, but I hope it makes sense.
Now, what does "laggy" mean? It's the time from controller input to the image change you asked for. It combines the computer's fps and the monitor's total input lag, and it also includes the gamer pushing buttons and levers and reacting to the screen.
If your game is outputting 40fps, a 144Hz monitor with faster pre-processing might help a tiny bit, but the displayed image still changes at 40fps, exactly the same as on a 60Hz monitor. It can't fix that laggy situation.
However, if a game can output 90fps, a 60Hz monitor throws away 30 of those frames every second. A 144Hz monitor gives a smoother image flow, displaying all 90.
Your game can send up to 144 images a second and the monitor uses them all.
The monitor will give as smooth a flow as you can send it, compared to the 60Hz one.
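The 90fps arithmetic above can be verified with the same kind of toy model, a sketch that assumes perfectly evenly timed frames:

```python
# Count how many of the GPU's frames each monitor actually displays in one
# second, assuming evenly spaced frames (a simplification for illustration).
def frames_shown(refresh_hz, fps):
    """Unique frame indices displayed across one second of refreshes."""
    return len({int((r / refresh_hz) * fps) for r in range(refresh_hz)})

print(frames_shown(60, 90))    # 60  -> the 60 Hz panel discards ~30 frames
print(frames_shown(144, 90))   # 90  -> the 144 Hz panel shows all of them
```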
You have a different issue. If the frame rate is jumping between 140 and 70, on a 60Hz monitor you just cap it at 60. On a 144Hz monitor, you either let the frame rate vary or cap it. At that level it isn't lag so much as stability, and you may or may not notice the frame rate changing when it's that high.
So, the lagginess you refer to is improved by the 144Hz monitor, partly because its faster pre-processing saves a few milliseconds, and that counts. And if you're gaming at high fps, you have less lag from controller to display, and your images are as up-to-date as possible. A fast monitor can't fix a weak graphics card.
But a strong graphics card can be held back by a slow monitor.
If your computer can output 90+fps, capping it at 60 seems like a waste.
Or churning out 144fps and only showing 60 of them?
I hope I explained this well.
- Robert J · Lv 7 · 1 month ago
Your eyes/brain generally won't notice much difference until it drops below about 60Hz.
- SBR32277 · Lv 7 · 1 month ago
I don't have personal experience, but my understanding is that you want your graphics card in sync with your monitor so you don't see tearing. With a 144Hz monitor in sync, you can take full advantage of any frame rate up to 144fps: if your frame rate is only 100fps, you get the full benefit of that 100fps, whereas on a 60Hz monitor your benefit is capped at 60fps instead of the full 100fps. I doubt it would feel laggy.