Calm down first if you feel that the title above is nonsense and that how good someone is depends only on the player's skill and reflexes. According to research reported by PC Gamer and conducted by Nvidia, players who use gaming monitors tend to have a better K/D (Kill/Death) ratio than those who use an ordinary PC monitor. This also suggests that the higher framerates and refresh rates a gaming monitor carries do have an influence on the player.
This data was collected through Nvidia GeForce Experience, which gathered millions of samples from gamers around the world and analyzed them to see how the gaming hardware in use tends to influence player performance. Nvidia focused its analysis on the two busiest battle royale games, PUBG and Fortnite, and may extend it to other games later.
So how was it analyzed? As stated earlier, they looked at players' K/D ratios and matched them against the number of hours played per week, then broke the results down by graphics hardware and monitor refresh rate. They also limited the analysis to gamers playing at 1080p monitor resolution. Keep in mind that this data was not published by a third party but by Nvidia itself, so there may be a tendency for the info to be used to show off the superiority of their products.
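To make that breakdown concrete, here is a minimal sketch of the kind of grouping described above; the column names and numbers are invented for illustration, not Nvidia's actual data:

```python
import pandas as pd

# Hypothetical sample data in the shape the analysis implies:
# one row per player, with K/D, weekly hours, GPU, and refresh rate.
df = pd.DataFrame({
    "kd_ratio":       [0.9, 1.1, 1.4, 1.0, 1.6, 1.8],
    "hours_per_week": [5, 10, 20, 12, 25, 30],
    "gpu":            ["GTX 1050", "GTX 1060", "RTX 2070",
                       "GTX 1050 Ti", "RTX 2060", "RTX 2080"],
    "refresh_hz":     [60, 60, 144, 60, 144, 240],
})

# Median K/D per refresh-rate bucket, the breakdown described above.
print(df.groupby("refresh_hz")["kd_ratio"].median())
```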
The data shows players' K/D ratios rising along with the hardware they use. The range runs from the GTX 1050/Ti series, probably the most mainstream cards in use, up to the RTX 20xx series. A clearer difference shows up in the refresh rate of the monitor used: while the increase is not significant on a 60 Hz monitor, there is a big jump of around 44%-51% with a 144 Hz monitor, and an even higher one with a 240 Hz monitor.
It should be understood that the monitor is only one contributing factor here; it is no guarantee that buying an expensive gaming monitor will instantly make you good. Still, there is a logic that can be drawn from this data: people can tell 30 FPS from 60 FPS by how smoothly the game runs, but distinguishing higher framerates like 120 or even 240 FPS is much harder. Even so, you can still feel the difference, especially when it is backed by a monitor with an equally high refresh rate.
The explanation comes down to input processing speed and how quickly the monitor can display the result. Simply put, there is no delay in the input itself when you move, aim, and shoot with the keyboard and mouse. The difference appears once that input is processed by the game engine and your PC, especially the graphics card, which must render the result on screen; how soon you actually see it then also depends on how fast your monitor can display it.
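To put rough numbers on that last step, here is a small back-of-the-envelope sketch; the figures are simple arithmetic on refresh rates, not Nvidia's measurements:

```python
# Worst-case time a finished frame can wait for the next monitor refresh.
# This ignores engine, driver, and pixel-response latency; it only shows
# how the refresh interval itself shrinks at higher refresh rates.
for hz in (60, 144, 240):
    refresh_interval_ms = 1000 / hz
    print(f"{hz:>3} Hz monitor: up to {refresh_interval_ms:.1f} ms per refresh")
```

At 60 Hz that interval is about 16.7 ms, at 144 Hz about 6.9 ms, and at 240 Hz about 4.2 ms, which is exactly the "minimum delay" advantage the conclusion below refers to.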
So the conclusion here is that gamers with high-spec PCs and high-refresh-rate gaming monitors enjoy the advantage of minimal delay in what is displayed in front of them, compared to players with mediocre specs. So when you feel that you struggle to get kills and often die in shooter games or elsewhere, maybe it's not because you are a noob, you're just being held back by your PC and potato screen.