Does FPS affect accuracy in Counter-Strike: Source?
It seems that people with high FPS can fire more accurately than people with low FPS. It is obvious that the crosshair recovers more slowly at low FPS than at high FPS - but does that affect accuracy?


What are we testing?

The effect of FPS on accuracy was tested by shooting at a wall with an M4 (without silencer) and recording the exact coordinates where the bullets hit, which was possible with Mattie's EventScripts. 90 bullets were fired at both 25 FPS and at over 100 FPS. One computer was a Sempron 3100+ (slow!) and the other an E6300 Core 2 Duo (very fast!). The expected result was that the 100 FPS player would shoot more accurately than the 25 FPS player. The server ran at 66 tickrate.
Once the 90-bullet series were recorded, the data was processed with Linux shell scripts, and gnuplot was used to graph the firing pattern. Two tests were done: in the first, all bullets were fired at the wall while standing still; in the second, the player started strafing right and fired all bullets while strafing. Firing and strafing were scripted so that a single keypress started the whole sequence.
The distance to the wall was about 20 meters.
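The analysis step can be sketched in Python (a hypothetical stand-in for the shell and gnuplot pipeline listed at the end of the article): given the recorded wall coordinates of each impact, compute the centroid of the group and the mean distance from it as a single spread number, which makes the two firing patterns directly comparable.

```python
# Hypothetical spread analysis of recorded bullet impacts.
# Each impact is an (x, z) coordinate on the wall plane, as logged by
# the EventScripts script below (function and variable names here are
# illustrative, not from the original test).

def spread(impacts):
    """Mean distance of the impacts from their centroid (a simple spread measure)."""
    n = len(impacts)
    cx = sum(x for x, _ in impacts) / n
    cz = sum(z for _, z in impacts) / n
    return sum(((x - cx) ** 2 + (z - cz) ** 2) ** 0.5 for x, z in impacts) / n

# Toy data: four impacts forming a square around (0, 0);
# each corner is sqrt(2) away from the centroid.
print(round(spread([(1, 1), (-1, 1), (1, -1), (-1, -1)]), 3))  # prints 1.414
```

Comparing this number for the 25 FPS and 100 FPS series would give the same "no difference" verdict numerically instead of by eyeballing the plots.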

Firing pattern when standing still (25 FPS vs. over 100 FPS)

Firing pattern when moving (25 FPS vs. over 100 FPS (even though it says 80 FPS in the picture, it’s above 100 FPS))
Just by looking at the pictures, there seems to be no difference between the 25 FPS and the 100 FPS player. Firing in sequences of 3 or 4 bullets at a time, as in a normal firefight, was also tested. There was no difference then either.


Conclusion

Conclusion: low FPS players should stop complaining! Real conclusion: because of the way data packets are sent to the server [Read: Rate settings in perspect of FPS], it is possible (and even likely) that high FPS players actually have an advantage over low FPS players - beyond being able to aim more easily thanks to smooth rendering. Both client and server do odd things when interpolating the data they receive from each other, so it is possible that the server, the client, or both send and expect different data than they actually get. Those mismatches can cause simulation errors on the server, and that gives the high FPS player an advantage. One such error might be that the high FPS player's data packets are handled first at the server, so his shots get registered before the low FPS player's.
For more information about how the client and server interpolate, please read Source Multiplayer Networking, Lag Compensation and Networking Entities. The easiest to read is the first one and the hardest is the last. Sending and handling data over a network is not easy! Tech wizards can continue to the article Latency Compensating Methods in Client/Server In-game Protocol Design and Optimization - not for mundane gamers!


Thoughts

As written on Ghost's blog about FPS settings affecting rates, it is possible that correct rate settings can help a player shoot more effectively. Even though the maximum updaterate is 66 or 100, it can be more beneficial for a player to set his rates according to his own computer's capabilities - that is, lower rates.
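As a purely illustrative example of that advice (the values are hypothetical, not recommendations), a player whose machine renders only around 25-30 FPS might lower his client rates toward what the machine can actually deliver:

```
// Illustrative Source console settings only - tune to your own hardware.
cl_cmdrate 30     // command packets sent to the server per second (can't exceed real FPS)
cl_updaterate 30  // world snapshots requested from the server per second
rate 20000        // maximum bytes per second accepted from the server
```

The idea is simply that asking for 66 or 100 updates a second is pointless when the client can only produce and consume a fraction of that.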
It would also have been possible to record the exact time when each bullet hit the wall. That way it could have been analyzed whether the 100 FPS player can fire faster bursts or empty a whole magazine faster. But there is a more technical view of this that explains why testing it is not easy. A player at 25 FPS sends only 25 command packets per second to the server, while a 100 FPS player can send all 66 expected packets. Because the 25 FPS player sends fewer packets, he may instead send larger ones - more commands in each data packet.
When the server finally receives this information, it does not process it all in the same instant. For example, if a data segment contains two commands such as "fire, move left", then first "fire" is executed and then "move left". In the same situation the 100 FPS player could send "fire" and "move left" in separate data segments. When the server receives data packets from both the 100 FPS and the 25 FPS player, it might miscalculate who shot first and where. Note: the server isn't actually that stupid - read more in the links given above!
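The bundling described above can be sketched as a toy calculation (purely illustrative - the real Source engine batches user commands differently and lag-compensates, as the linked articles explain): a client can send at most one input packet per rendered frame, so a slow client's packets each have to cover more server ticks' worth of input.

```python
# Toy illustration (not the real Source protocol): how many server ticks'
# worth of input each client packet must cover at a given frame rate.

TICKRATE = 66  # server simulation ticks per second, as in the test above

def ticks_per_packet(fps):
    """A client sends at most one input packet per rendered frame, so a
    low-FPS client bundles several ticks of commands into each packet."""
    packets_per_second = min(fps, TICKRATE)
    return TICKRATE / packets_per_second

print(ticks_per_packet(25))   # prints 2.64 - several commands bundled per packet
print(ticks_per_packet(100))  # prints 1.0 - one packet per server tick
```

This is why the server ends up executing a 25 FPS player's "fire" and "move left" back-to-back on arrival, while a 100 FPS player's commands arrive spread out in time.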


Scripts and stuff used to process the data

Mattie's EventScripts script to record bullet impacts
es_hitgroups.txt
Code:
 
event load {
    es_setinfo hitX 0
    es_setinfo hitY 0
    es_setinfo hitZ 0
    es_setinfo i 0
    es_keygroupdelete HitGroups
    es_keygroupcreate HitGroups
    est_RegSayHook !reset 0
}

event bullet_impact {
    es_setinfo hitX event_var(x)
    es_setinfo hitY event_var(y)
    es_setinfo hitZ event_var(z)
    es_keycreate HitGroups server_var(i)
    es_keysetvalue HitGroups server_var(i) x server_var(hitX)
    es_keysetvalue HitGroups server_var(i) y server_var(hitY)
    es_keysetvalue HitGroups server_var(i) z server_var(hitZ)
    es_keygroupsave HitGroups |hitgroups
    // Debug
    es_setinfo coord 0
    es_format coord "Bullet impact: %1 %2 %3" server_var(hitX) server_var(hitY) server_var(hitZ)
    es_msg server_var(coord)
    es_math i + 1
}

event est_sayhook {
    es_tell #green Reset!
    es_setinfo i 0
}
Linux shell-script to parse recorded bullet impact data
Code:
 
grep "\".*\"\|x\|y\|z" 25fps.txt | sed -e 's/^[^"]*//' -e 's/"//g' -e 's/[^0-9.-]//g' -e '/^$/d' | sed -e 'N;s/\n/ /;N;s/\n/ /;N;s/\n/ /;'
GnuPlot script to plot graphs
Code:
 
reset
set title "FPS shooting"
set terminal png color
set output "firing-pattern.png"
set multiplot
plot "25fps.txt" using 2:4 with points, "100fps.txt" using 2:4 with points
This was taken from: Setti :: Accuracy and FPS