The more frames your monitor displays per second, the more frames go by during the fixed amount of time it takes to send a packet. A higher frame rate can let you react faster as a human, but the computer can't get that information to the server any faster, because your packet is sitting in a queue waiting for some other packet to be sent... so you'll just find things frustrating: you'll see, and know, what should have happened, but the server won't know what did happen, because the information won't reach it in time.
The issue is this... suppose there is nothing waiting to be sent except one full-size 1500-byte packet from some upload process on your network. That packet goes to the modem, and on a slow DSL upload it physically takes 27.3 milliseconds to modulate it onto the telephone wire as an electrical signal (that figure works out to an upload rate of roughly 440 kbps)...
Now, immediately after that 1500-byte packet is handed to the modem, you click your mouse to fire in your game... The router sees this small, maybe 200-byte packet to be sent, but the modem is still busy for the next ~27 ms... so the packet waits ~27 ms, and then takes another 3.6 ms of its own to be put onto the wire...
So, if a packet started being sent just before your click, things could take up to 27.3 + 3.6 ≈ 31 ms to get out onto the internet... and if the modem was idle, it could take just 3.6 ms...
In other words, at a minimum you should expect your transmission time to vary between about 3.6 and 31 ms depending on traffic...
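Here's a rough sketch of that serialization math in Python — the ~440 kbps upload rate is just an assumption chosen to reproduce the 27.3 ms figure, so swap in your own numbers:

```python
# Serialization delay: time to clock a packet's bits onto the wire.
# The 440 kbps upload rate is an assumption chosen to match the ~27.3 ms figure.
UPLOAD_BPS = 440_000  # assumed slow DSL upload rate, bits per second

def serialization_ms(size_bytes: int, rate_bps: float = UPLOAD_BPS) -> float:
    """Milliseconds to put size_bytes onto a link running at rate_bps."""
    return size_bytes * 8 / rate_bps * 1000

full = serialization_ms(1500)   # ~27.3 ms for a full-size packet
game = serialization_ms(200)    # ~3.6 ms for a small game packet
print(f"1500-byte packet: {full:.1f} ms")
print(f" 200-byte packet: {game:.1f} ms")
print(f"best case (idle line):         {game:.1f} ms")
print(f"worst case (one packet ahead): {full + game:.1f} ms")
```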
Now, if you don't have any prioritization for your game packets, then perhaps 2 or 3 packets could be sitting in the queue... so you might experience variability between 3.6 and, say, 3.6 + 3 × 27.3 ≈ 85.5 ms.
So, rounding off, information about your action leaves your network somewhere between 3 and 90 ms after you take it.
Now, if you have a 144 Hz display, that's 144 images per second, and 90 ms is 0.09 seconds... multiply those together and you get about 13 images... so the information leaves your network somewhere between, say, 1 and 13 displayed frames after the action actually happened...
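To make the jitter concrete, here's a quick sketch that sweeps the queue depth — the ~440 kbps upload, the 144 Hz display, and the 0–3 queued packets are just the example numbers from above:

```python
# How many displayed frames old is your click by the time it leaves your network,
# depending on how many full-size packets are queued ahead of it?
# Same assumed ~440 kbps upload as above, 144 Hz display, 0-3 queued packets.
UPLOAD_BPS = 440_000
REFRESH_HZ = 144

def serialization_s(size_bytes, rate_bps=UPLOAD_BPS):
    return size_bytes * 8 / rate_bps

for queued in range(4):  # 0..3 full-size packets ahead of your game packet
    delay_s = serialization_s(200) + queued * serialization_s(1500)
    frames = delay_s * REFRESH_HZ
    print(f"{queued} packets ahead: {delay_s * 1000:5.1f} ms ≈ {frames:4.1f} frames")
```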
When the lag is consistent, the server has an easier time compensating for it... but when the lag varies by this much, the connection between what you do and what the server decides actually happened will feel "random"...
The very best you can get is to never have any queue ahead of your game packets... even then, a full-size packet that's already mid-transmission can't be interrupted, so you'll experience delays between 0 and about 0.027 × 144 ≈ 4 displayed frames... that won't be quite so random, but it'll still be noticeable.
Compare the results if you have a 100 Mbps upload... one 1500-byte packet takes 1500 × 8 / 100,000,000 = 0.00012 s, i.e. 0.12 ms, to send... so even if one of those is ahead of it, your 200-byte game packet leaves your network only about 0.00012 × 144 ≈ 0.017 frames later... which is basically zero. So with a 100 Mbps upload you experience basically no variability in your transit time caused by your own network (you will still experience issues caused by global/ISP conditions).
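Same sketch again with faster uplinks plugged in — worst case here is one full-size packet ahead of your game packet, and the 100 Mbps and gigabit rates are just the comparison points from this discussion:

```python
# Worst-case "frames behind" when one full-size packet is ahead of your game packet,
# for the upload rates discussed here (the ~440 kbps DSL figure is still an assumption).
REFRESH_HZ = 144
RATES_BPS = {
    "~440 kbps DSL": 440_000,
    "100 Mbps":      100_000_000,
    "1 Gbps fiber":  1_000_000_000,
}

def serialization_s(size_bytes, rate_bps):
    return size_bytes * 8 / rate_bps

for name, rate in RATES_BPS.items():
    worst_s = serialization_s(1500, rate) + serialization_s(200, rate)
    print(f"{name:>14}: {worst_s * 1000:7.3f} ms ≈ {worst_s * REFRESH_HZ:6.3f} frames behind")
```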
A gamer with a gigabit fiber connection and decent QoS would experience essentially zero timing variability even if their whole household were torrenting and streaming all at once, whereas a person in your situation sees roughly 4 displayed frames of extra delay whenever there's any other traffic at all, and maybe a full second of delay with serious, uncontrolled bufferbloat.