It's not a myth, and it's not strictly a double-buffering thing; it's just often misunderstood. Input lag has nothing to do with the old "double buffering locks you to 30 fps" myth - that doesn't really happen anymore, because it's not 2005.
If your screen starts a refresh with vsync active and there's no new finished frame in the buffer(s), it refreshes the last finished frame again. Because vsync forces it to completely refresh each frame - it can't swap halfway through and tear - the next opportunity to show a new frame is the next refresh, so after two intervals of 1/60th of a second instead of one.
Vsync doesn't care how many frames you can put out over a whole second; it only cares whether the next frame is ready in a buffer when the refresh starts. If it is, you'll get a ton of input lag (depending on the refresh rate and the depth of the buffer queue) from the frames waiting to be shown; if it isn't, you'll get stutter as the last frame is refreshed twice and the next new one waits 1/30th of a second instead of 1/60th on a 60 Hz monitor.
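A minimal sketch of that timing (a toy model with assumed numbers, not a real swapchain): a finished frame can only be shown on a refresh boundary, so a frame that runs even 1 ms over the deadline slips a whole extra 1/60th of a second.

```python
import math

# Toy model of double-buffered vsync on a 60 Hz display (assumed timings):
# a finished frame can only appear on the next refresh boundary, so just
# missing a deadline costs a full extra interval.

REFRESH = 1 / 60  # seconds per refresh at 60 Hz

def present_times(render_times):
    """Vsync-aligned time at which each frame appears on screen."""
    shown = []
    t = 0.0  # when the current frame starts rendering
    for r in render_times:
        done = t + r
        # first refresh boundary at or after the frame is finished
        slot = math.ceil(done / REFRESH) * REFRESH
        shown.append(slot)
        t = slot  # the next frame starts after the buffer swap
    return shown

fast = present_times([0.016] * 3)  # just makes each deadline: a new frame every 1/60 s
slow = present_times([0.017] * 3)  # just misses it: a new frame only every 2/60 s
```

With 16 ms frames every refresh shows something new; with 17 ms frames every second refresh repeats, which is exactly the "1/30th instead of 1/60th" case above.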
Any static-refresh-rate vsync method runs into this problem; even an ideally set up triple-buffered system can't absorb frametime spikes and performance dips very well, even when they last only a fraction of a second. Deeper buffers mean more lag and more resiliency to spikes and dips, but they still fail hard and fail often if you're not very careful. It's a large part of why the standard approach was vsync off until you're on an adaptive-refresh monitor.
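A rough toy model of that lag/resiliency tradeoff (my assumptions: frames are shown strictly in order, and the renderer blocks when the queue is full and resumes at the swap). It counts refreshes where nothing new was ready, i.e. stutters:

```python
# Toy model of a fixed-depth frame queue on a 60 Hz display (assumptions:
# in-order presentation; the renderer blocks on a full queue and resumes at
# the swap). Counts refreshes that had to re-show the previous frame.

REFRESH = 1 / 60  # seconds per refresh at 60 Hz

def repeated_refreshes(render_times, depth):
    """Stuttered (repeated) refreshes for a queue holding `depth` frames."""
    pending = list(render_times)  # rendering cost of each upcoming frame
    queue = []                    # completion times of finished, unshown frames
    clock = 0.0                   # renderer's clock
    repeats = 0
    tick = REFRESH                # first refresh boundary
    while pending or queue:
        # the renderer works ahead until the queue is full
        while pending and len(queue) < depth:
            clock += pending.pop(0)
            queue.append(clock)
        if queue and queue[0] <= tick:
            queue.pop(0)              # swap: show the oldest finished frame
            clock = max(clock, tick)  # a blocked renderer resumes at the swap
        else:
            repeats += 1              # nothing ready: re-show the last frame
        tick += REFRESH
    return repeats

# 10 ms frames with one 40 ms spike: the deeper queue banks a frame ahead of
# time and absorbs part of the spike, but still can't hide it completely.
spiky = [0.010] * 3 + [0.040] + [0.010] * 6
double = repeated_refreshes(spiky, depth=1)  # double buffering: 1 queued frame
triple = repeated_refreshes(spiky, depth=2)  # triple buffering: 2 queued frames
```

In this model the triple-buffered run stutters less than the double-buffered one, but it still stutters on the same spike - and every extra queued frame is another refresh interval of input lag, since each queued frame sits a full refresh between finishing and being shown.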
Most of the rest of the reason is the added input lag, which is huge with 60 Hz vsync - and the methods that are only kinda trash instead of completely trash at dealing with variable performance add even more lag.
This is what happens to input lag when you maintain vsync with a buffer on a 60 Hz display:
https://www.blurbusters.com/wp-conte...limit-60Hz.png
Not fun.