YouTube does not encode on the fly; it encodes once, during or shortly after upload. Every uploaded video is transcoded into each of the video profiles they support, starting with the lowest one. That's why you often only get the 240p stream at first when you watch a recently uploaded video. Once this is done, the player simply switches between the different 'versions' depending on your current bandwidth.
Encoding quality and efficiency are in this case limited only by available memory and CPU, and because the complete video is available up front, the encoder can do all kinds of nifty predictive encoding and variable-bitrate shenanigans with I-, P- and B-frames to give the user the best quality per bit. You also don't have to encode in 'real time', because each video only needs to be encoded once per supported profile. This is the reason why you can stream a 3840x2160 10-bit HDR video at 60 frames per second and still get a 'reasonable' bandwidth requirement (still several Mbit/s). Even live video streaming services add an encoding delay of several seconds so that their encoders can work more efficiently and use predictive 'look-ahead' encoding options.
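The player-side switching is conceptually simple: pick the highest pre-encoded profile that fits the measured bandwidth. A minimal sketch, where the rendition ladder and bitrates are made up for illustration and are not YouTube's actual values:

```python
# Illustrative rendition ladder: (label, approx. average bitrate in Mbit/s).
# The numbers are assumptions for the sake of the example.
RENDITIONS = [
    ("240p", 0.4),
    ("480p", 1.0),
    ("720p", 2.5),
    ("1080p", 5.0),
    ("2160p", 20.0),
]

def pick_rendition(measured_mbps: float) -> str:
    """Return the highest pre-encoded profile that fits the current bandwidth."""
    best = RENDITIONS[0][0]  # always fall back to the lowest profile
    for label, mbps in RENDITIONS:
        if mbps <= measured_mbps:
            best = label
    return best

print(pick_rendition(0.3))  # "240p" - below everything, take the floor
print(pick_rendition(6.0))  # "1080p"
```

Since every profile is already sitting on disk, "switching" costs the server nothing - the hard work was all done once, at upload time.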
Also, since everyone who watches a given video gets the same data, YouTube could theoretically use multicast to conserve even more bandwidth.
Even with all of those nifty features in place, a 1080p YouTube video requires about 3 GB per hour, and a 4K (2160p) video uses up to 12 GB per hour. That is already practically impossible for mobile users given data caps and overage fees, and it would still be a problem for many wired internet connections because of data caps and network congestion.
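Those GB-per-hour figures map directly to an average bitrate. The conversion is plain arithmetic (no YouTube-specific numbers involved):

```python
def mbps_to_gb_per_hour(mbps: float) -> float:
    """Average bitrate in Mbit/s -> data volume in GB (10^9 bytes) per hour."""
    return mbps * 1e6 * 3600 / 8 / 1e9

def gb_per_hour_to_mbps(gbh: float) -> float:
    """Inverse: GB per hour -> average bitrate in Mbit/s."""
    return gbh * 1e9 * 8 / 3600 / 1e6

print(round(gb_per_hour_to_mbps(3), 1))   # 6.7  -> ~6.7 Mbit/s sustained for 1080p
print(round(gb_per_hour_to_mbps(12), 1))  # 26.7 -> ~26.7 Mbit/s sustained for 4K
```

So "3 GB/h" really means a sustained ~6.7 Mbit/s, and "12 GB/h" a sustained ~27 Mbit/s - for hours at a time, which is exactly what caps and congestion punish.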
If you have to do low-latency, real-time encoding of video, you lose almost all of the features of modern codecs that enable high compression ratios while still retaining good image quality, because those features need to process upcoming frames in order to compress the current frame more efficiently. With low-latency real-time encoding you can't do that - buffering and looking ahead is exactly what causes latency in the first place. So you either end up with a video stream that retains the image quality but has a much lower compression ratio, or one that retains the compression ratio but has much worse image quality. There's also the hard upper limit of about 16.7 ms per frame at 60 Hz to consider: it bounds the amount of work the encoder can reasonably do before it has to put out a frame, and the amount of work needed per frame depends on the actual content - it is not constant. Real-time codecs are also much more susceptible to noise and to the type of video content.
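That per-frame budget is just the reciprocal of the frame rate - a trivial but useful helper:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to produce one encoded frame at a given frame rate."""
    return 1000.0 / fps

print(round(frame_budget_ms(60), 1))  # 16.7
print(round(frame_budget_ms(30), 1))  # 33.3
```

At 60 Hz the encoder has under 17 ms per frame for *everything* - motion estimation, rate control, entropy coding - no matter how busy the frame is.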
This is the main reason why Twitch usually has a stream delay of up to 20 seconds, and also why most digital broadcasting adds 3 - 5 seconds of delay compared to plain old analogue TV. I'd reckon that a significant portion of the 160 ms to 200 ms of latency is actually encoder delay, so that they can do at least some video compression.
You can check how this affects your IQ (image quality) yourself: just watch Twitch when a streamer plays a game with lots of red (codecs perform worst on content containing lots of red, likely because of chroma subsampling) or where a lot of the on-screen content changes quickly. The encoder completely shits itself because the encoding time per frame increases with the rate of change between frames, while the output still needs to come out in 'real time' at the selected bitrate. As the actual encoding time per frame approaches the limit (~16.7 ms at 60 Hz), the codec can either reduce the compression ratio to keep up or reduce the amount of per-frame processing while retaining the compression ratio (worse image quality).
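Those two escape hatches - do less processing, or spend more bits - can be sketched as a hypothetical per-frame control loop. The `preset` and bitrate knobs below are illustrative stand-ins for real encoder options, not any actual encoder's API:

```python
FRAME_BUDGET_MS = 1000.0 / 60  # ~16.7 ms per frame at 60 Hz

def adjust(encode_time_ms: float, preset: int, bitrate_kbps: int,
           keep_bitrate: bool = True) -> tuple[int, int]:
    """Return (preset, bitrate) for the next frame.

    preset: higher = more per-frame processing = better quality per bit
            (an illustrative knob, loosely analogous to encoder speed presets).
    """
    if encode_time_ms < 0.9 * FRAME_BUDGET_MS:
        return preset, bitrate_kbps  # comfortably within budget, change nothing
    if keep_bitrate:
        # Option 1: drop to a faster preset -> same bitrate, worse image quality.
        return max(0, preset - 1), bitrate_kbps
    # Option 2: keep the preset but spend more bits -> same quality, worse compression.
    return preset, int(bitrate_kbps * 1.25)

print(adjust(10.0, 5, 6000))                      # (5, 6000) - within budget
print(adjust(16.0, 5, 6000))                      # (4, 6000) - quality sacrificed
print(adjust(16.0, 5, 6000, keep_bitrate=False))  # (5, 7500) - bitrate sacrificed
```

Real encoders make this decision continuously and per scene, but the trade-off space is the same: once the frame budget is exhausted, something has to give.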
Google certainly knows how to run a data center and can handle the bandwidth required for video streaming, but they can't magically remove all of the issues that come with 'unpredictable' content that changes heavily from frame to frame and has to be streamed in real time with low latency.
If they target the same image quality as YouTube video at the same resolution, the required bitrate will be significantly higher (50% or more); if they target the same bitrate as YouTube video, they'll end up with significantly worse image quality. Or latency increases to the point where actually playing a game becomes impossible.