I've been using *nix operating systems since 1992 and trusty old X11 has been a consistent friend throughout all of my adventures in it.
In case you're wondering what the hell I'm on about, Unix follows a slightly different path from Windows when it comes to drawing windows on the screen. Unix was designed with remote computing in mind... You may have a desktop machine sitting in your lab, but the beefy machine is four floors away, or even in a different country. So when X11 was designed, the "X Server" was the thing running on *your* machine: when you want to run something, you log into wherever the best machine is and "export your display", so the remote machine connects back to your local X server and draws its windows there. You could be sitting at a machine in Edinburgh, logged into a machine in Helsinki, and run "xv" (say, an image display app) on the machine in Helsinki, but the window appears on your desktop in Edinburgh. You can interact with it, click it, resize it, all completely normally, as if it were running locally. Brilliant!
However, you need other things to make this work... You need a window manager (and, these days, a compositor) to arrange and display multiple windows onscreen at once. Imagine being logged into three computers a long way away, all three beaming their windows to your machine, and you're showing all three side-by-side. With X11, this is entirely possible!
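If you've never seen this trick, here's a rough sketch of what the workflow looks like, scripted in Python purely for illustration. The host name "helsinki", the user name and the image file are all made up, and I'm assuming ssh access with X11 forwarding enabled on both ends:

```python
# A minimal sketch of the "export your display" workflow described above.
# Host, user and file names are hypothetical - substitute your own.
import subprocess

# "ssh -X" tunnels X11 traffic back over the ssh connection, so a program
# started on the remote machine talks to the X server on *this* desktop.
subprocess.run(["ssh", "-X", "user@helsinki", "xv", "photo.jpg"])

# The old-school, untunnelled way: tell the remote program where your local
# X server lives ("edinburgh:0" is a made-up hostname here). This only works
# if the local server accepts the connection, e.g. via xhost/xauth.
subprocess.run(["ssh", "user@helsinki", "env", "DISPLAY=edinburgh:0", "xv", "photo.jpg"])
```

The first call is the modern, tunnelled approach; the second is the classic nineties version, which is exactly the setup the paragraph above describes.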
Back in the 1990s, when computers still followed the mainframe/terminal, server/client model, this seemed like a brilliant idea (and it was)... Fast forward to 2025 and we all have the equivalent of a Cray supercomputer sitting in our graphics card, and that's not even mentioning the powerful CPU that goes with it.
So, the remote-display functionality of X11 is becoming less and less useful, especially with the rise of protocols like "RDP" and "VNC" where you can just log into a windowing system remotely and see the entire desktop. It's not as useful as X11 when you just want to see a single app, but it does work.
The problem with X11 is that it comes with a lot of overhead to manage the server/client split. When you're running a graphically intensive application and it has to beam all that information to a server to be composited and displayed, there's some unnecessary memory copying... So, many graphics drivers attempt to overcome this by finding other ways to get pixels onto the screen, bypassing parts of that pipeline. Unfortunately, this comes at the price of a wobbly interaction with the poor window compositor, which is desperately trying to keep up with drivers doing unusual things in order to get good frame rates...
So, onto the scene stepped Wayland. I say "onto the scene"... the need for a more streamlined compositing system was recognised about a decade ago, and poor X11's cards have been marked ever since. But, as with all things consumer-related, Nvidia have been shockingly slow to realise it. Their driver performance on X11 was bad, and their support for Wayland in its early days was downright abysmal. Things would hang, multiple monitors wouldn't work, fonts would appear corrupted, and the GPU would suddenly hang the PCI bus for no adequately explained reason (or that's what the errors seemed to suggest, I'm no hardware expert).
But with the (extremely buggy) 580 series and the new 590 drivers, Nvidia seem to have started taking Wayland seriously. Many of the bugs seem to have been squashed, and performance is much better. Windows don't jerk around and show undrawn sections, and (touch wood) the desktop hasn't crashed all day.
In addition, games running under Proton seem to be far more resistant to crashing when another window steals focus (like a friend sending a chat popup in Steam).
The jury's still out on whether Nvidia have made a "stable enough" driver - but so far, my experience is that they seem to have made BIG strides, especially on my Optimus laptop, which was, frankly, an unstable mess with the Nvidia proprietary drivers.
So, it looks like the day of Wayland may be about to arrive. With Ubuntu shifting over to it as the default and dropping the ancient X11 system, we may see a future where Linux really does start to accelerate away from Windows in the FPS race. In my testing it's already about 5% faster than Windows, even with X11... But having tried a few games under Wayland, it looks like there may be yet another leap, perhaps 15% to 20% in some cases.
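If you want to try the same comparison yourself, the first thing to check is which protocol your session is actually using. A tiny sketch, just reading the standard session environment variables (nothing distro- or Nvidia-specific):

```python
# Check whether the current desktop session is Wayland or X11 before
# comparing frame rates between the two.
import os

session = os.environ.get("XDG_SESSION_TYPE", "unknown")   # usually "wayland" or "x11"
wayland_display = os.environ.get("WAYLAND_DISPLAY")       # set under a Wayland session
x11_display = os.environ.get("DISPLAY")                   # set under X11 (and under XWayland too)

print(f"Session type: {session}")
if wayland_display:
    print(f"Wayland socket: {wayland_display}")
if x11_display:
    print(f"X display: {x11_display} (XWayland if the session type is wayland)")
```

Note that DISPLAY being set doesn't mean you're on X11 - XWayland keeps it around for legacy apps - which is why XDG_SESSION_TYPE is the one to trust.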
Sadly, Nvidia are still the village dunces when it comes to DX12 support. That still sucks ass... So, my verdict only covers DX11 titles. But I'm hoping that Nvidia can fess up to what's broken in their drivers and help the Vulkan guys out, because right now Team Red (AMD) is looking like the much better choice on Linux for more modern gaming.