 
            As far as I can tell, WineD3D only ever calls the wglQueryCurrentRenderer* functions and never the indexed ones, so it doesn't seem useful to implement the indexed renderer queries.
Yes, that’s true, and I originally wanted to implement only wglQueryCurrentRenderer*, but I wasn’t sure whether implementing just part of the extension’s functions was the right approach and whether it would be accepted, so I decided to implement it completely.
If this is acceptable, I can rework the MR; it would greatly simplify the code.
Also, one question: how can we establish that the current renderer is the default (device index 0) one? There doesn't seem to be any guarantee that this is the case.
I’m not sure there is any consensus on this. As far as I know, the NVIDIA driver always creates the default display from the zero-indexed device (https://github.com/NVIDIA/egl-gbm/blob/main/src/gbm-display.c#L206, https://github.com/NVIDIA/egl-gbm/blob/main/src/gbm-display.c#L139-L143). I’m not sure about Mesa, because their EGL code is quite convoluted, but my understanding is that they also treat the zero-indexed device as the default, and this is confirmed in practice (https://gitlab.freedesktop.org/mesa/mesa/-/blob/main/src/egl/main/egldevice....).
Anyway, it's a valid concern, so we can query the device backing the current display through `eglQueryDisplayAttribEXT` with no need to iterate over all devices.
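For reference, a minimal sketch of that lookup using the `EGL_EXT_device_query` extension; the helper name `get_display_device` is made up for illustration, and real code would first check that the extension is advertised:

```c
#include <EGL/egl.h>
#include <EGL/eglext.h>

/* Resolve the EGLDeviceEXT backing a display directly, instead of
 * enumerating all devices with eglQueryDevicesEXT and comparing.
 * Assumes EGL_EXT_device_query is supported; returns EGL_NO_DEVICE_EXT
 * on failure. */
static EGLDeviceEXT get_display_device(EGLDisplay dpy)
{
    PFNEGLQUERYDISPLAYATTRIBEXTPROC query_display_attrib =
        (PFNEGLQUERYDISPLAYATTRIBEXTPROC)eglGetProcAddress("eglQueryDisplayAttribEXT");
    EGLAttrib device = 0;

    if (!query_display_attrib ||
        !query_display_attrib(dpy, EGL_DEVICE_EXT, &device))
        return EGL_NO_DEVICE_EXT;

    return (EGLDeviceEXT)device;
}
```

The returned device could then be compared against the entry that the renderer-query code would otherwise have treated as the default, rather than assuming index 0.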