Signed-off-by: Joel Holdsworth <joel(a)airwebreathe.org.uk>
--
v2: ntdll: Add support for FILE_{RENAME,LINK}_POSIX_SEMANTICS.
ntdll: Factor out get_inode_open_sharing.
ntdll/test: Add tests for FILE_LINK_POSIX_SEMANTICS.
ntdll/test: Add tests for FILE_RENAME_POSIX_SEMANTICS.
https://gitlab.winehq.org/wine/wine/-/merge_requests/4457
Adds a tray icon implementation based on the org.kde.StatusNotifierItem interface. It allows the StatusNotifierWatcher object to be restarted, but falls back to XEMBED or the internal tray if Wine is initialized while no StatusNotifierWatcher object is registered.
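For illustration only (this is not the MR's win32u code, which adds its own D-Bus event loop), the SNI-vs-fallback decision comes down to whether anything owns the org.kde.StatusNotifierWatcher name on the session bus when Wine initializes. A minimal libdbus sketch of that probe:

```c
/* Minimal sketch, not the MR's code: probe the session bus for a
 * StatusNotifierWatcher before deciding between SNI and the XEMBED /
 * internal-tray fallback. */
#include <dbus/dbus.h>
#include <stdio.h>

int main(void)
{
    DBusError err;
    DBusConnection *conn;
    dbus_bool_t have_watcher;

    dbus_error_init(&err);
    conn = dbus_bus_get(DBUS_BUS_SESSION, &err);
    if (!conn)
    {
        fprintf(stderr, "no session bus: %s\n", err.message);
        return 1;
    }

    /* If no watcher is registered when Wine initializes, fall back. */
    have_watcher = dbus_bus_name_has_owner(conn, "org.kde.StatusNotifierWatcher", &err);
    puts(have_watcher ? "use StatusNotifierItem" : "fall back to XEMBED / internal tray");
    return 0;
}
```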
--
v41: win32u: Handle dbus notification balloons from system_tray_call
win32u: Handle notification balloons through org.freedesktop.Notifications dbus interface
win32u: Add a ShowBalloon driver interface
win32u: Handle StatusNotifierItem management from system_tray_call
win32u: Handle StatusNotifierWatcher owner changing and registering objects to a new watcher
win32u: Add SNI driver for systray handling
win32u: Add DBus event loop for SNI handling
win32u: Add a SystrayRunLoop driver interface
https://gitlab.winehq.org/wine/wine/-/merge_requests/2808
The general idea of this test is to show that many factors influence the fog when transformed and untransformed vertex formats are used (a minimal render-state sketch follows the config list below):
- Directly changing z and w (which are not equal in the corners).
- Changing the projection matrix, which changes z and w.
- Using different programmable/fixed-function vertex/pixel shaders.
- Using different depth bias.
- Changing depth in the pixel shader (oDepth), which may or may not affect the colors depending on the vendor implementation.
- And various combinations of the above.
This gives `succ` in all cases on these configs:
- Windows 10 with Radeon HD 8400 or Ivy Bridge GT1 (Intel HD Graphics).
- Windows 7 with Radeon HD 6450.
- Windows XP with GeForce Go 7300.
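For reference, a minimal sketch of the d3d9 render-state setup that the factors above revolve around (illustrative only, not the test code; the start/end/bias values are arbitrary):

```c
/* Illustrative d3d9 setup only, not the actual test: enable linear table fog
 * and a depth bias. D3D9 expects float-valued render states to be passed as
 * their raw bit pattern in a DWORD. */
#include <d3d9.h>

static DWORD float_to_dword(float f)
{
    union { float f; DWORD d; } u;
    u.f = f;
    return u.d;
}

static void setup_table_fog(IDirect3DDevice9 *device)
{
    IDirect3DDevice9_SetRenderState(device, D3DRS_FOGENABLE, TRUE);
    IDirect3DDevice9_SetRenderState(device, D3DRS_FOGTABLEMODE, D3DFOG_LINEAR);
    IDirect3DDevice9_SetRenderState(device, D3DRS_FOGVERTEXMODE, D3DFOG_NONE);
    IDirect3DDevice9_SetRenderState(device, D3DRS_FOGSTART, float_to_dword(0.25f));
    IDirect3DDevice9_SetRenderState(device, D3DRS_FOGEND, float_to_dword(0.75f));
    IDirect3DDevice9_SetRenderState(device, D3DRS_DEPTHBIAS, float_to_dword(-0.001f));
}
```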
--
v2: d3d9/tests: test table fog z vs rhw with shaders, depth bias, oDepth..
https://gitlab.winehq.org/wine/wine/-/merge_requests/2657
This seems to fix some (difficult to reproduce) crashes in EA Desktop / CEF, usually on shutdown but sometimes during startup.
Currently TpSetWait can set (or clear) the event while waitqueue_thread_proc() is woken from NtWaitForMultipleObjects by the previously set wait object, and the callback gets called as if the newly set (or cleared) wait object had been signaled. The crashes I was reproducing always happened when an RtlDeregisterWaitEx call was racing with waking the wait and calling the callback.
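To make the idea of the fix concrete, here is a simplified, hypothetical illustration (these are not the real ntdll wait-queue structures): a wakeup is only acted on if it matches the object the wait is currently armed with; otherwise it is treated as stale.

```c
/* Hypothetical simplified types, not ntdll's actual wait queue: the queue
 * thread wakes because some object was signaled earlier, but TpSetWait or
 * RtlDeregisterWaitEx may already have replaced or cleared the object the
 * wait is armed with. Stale wakeups must not run the callback. */
#include <windows.h>

struct example_wait
{
    HANDLE armed_object;                 /* object the wait is currently set to */
    void (CALLBACK *callback)(void *);
    void *userdata;
};

static void handle_wakeup(struct example_wait *wait, HANDLE signaled_object)
{
    if (signaled_object != wait->armed_object)
        return;                          /* stale wakeup: ignore it */

    wait->callback(wait->userdata);
}
```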
--
v2: ntdll: Make sure wakeups from already unset events are ignored in waitqueue_thread_proc().
https://gitlab.winehq.org/wine/wine/-/merge_requests/5044
This is the first batch of a series implementing faster media source resolution, required to work around an Unreal Engine race condition present in some games, and deterministic stream ordering, which decodebin / parsebin cannot provide and which is required to expose the streams in native order for compatibility with several other applications.
I pushed the full series as a branch here: https://gitlab.winehq.org/rbernon/wine/-/commits/mr/wg-source-part-one
Note that this full series is also a first step towards a simpler demuxer interface, which will be required in the future for compatibility with applications that build MF or DirectShow pipelines directly and expect the relevant components to behave as demuxers and expose compressed media types. For now it only delays the use of wg_parser until the media source is started, and matches the unordered streams using their media types and tags (see the rough matching sketch below). This is a best-effort solution, but I don't think we can do much better for the moment.
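As a rough sketch of what matching unordered streams by media type and tags can look like (illustrative GStreamer code with a hypothetical helper, not the actual winegstreamer implementation):

```c
/* Hypothetical helper: decide whether an unordered stream corresponds to an
 * expected one by comparing stream type, caps, and language tag. */
#include <gst/gst.h>

static gboolean streams_match(GstStream *a, GstStream *b)
{
    GstCaps *caps_a = gst_stream_get_caps(a);
    GstCaps *caps_b = gst_stream_get_caps(b);
    GstTagList *tags_a = gst_stream_get_tags(a);
    GstTagList *tags_b = gst_stream_get_tags(b);
    gchar *lang_a = NULL, *lang_b = NULL;
    gboolean match;

    match = gst_stream_get_stream_type(a) == gst_stream_get_stream_type(b)
            && caps_a && caps_b && gst_caps_can_intersect(caps_a, caps_b);

    /* If both streams carry a language tag, require it to match too. */
    if (match && tags_a && tags_b
            && gst_tag_list_get_string(tags_a, GST_TAG_LANGUAGE_CODE, &lang_a)
            && gst_tag_list_get_string(tags_b, GST_TAG_LANGUAGE_CODE, &lang_b))
        match = !g_strcmp0(lang_a, lang_b);

    g_free(lang_a);
    g_free(lang_b);
    if (caps_a) gst_caps_unref(caps_a);
    if (caps_b) gst_caps_unref(caps_b);
    if (tags_a) gst_tag_list_unref(tags_a);
    if (tags_b) gst_tag_list_unref(tags_b);
    return match;
}
```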
--
https://gitlab.winehq.org/wine/wine/-/merge_requests/3606
--
v2: winegstreamer: Expose the generic video decoder transform.
winegstreamer: Introduce a generic audio decoder transform.
winegstreamer: Rename aac_decoder to audio_decoder.
winegstreamer: Translate generic unknown audio / video media types.
winegstreamer: Support generic audio / video unknown formats.
winegstreamer: Call gst_video_info_from_caps for all video formats.
winegstreamer: Call gst_audio_info_from_caps for all audio formats.
https://gitlab.winehq.org/wine/wine/-/merge_requests/5138
As mentioned in https://gitlab.winehq.org/wine/wine/-/merge_requests/5264, the next step for OpenGL in the Wayland driver is support for WGL_ARB_pixel_format. It seems possible, at least in theory, to move a large part of the logic involved in this extension outside the drivers for common use by all of them (or ones that want to opt-in).
The goal of this RFC MR is, through experimentation and discussion/feedback, to evaluate:
1. Whether making such functionality available outside the drivers (likely in an opt-in manner to begin with) is a productive way forward, or whether the complexity and platform-specific decisions favor the current per-driver approach.
2. If we think that (1) is a worthy goal, what's the best mechanism to achieve it.
Note that the focus of this MR is currently on being a proof of concept, rather than providing ready-for-detailed-review code (although I have done my best to keep the code decent). This MR currently (roughly in commit order):
* Introduces a new wgl driver callback that allows drivers to provide opengl32/unix with a list of formats and many details about them.
* Uses the information in that list to implement wglGetPixelFormatAttrib*.
* Uses the information in that list plus format sorting rules from WineX11 (effectively GLX rules plus tweaks) to implement wglChoosePixelFormatARB.
* Implements the get_pixel_formats callback for the Wayland driver.
* Hacks a get_pixel_formats callback into WineX11, to allow me to run some experiments comparing the output of native WineX11 and get_pixel_formats-WineX11. In the admittedly not too many games I tried, the sort order is the same, so at least that's encouraging.
My thoughts and notes so far:
* Using this approach for wglGetPixelFormatAttrib* works well, and we can also implement wglDescribePixelFormat in this way.
* It's not at all clear what the "right" sorting rules are for wglChoosePixelFormatARB (a usage sketch follows this list for reference).
* WineX11 uses GLX + tweaks (e.g., changes depth sorting). This means that larger formats tend to be preferred, at least according to the GLX spec. For example, asking for r5g6b5 in the attributes is supposed to give back rgb888 (or even higher if available) as the top format in the list. Interestingly, and to confuse things even more, I haven't been able to make GLX return non-888(8) configs at all to actually test this, so perhaps that's what saves it here? However, eglChooseConfig works similarly, and there I was able to verify this behavior (e.g., I got a nice surprise 10-bit format when asking for 5551 :)).
* Winemac has its own custom logic.
* Mesa's WGL implementation uses a different approach, closer to Wine's normal (wgl)ChoosePixelFormat, where proximity to the target format is strongly rewarded (so it seems asking for r5g6b5 is much more likely to actually get you that).
* Of course, the "gold standard" here would be to try to infer and use the rules used by some windows driver.
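For reference, this is roughly how an application exercises the sorting rules in question (plain WGL_ARB_pixel_format usage rather than Wine code, assuming a GL context is already current and wglext.h is available). Asking for r5g6b5 and checking which format comes back first is the kind of experiment discussed above:

```c
/* Application-side sketch of the request whose sorting behavior is discussed
 * above: ask for a 565 RGBA format and see what the driver ranks first. */
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>
#include <stdio.h>

void choose_r5g6b5(HDC hdc)
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");
    const int attribs[] =
    {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB,       5,
        WGL_GREEN_BITS_ARB,     6,
        WGL_BLUE_BITS_ARB,      5,
        0
    };
    int formats[16];
    UINT count = 0;

    if (!wglChoosePixelFormatARB) return;

    /* Candidates come back best-first; whether the top one is really 565 or a
     * larger 888(8) format depends on the sorting rules being discussed. */
    if (wglChoosePixelFormatARB(hdc, attribs, NULL, 16, formats, &count) && count)
        printf("top pixel format: %d (of %u candidates)\n", formats[0], count);
}
```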
Looking forward to thoughts/feedback!
--
https://gitlab.winehq.org/wine/wine/-/merge_requests/5388