Hi all,
I was running into some odd behaviour while debugging: specifically, I found that specifying the d3d channel in WINEDEBUG was cutting my performance in half, even with logging disabled, and even though the run should have been functionally identical to one without the d3d channel specified.
I was able to "fix" this inconsistency with this diff:
diff --git a/dlls/ntdll/thread.c b/dlls/ntdll/thread.c
index 1bd8f900d22..f0e4b824918 100644
--- a/dlls/ntdll/thread.c
+++ b/dlls/ntdll/thread.c
@@ -103,7 +103,11 @@ unsigned char __cdecl __wine_dbg_get_channel_flags( struct __wine_debug_channel
     {
         pos = (min + max) / 2;
         res = strcmp( channel->name, debug_options[pos].name );
-        if (!res) return debug_options[pos].flags;
+        if (!res)
+        {
+            if (channel->flags & (1 << __WINE_DBCL_INIT)) channel->flags = debug_options[pos].flags;
+            return debug_options[pos].flags;
+        }
         if (res < 0) max = pos - 1;
         else min = pos + 1;
     }
Which makes sense to some degree, but leaves three questions unanswered, which I was hoping someone might be able to cluebat me on:
(1) Why does setting channel->flags affect performance at all? I'm thoroughly baffled by this one.
(2) Why was this diff not applied in the first place? (Simple oversight?)
(3) Why does __WINE_DBCL_INIT even exist, if its only purpose is to guard out a single assignment?
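For reference, the relevant declarations as I remember them from wine/debug.h (paraphrased from memory, so the details may be slightly off):

/* per-channel descriptor, one instance per translation unit;
 * 'flags' is the locally cached per-class bitmask */
struct __wine_debug_channel
{
    unsigned char flags;
    char name[15];
};

/* one bit per message class, plus a lazy-init marker */
enum __wine_debug_class
{
    __WINE_DBCL_FIXME,
    __WINE_DBCL_ERR,
    __WINE_DBCL_WARN,
    __WINE_DBCL_TRACE,

    __WINE_DBCL_INIT = 7  /* flags not resolved yet */
};

so the channel->flags touched by the diff above is that per-TU cached copy.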
--Zeb
On 2/10/23 01:54, Zebediah Figura wrote:
Hi all,
I was running into some odd behaviour while debugging: specifically, I found that specifying the d3d channel in WINEDEBUG was cutting my performance in half, even with logging disabled, and even though the run should have been functionally identical to one without the d3d channel specified.
I was able to "fix" this inconsistency with this diff:
diff --git a/dlls/ntdll/thread.c b/dlls/ntdll/thread.c
index 1bd8f900d22..f0e4b824918 100644
--- a/dlls/ntdll/thread.c
+++ b/dlls/ntdll/thread.c
@@ -103,7 +103,11 @@ unsigned char __cdecl __wine_dbg_get_channel_flags( struct __wine_debug_channel
     {
         pos = (min + max) / 2;
         res = strcmp( channel->name, debug_options[pos].name );
-        if (!res) return debug_options[pos].flags;
+        if (!res)
+        {
+            if (channel->flags & (1 << __WINE_DBCL_INIT)) channel->flags = debug_options[pos].flags;
+            return debug_options[pos].flags;
+        }
         if (res < 0) max = pos - 1;
         else min = pos + 1;
     }
Which makes sense to some degree, but leaves three questions unanswered, which I was hoping someone might be able to cluebat me on:
(1) Why does setting channel->flags affect performance at all? I'm thoroughly baffled by this one.
(2) Why was this diff not applied in the first place? (Simple oversight?)
(3) Why does __WINE_DBCL_INIT even exist, if its only purpose is to guard out a single assignment?
--Zeb
Probably to allow changing the selected channels dynamically, through process monitor for instance.
On 2/10/23 00:18, Rémi Bernon wrote:
(1) Why does setting channel->flags affect performance at all? I'm thoroughly baffled by this one.
(2) Why was this diff not applied in the first place? (Simple oversight?)
(3) Why does __WINE_DBCL_INIT even exist, if its only purpose is to guard out a single assignment?
--Zeb
Probably to allow changing the selected channels dynamically, through process monitor for instance.
I'm not sure which question this answers, but I don't think I understand it? winedbg and taskmgr just set the flags directly; they don't touch __WINE_DBCL_INIT, and I don't see why they'd need to.
On 2/10/23 19:12, Zebediah Figura wrote:
On 2/10/23 00:18, Rémi Bernon wrote:
(1) Why does setting channel->flags affect performance at all? I'm thoroughly baffled by this one.
(2) Why was this diff not applied in the first place? (Simple oversight?)
(3) Why does __WINE_DBCL_INIT even exist, if its only purpose is to guard out a single assignment?
--Zeb
Probably to allow changing the selected channels dynamically, through process monitor for instance.
I'm not sure which question this answers, but I don't think I understand it? winedbg and taskmgr just set the flags directly; they don't touch __WINE_DBCL_INIT, and I don't see why they'd need to.
They can't change the flags in each translation unit's debug channels. What they do is change the debug options in the process PEB.
In order to support that runtime modification, the TU debug channels need to keep their __WINE_DBCL_INIT bit set, so that __wine_dbg_get_channel_flags is always called and the debug options are re-read from the PEB.
Only channels which aren't explicitly specified clear their __WINE_DBCL_INIT bit, after copying the default debug options.
The default options (all) cannot be modified dynamically, or at least a change to them only takes effect the first time __wine_dbg_get_channel_flags is called for a given TU debug channel.
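Very roughly, the lookup then behaves like this (a simplified sketch of the idea, not the actual ntdll code; default_flags here just stands in for wherever the defaults actually live):

unsigned char __wine_dbg_get_channel_flags( struct __wine_debug_channel *channel )
{
    int min = 0, max = nb_debug_options - 1, pos, res;

    while (min <= max)
    {
        pos = (min + max) / 2;
        res = strcmp( channel->name, debug_options[pos].name );
        /* explicitly specified channel: resolved from the option table on
         * every call, so changes made at runtime are picked up */
        if (!res) return debug_options[pos].flags;
        if (res < 0) max = pos - 1;
        else min = pos + 1;
    }
    /* unspecified channel: copy the defaults into the TU channel once,
     * clearing its __WINE_DBCL_INIT bit, so later checks never come back here */
    channel->flags = default_flags;
    return default_flags;
}

The first branch is the one your diff changes, by also writing the looked-up value back into the TU channel.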
On 2/10/23 20:04, Rémi Bernon wrote:
On 2/10/23 19:12, Zebediah Figura wrote:
On 2/10/23 00:18, Rémi Bernon wrote:
(1) Why does setting channel->flags affect performance at all? I'm thoroughly baffled by this one.
(2) Why was this diff not applied in the first place? (Simple oversight?)
(3) Why does __WINE_DBCL_INIT even exist, if its only purpose is to guard out a single assignment?
--Zeb
Probably to allow changing the selected channels dynamically, through process monitor for instance.
I'm not sure which question this answers, but I don't think I understand it? winedbg and taskmgr just set the flags directly; they don't touch __WINE_DBCL_INIT, and I don't see why they'd need to.
They can't change the flags in each translation unit's debug channels. What they do is change the debug options in the process PEB.
In order to support that runtime modification, the TU debug channels need to keep their __WINE_DBCL_INIT bit set, so that __wine_dbg_get_channel_flags is always called and the debug options are re-read from the PEB.
Correction: it's actually not __WINE_DBCL_INIT but rather their default value, which has all the bits set. The logic is otherwise the same: the TU channel bits only ever get cleared for channels falling back to the default options, for the reasons described above.
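(If I remember the declaration macro correctly, a TU channel starts out roughly as:

static struct __wine_debug_channel __wine_dbch_d3d = { (unsigned char)~0, "d3d" };  /* every bit set */

so the fast-path test passes for every class until something writes those bits back.)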
Only channels which aren't explicitly specified clear their __WINE_DBCL_INIT bit, after copying the default debug options.
The default options (all) cannot be modified dynamically, or at least a change to them only takes effect the first time __wine_dbg_get_channel_flags is called for a given TU debug channel.
On 2/10/23 13:36, Rémi Bernon wrote:
On 2/10/23 20:04, Rémi Bernon wrote:
On 2/10/23 19:12, Zebediah Figura wrote:
On 2/10/23 00:18, Rémi Bernon wrote:
(1) Why does setting channel->flags affect performance at all? I'm thoroughly baffled by this one.
(2) Why was this diff not applied in the first place? (Simple oversight?)
(3) Why does __WINE_DBCL_INIT even exist, if its only purpose is to guard out a single assignment?
--Zeb
Probably to allow changing the selected channels dynamically, through process monitor for instance.
I'm not sure which question this answers, but I don't think I understand it? winedbg and taskmgr just set the flags directly; they don't touch __WINE_DBCL_INIT, and I don't see why they'd need to.
They can't change the flags in each translation unit's debug channels. What they do is change the debug options in the process PEB.
In order to support that runtime modification, the TU debug channels need to keep their __WINE_DBCL_INIT bit set, so that __wine_dbg_get_channel_flags is always called and the debug options are re-read from the PEB.
Correction: it's actually not __WINE_DBCL_INIT but rather their default value, which has all the bits set. The logic is otherwise the same: the TU channel bits only ever get cleared for channels falling back to the default options, for the reasons described above.
Only channels which aren't explicitly specified clear their __WINE_DBCL_INIT bit, after copying the default debug options.
The default options (all) cannot be modified dynamically, or at least a change to them only takes effect the first time __wine_dbg_get_channel_flags is called for a given TU debug channel.
Thank you, that explains why the logic is arranged like it is.
I'm inclined to think we should find another way to achieve this; spending hours trying to debug a performance drop caused by disabling logging channels is really not something I want to happen to anyone else. I'll see what I can come up with.
This doesn't explain why __WINE_DBCL_INIT exists, but I can only assume that at this point there's no particularly good reason.
On 2/9/23 18:54, Zebediah Figura wrote:
(1) Why does setting channel->flags affect performance at all? I'm thoroughly baffled by this one.
Evan Tang provided me with the cluebat privately—it's because __WINE_DBCL_TRACE is always set, so we end up always calling __wine_dbg_header() [which will then discard the message]. So we probably really should fix this :x
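For the archives, the shape of the fast path as I now understand it, heavily paraphrased (EXAMPLE_TRACE and do_trace are stand-ins, not the real macro and function names):

/* the slow path always goes through __wine_dbg_header(), and only there does
 * a disabled channel finally get its message discarded */
static int do_trace( struct __wine_debug_channel *channel, const char *func,
                     const char *format, ... )
{
    if (__wine_dbg_header( __WINE_DBCL_TRACE, channel, func ) == -1) return -1;
    (void)format;  /* actual formatting/output elided in this sketch */
    return 0;
}

/* the cheap per-call test consults only the TU's cached flags byte, which
 * stays at ~0 for an explicitly specified channel, so it is always true */
#define EXAMPLE_TRACE(...) \
    do { \
        if (__wine_dbch_d3d.flags & (1 << __WINE_DBCL_TRACE)) \
            do_trace( &__wine_dbch_d3d, __func__, __VA_ARGS__ ); \
    } while (0)

With the write-back from the diff above, the cached bit actually reflects the requested classes, so a disabled TRACE never leaves the macro.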