Hi all,
I was running into some odd behaviour while debugging: specifically, I
found that specifying the d3d channel in WINEDEBUG was cutting my
performance in half, even with logging for that channel disabled, i.e.
in a configuration that should have been functionally identical to a
run without the d3d channel specified at all.
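For concreteness, by "disabling logging" I mean an invocation along the
lines of the following (the program name is just a placeholder):

    WINEDEBUG=-d3d wine some-program.exe

That names the d3d channel but switches every debug class for it off.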
I was able to "fix" this inconsistency with this diff:
diff --git a/dlls/ntdll/thread.c b/dlls/ntdll/thread.c
index 1bd8f900d22..f0e4b824918 100644
--- a/dlls/ntdll/thread.c
+++ b/dlls/ntdll/thread.c
@@ -103,7 +103,11 @@ unsigned char __cdecl __wine_dbg_get_channel_flags( struct __wine_debug_channel
     {
         pos = (min + max) / 2;
         res = strcmp( channel->name, debug_options[pos].name );
-        if (!res) return debug_options[pos].flags;
+        if (!res)
+        {
+            if (channel->flags & (1 << __WINE_DBCL_INIT))
+                channel->flags = debug_options[pos].flags;
+            return debug_options[pos].flags;
+        }
         if (res < 0) max = pos - 1;
         else min = pos + 1;
     }
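For reference, the surrounding function in dlls/ntdll/thread.c looks
roughly like this (paraphrased from memory, so treat the elided
initialization and the default_flags name as approximate):

unsigned char __cdecl __wine_dbg_get_channel_flags( struct __wine_debug_channel *channel )
{
    int min, max, pos, res;

    /* ... one-time parsing of WINEDEBUG into the sorted debug_options array elided ... */

    min = 0;
    max = nb_debug_options - 1;
    while (min <= max)  /* binary search over the channels named in WINEDEBUG */
    {
        pos = (min + max) / 2;
        res = strcmp( channel->name, debug_options[pos].name );
        if (!res) return debug_options[pos].flags;  /* found: returned, never cached in channel->flags */
        if (res < 0) max = pos - 1;
        else min = pos + 1;
    }
    /* not found: fall back to the defaults, guarded by __WINE_DBCL_INIT */
    if (channel->flags & (1 << __WINE_DBCL_INIT)) channel->flags = default_flags;
    return default_flags;
}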
That diff makes sense to some degree, but it leaves three questions
unanswered, which I was hoping someone might be able to cluebat me on:
(1) Why does setting channel->flags affect performance at all? I'm
thoroughly baffled by this one; the caller path I'm looking at is
sketched below the questions.
(2) Why was this diff not applied in the first place? (Simple oversight?)
(3) Why does __WINE_DBCL_INIT even exist, if its only purpose is to
guard a single assignment?
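Regarding (1), the caller path I have in mind is roughly the following,
paraphrasing include/wine/debug.h from memory (the exact macro names
and expansions may be slightly off):

/* channels start life with every flag bit set, including __WINE_DBCL_INIT */
#define WINE_DECLARE_DEBUG_CHANNEL(ch) \
    static struct __wine_debug_channel __wine_dbch_##ch = { ~0, #ch }

/* TRACE_ON()/WARN_ON()/... test the bit cached in channel->flags first and
 * only fall through to __wine_dbg_get_channel_flags() if it is still set */
#define __WINE_IS_DEBUG_ON(dbcl,dbch) \
    (((dbch)->flags & (1 << __WINE_DBCL##dbcl)) && \
     (__wine_dbg_get_channel_flags(dbch) & (1 << __WINE_DBCL##dbcl)))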
--Zeb