On 2/10/23 13:36, Rémi Bernon wrote:
On 2/10/23 20:04, Rémi Bernon wrote:
On 2/10/23 19:12, Zebediah Figura wrote:
On 2/10/23 00:18, Rémi Bernon wrote:
(1) Why does setting channel->flags affect performance at all? I'm thoroughly baffled by this one.
(2) Why was this diff not applied in the first place? (Simple oversight?)
(3) Why does __WINE_DBCL_INIT even exist, if its only purpose is to guard out a single assignment?
--Zeb
Probably to allow changing the selected channels dynamically, through a process monitor for instance.
I'm not sure which question this answers, but I don't think I understand it? winedbg and taskmgr just set the flags directly; they don't touch __WINE_DBCL_INIT, and I don't see why they'd need to.
They can't change the flags in each translation unit's debug channels. What they do is change the debug options in the process PEB.
To support that runtime modification, the TU debug channels need to keep their __WINE_DBCL_INIT bit set, so that __wine_dbg_get_channel_flags is always called and the debug options are re-read from the PEB.
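In code, the pattern looks roughly like this (a from-memory sketch, not the actual include/wine/debug.h; the "dsound" channel name is just an example):

    /* Sketch of what WINE_DEFAULT_DEBUG_CHANNEL(ch) gives each TU;
     * the real header differs in detail. */
    struct __wine_debug_channel
    {
        unsigned char flags;    /* one bit per debug class */
        char name[15];
    };

    enum __wine_debug_class
    {
        __WINE_DBCL_FIXME,
        __WINE_DBCL_ERR,
        __WINE_DBCL_WARN,
        __WINE_DBCL_TRACE,
        __WINE_DBCL_INIT = 7    /* lazy init flag */
    };

    extern unsigned char __wine_dbg_get_channel_flags( struct __wine_debug_channel *channel );

    /* Statically initialized with every bit set, including __WINE_DBCL_INIT,
     * so each use keeps going through __wine_dbg_get_channel_flags(). */
    static struct __wine_debug_channel __wine_dbch_dsound = { 0xff, "dsound" };

    /* TRACE_ON(dsound) then expands to roughly: */
    #define TRACE_ON(ch) \
        (__wine_dbg_get_channel_flags( &__wine_dbch_##ch ) & (1 << __WINE_DBCL_TRACE))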
Correction: it's actually not __WINE_DBCL_INIT but rather their default value, which contains all the bits. The logic is otherwise the same; the TU channel bits are only eventually cleared for the default options, for the reasons described below.
Only channels which aren't explicitly specified clear their __WINE_DBCL_INIT bit, after copying the default debug options.
The default options (all) cannot be modified dynamically, or at least a modification only takes effect the first time __wine_dbg_get_channel_flags is called for a given TU debug channel.
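Putting it together, the lookup behaves roughly like this (again a sketch, reusing the declarations above; the real implementation in ntdll differs, e.g. in how it stores and searches the options parsed from WINEDEBUG / the PEB):

    #include <string.h>

    static struct __wine_debug_channel debug_options[16]; /* parsed options */
    static int nb_debug_options;

    /* Parsed default option flags; no __WINE_DBCL_INIT bit, so copying
     * them marks a channel as initialized. */
    static unsigned char default_option_flags =
        (1 << __WINE_DBCL_ERR) | (1 << __WINE_DBCL_FIXME);

    unsigned char __wine_dbg_get_channel_flags( struct __wine_debug_channel *channel )
    {
        int i;

        /* Already initialized from the defaults: nothing to look up, but
         * also no way to pick up later changes to the options. */
        if (!(channel->flags & (1 << __WINE_DBCL_INIT))) return channel->flags;

        /* Explicitly specified channels are looked up on every call, so
         * rewriting the options in the PEB takes effect immediately. */
        for (i = 0; i < nb_debug_options; i++)
            if (!strcmp( channel->name, debug_options[i].name ))
                return debug_options[i].flags;

        /* Unspecified channels copy the defaults once; this clears the
         * init bit and caches the flags, so later changes to the default
         * options are never observed by this TU. */
        channel->flags = default_option_flags;
        return channel->flags;
    }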
Thank you, that explains why the logic is arranged like it is.
I'm inclined to think we should find another way to achieve this; spending hours trying to debug a performance drop from disabling logging channels is really not something I want to happen to anyone else. I'll see what I can achieve.
This doesn't explain why __WINE_DBCL_INIT exists, but I can only assume that at this point there's no particularly good reason.