This is technically an ABI violation, because `true` is represented in SM4 as `0xffffffff`, not `1`.
Is that actually the case? `true` is `~0u` internally within the shader, but do we know that it's required to be *passed* that way to the shader, or does the shader normalize it? It's hard to find any documentation for this...
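For what it's worth, normalizing on the host side would sidestep the question either way. A minimal sketch, assuming the constant buffer is filled as raw `uint32_t` words (the helper name `sm4_bool` is hypothetical, not from the patch):

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical helper: convert a host-side bool to the SM4 boolean
 * convention (all bits set for true) before uploading it to a
 * constant buffer, so the shader sees ~0u regardless of which
 * convention it relies on. */
static inline uint32_t sm4_bool(bool value)
{
    return value ? 0xffffffffu : 0u;
}
```

If the compiled shader only ever tests the value against zero (e.g. via `if_nz`), passing `1` would work too, which is exactly the open question above.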