On 14.04.2010 at 19:29, Henri Verbeet wrote:
> I think you want to test if the compiled shader works instead of the exact bytecode. Generating the same bytecode is probably way too hard, fragile to test, and most likely not worth the effort.
Yep, comparing generated bytecode is not going to work. I think I wrote only 3-4 HLSL tests in the compiler, and all those were based on checking the rendering results, similar to the d3d9 visual tests.
I'm not sure about LLVM. On the one hand, I don't think we want to be writing and maintaining all the optimizations that LLVM can do inside the d3d compiler dll ourselves.
My main concern about LLVM in 2008 was that it wasn't capable of some shader-specific things like DDX and DDY. Apple only used it to optimize CPU fallback code, not to optimize code sent to the GPU. This might have changed since then, especially if the Gallium3D developers are indeed using LLVM, as they planned in the past. I recommend contacting the Mesa developers about their compiler status and plans.
If we can, it's certainly an advantage to use an existing compiler engine like LLVM instead of reinventing the wheel. I doubt that the dependency is a big issue: it will probably be required by Mesa as well, so distros will install it anyway, and LLVM is available on OS X too.
If we're extra lucky, we can implement the parser and bytecode writer ourselves, and just send the parse tree to LLVM and get an optimized tree back. That way LLVM could be optional: if you have it, great; otherwise things will still work, but the shaders will be less efficient.
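The proposed layering can be sketched like this (a toy illustration, with hypothetical names; the "IR" is just a token list standing in for a real parse tree):

```python
# Stand-ins for the components we would write ourselves; only the shape
# of the pipeline matters here, not these trivial bodies.

def parse(source):
    """Trivial stand-in parser: source text -> toy 'parse tree'."""
    return source.split()

def write_bytecode(tree):
    """Trivial stand-in bytecode writer: toy tree -> bytes."""
    return bytes(len(op) for op in tree)

def compile_hlsl(source, optimize=None):
    """Parser and writer always run; the optimizer in the middle is
    pluggable, so an LLVM-backed pass can be used when available and
    skipped otherwise (shaders then compile, just less efficiently)."""
    tree = parse(source)
    if optimize is not None:
        tree = optimize(tree)
    return write_bytecode(tree)
```

With `optimize=None` the pipeline still produces bytecode; passing any tree-to-tree function (in the real design, a pass that round-trips through LLVM) slots in without touching the parser or writer.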
HLSL is both a simple and a complicated language. In many ways it is easier than the average imperative programming language because it doesn't have pointers, references or similar features. In d3d9 it had pretty much no types (only floats, except in special cases). Then there are issues that make things more complicated, for example the special address registers, vector registers, and the lack of memory to push/pop registers to. If your register allocator can't fit the program into the existing registers, you're out of luck.
However, pointers are appearing in GLSL via extensions, and I guess for HLSL it's only a matter of time (maybe dx11 compute shaders...).
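To make the "out of luck" point concrete, here is a minimal sketch (hypothetical, not any actual wined3d/Mesa code) of a linear-scan register allocator that simply fails when the live ranges don't fit, mirroring the SM 1-3 situation where there is no memory to spill registers to:

```python
def allocate_registers(intervals, num_regs):
    """intervals: list of (name, start, end) live ranges.
    Returns {name: register index}, or raises if the program does not fit."""
    intervals = sorted(intervals, key=lambda iv: iv[1])  # order by start point
    free = list(range(num_regs))
    active = []  # (end, name, reg), kept sorted by end point
    assignment = {}
    for name, start, end in intervals:
        # Expire live ranges that ended before this one starts,
        # returning their registers to the free pool.
        while active and active[0][0] <= start:
            _, _, reg = active.pop(0)
            free.append(reg)
        if not free:
            # No stack, no scratch memory: with nowhere to spill,
            # compilation simply fails here.
            raise RuntimeError(f"cannot fit {name!r}: out of registers")
        reg = free.pop(0)
        assignment[name] = reg
        active.append((end, name, reg))
        active.sort()
    return assignment
```

A CPU-targeting allocator would spill to the stack at that `raise`; a d3d9-level shader compiler has no such escape hatch, which is exactly what makes the register pressure problem harder despite the language being simpler.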