Matteo Bruni (@Mystral) commented about dlls/d3dx9_36/tests/texture.c:
- ok(hr == D3D_OK, "Unexpected hr %#lx.\n", hr);
- IDirect3DSurface9_Release(surface);
- ID3DXBuffer_Release(buffer);
- /*
-  * Wine's JPEG compression quality is worse than native. Native uses 4:2:0
-  * subsampling which is the same as what we use, but whatever compression
-  * settings they're using results in a JPEG image that is much closer to
-  * the original uncompressed surface. This is most apparent in 16x16
-  * blocks with multiple colors.
-  */
- get_texture_surface_readback(device, texture, 0, &rb);
- /* 64-bit Windows has worse JPEG compression. */
- if (sizeof(void *) == 8)
- {
Okay, now this is just weird...
Are the JPEGs generated by 32-bit and 64-bit d3dx9 identical as far as headers / metadata go? Or do they just happen to effectively use two different encoders?
I don't really want to add to the pile further, but I'm kinda curious about d3dx10 / d3dx11 as well...
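One way to answer the header / metadata question without eyeballing hex dumps is to compare the JPEG segments that most strongly fingerprint an encoder: the DQT quantization tables and the SOF sampling factors. A rough sketch (helper names are mine, not from the test suite), assuming you've dumped the 32-bit and 64-bit outputs to files:

```python
import struct

def jpeg_segments(data: bytes):
    """Yield (marker, payload) for each segment up to SOS/EOI."""
    assert data[:2] == b"\xff\xd8", "not a JPEG (missing SOI)"
    i = 2
    while i < len(data):
        assert data[i] == 0xFF, "expected marker byte"
        marker = data[i + 1]
        i += 2
        if marker in (0xD8, 0xD9):  # SOI / EOI carry no payload
            if marker == 0xD9:
                return
            continue
        if marker == 0xDA:  # SOS: entropy-coded data follows, stop here
            return
        (length,) = struct.unpack(">H", data[i:i + 2])
        yield marker, data[i + 2:i + length]
        i += length

def encoder_fingerprint(data: bytes):
    """Collect the DQT quantization tables and SOF sampling factors --
    the metadata most likely to differ between two encoders."""
    fp = {"dqt": [], "sampling": None}
    for marker, payload in jpeg_segments(data):
        if marker == 0xDB:  # DQT
            fp["dqt"].append(payload)
        elif marker in (0xC0, 0xC2):  # SOF0 (baseline) / SOF2 (progressive)
            ncomp = payload[5]
            # per component: id, (H<<4)|V sampling, quant table id
            fp["sampling"] = [
                (payload[6 + 3*c + 1] >> 4, payload[6 + 3*c + 1] & 0x0F)
                for c in range(ncomp)
            ]
    return fp
```

If `encoder_fingerprint(jpeg32)` and `encoder_fingerprint(jpeg64)` disagree on the quantization tables, the two builds are effectively using different encoder settings even if both emit 4:2:0 (sampling `[(2, 2), (1, 1), (1, 1)]`); identical tables with different output would point at a difference in the DCT / rate-control path instead.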