On 20 March 2016 at 19:24, Aaryaman Vasishta jem456.vasishta@gmail.com wrote:
- FIXME("iface %p, filename %s, texture %p stub!\n", iface, debugstr_a(filename), texture);
- TRACE("iface %p, filename %s, texture %p\n", iface, debugstr_a(filename), texture);
Missing period.
@@ -853,15 +859,19 @@ static HRESULT WINAPI d3drm2_CreateUserVisual(IDirect3DRM2 *iface,
static HRESULT WINAPI d3drm2_LoadTexture(IDirect3DRM2 *iface, const char *filename, IDirect3DRMTexture2 **texture)
{
- struct d3drm_texture *object;
- struct d3drm *d3drm = impl_from_IDirect3DRM2(iface);
- IDirect3DRMTexture3 *texture3;
- HRESULT hr;
- FIXME("iface %p, filename %s, texture %p stub!\n", iface, debugstr_a(filename), texture);
- TRACE("iface %p, filename %s, texture %p.\n", iface, debugstr_a(filename), texture);
- if (FAILED(hr = d3drm_texture_create(&object)))
- if (FAILED(hr = IDirect3DRM3_LoadTexture(&d3drm->IDirect3DRM3_iface, filename, &texture3))) return hr;
- *texture = &object->IDirect3DRMTexture2_iface;
hr = IDirect3DRMTexture3_QueryInterface(texture3, &IID_IDirect3DRMTexture2, (void**)texture);
IDirect3DRMTexture3_Release(texture3);
if (FAILED(hr))
return hr;
return D3DRM_OK;
}
This is probably ok, but you may want to consider just introducing a helper function instead. I.e.:
if (FAILED(hr = d3drm_load_texture(d3drm, filename, TRUE, &object)))
    return hr;
IDirect3DRMTexture2_AddRef(*texture = &object->IDirect3DRMTexture2_iface);
return D3DRM_OK;
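For what it's worth, the shape of that helper-based version can be sketched in isolation. Everything below (the toy struct, names, error codes, and the refcount handling) is illustrative only, not Wine's actual internals:

```c
#include <assert.h>
#include <stdlib.h>

/* Toy model of the suggested helper: every versioned LoadTexture()
 * entry point forwards to one shared loader, and the wrapper takes the
 * caller's reference on the object it hands out, mirroring the
 * IDirect3DRMTexture2_AddRef(*texture = ...) line above. */
struct toy_texture
{
    unsigned int refcount;
    int load_upside_down;
};

static int toy_load_texture(const char *filename, int load_upside_down, struct toy_texture **object)
{
    if (!filename)
        return -1; /* stand-in for D3DRMERR_BADVALUE */
    if (!(*object = calloc(1, sizeof(**object))))
        return -2; /* stand-in for D3DRMERR_BADALLOC */
    (*object)->load_upside_down = load_upside_down;
    return 0;      /* stand-in for D3DRM_OK */
}

/* The version-2 wrapper then reduces to a few lines. */
static int toy_load_texture2(const char *filename, struct toy_texture **texture)
{
    struct toy_texture *object;
    int hr;

    if ((hr = toy_load_texture(filename, 1, &object)) < 0)
        return hr;
    ++object->refcount; /* the reference returned to the caller */
    *texture = object;
    return 0;
}
```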
What happens when d3drm_texture_load() is called multiple times on the same texture? Should that be possible?
+void d3drm_texture_destroy(struct d3drm_texture *texture);
Missing DECLSPEC_HIDDEN.
+HRESULT d3drm_texture_load(struct d3drm_texture *texture, IDirect3DRM *d3drm, const char *path, BOOL load_upside_down) DECLSPEC_HIDDEN;
This line is a bit long. Try to stick to 120 columns as a maximum, preferably a bit less.
+D3DRMIMAGE *d3drm_create_image(unsigned char *buffer, BITMAPINFO *info, BOOL palette, BOOL upside_down);
Missing DECLSPEC_HIDDEN.
@@ -4103,24 +4105,34 @@ static void test_load_texture(void)
hr = IDirect3DRM_LoadTexture(d3drm1, filename, &texture1);
ok(SUCCEEDED(hr), "Test %u: Failed to load texture, hr %#x.\n", i, hr);
ref2 = get_refcount((IUnknown *)d3drm1);
ok(ref2 > ref1, "expected ref2 > ref1, got ref1 = %u, ref2 = %u.\n", ref1, ref2);
The message should mention which test (i.e., "i") failed.
d3drm_img = IDirect3DRMTexture_GetImage(texture1);
todo_wine ok(!!d3drm_img, "Test %u: Failed to get image.\n", i);
ok(!!d3drm_img, "Test %u: Failed to get image.\n", i);
if (d3drm_img)
    test_bitmap_data(i * 4, d3drm_img, FALSE, tests[i].w, tests[i].h, tests[i].palettized);
Now that IDirect3DRMTexture_GetImage() succeeds, you can remove the "if (d3drm_img)".
+void d3drm_texture_destroy(struct d3drm_texture *texture)
+{
- if (texture->image)
- {
TRACE("Releasing attached members.\n");
HeapFree(GetProcessHeap(), 0, texture->image->buffer1);
if(texture->image->palette)
Missing space between "if" and "(".
HeapFree(GetProcessHeap(), 0, texture->image->palette);
HeapFree(GetProcessHeap(), 0, texture->image);
This seems a bit suspicious. Supposedly the "image" argument to d3drm3_CreateTexture() can be used to create a texture on an existing image. The documentation isn't very explicit about it, but suggests that the image data isn't copied in that case. So I'm not sure the texture is supposed to be responsible for destroying the image. In general I think it's not entirely clear how things like CreateTexture(), LoadTexture() and InitFromFile() etc. are supposed to interact.
+HRESULT d3drm_texture_load(struct d3drm_texture *texture, IDirect3DRM *d3drm, const char *path, BOOL load_upside_down)
+{
- BITMAPINFO *info;
- BITMAPFILEHEADER *bmp_header;
- unsigned char *buffer;
- DWORD size;
- HANDLE hfile, hmapping;
- hfile = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, 0, OPEN_EXISTING, 0, 0);
- if (hfile == INVALID_HANDLE_VALUE)
return D3DRMERR_BADFILE;
- size = GetFileSize(hfile, NULL);
- if (size == INVALID_FILE_SIZE)
- {
DeleteObject(hfile);
return D3DRMERR_BADVALUE;
- }
You never use "size" after this, but you probably should. I also think you meant CloseHandle() instead of DeleteObject().
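As a portable illustration of the pattern being asked for here (query the size up front, keep it around for later bounds checking, and close the handle on every error path), using stdio stand-ins for CreateFileA()/GetFileSize()/MapViewOfFile(); all names and error codes are invented:

```c
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

/* Read a whole file into memory, reporting its size so callers can
 * validate header fields against it before parsing. */
static int load_file(const char *path, unsigned char **out, long *out_size)
{
    FILE *f;
    long size;
    unsigned char *buffer;

    if (!(f = fopen(path, "rb")))
        return -1; /* D3DRMERR_BADFILE in the original */
    if (fseek(f, 0, SEEK_END) || (size = ftell(f)) < 0)
    {
        fclose(f); /* i.e. CloseHandle(), not DeleteObject() */
        return -2; /* D3DRMERR_BADVALUE */
    }
    rewind(f);
    if (!(buffer = malloc(size ? size : 1)))
    {
        fclose(f);
        return -3; /* D3DRMERR_BADALLOC */
    }
    if (fread(buffer, 1, size, f) != (size_t)size)
    {
        free(buffer);
        fclose(f);
        return -2;
    }
    fclose(f);
    *out = buffer;
    *out_size = size; /* callers bounds-check all reads against this */
    return 0;
}
```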
- buffer = MapViewOfFile(hmapping, FILE_MAP_READ, 0, 0, 0);
- if (buffer == NULL)
"if (!buffer)", or even "if (!(buffer = MapViewOfFile(...)))"
- bmp_header = (BITMAPFILEHEADER *)buffer;
- info = (BITMAPINFO *) (buffer + sizeof(*bmp_header));
Or just "info = (BITMAPINFO *)(bmp_header + 1);". You should validate that this is actually a BMP file. You should also check which version of the BITMAPINFOHEADER structure the file has.
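A minimal sketch of that validation, assuming a little-endian host and hand-rolled offsets instead of the Windows header structs (the header sizes are the documented ones for each BITMAPINFOHEADER revision):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Check the 'BM' magic and the info-header version before trusting any
 * field. BITMAPFILEHEADER is 14 bytes; the DWORD that follows it is the
 * info header's own size (biSize), which identifies its version. */
static int validate_bmp(const unsigned char *data, size_t size)
{
    uint32_t info_size;

    if (size < 14 + 4)
        return 0;
    if (data[0] != 'B' || data[1] != 'M')
        return 0;
    memcpy(&info_size, data + 14, sizeof(info_size)); /* little-endian assumed */
    switch (info_size)
    {
        case 12:  /* BITMAPCOREHEADER */
        case 40:  /* BITMAPINFOHEADER */
        case 108: /* BITMAPV4HEADER */
        case 124: /* BITMAPV5HEADER */
            return size >= 14 + info_size;
        default:
            return 0;
    }
}
```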
+D3DRMIMAGE *d3drm_create_image(unsigned char *buffer, BITMAPINFO *info, BOOL palette, BOOL upside_down)
+{
- D3DRMPALETTEENTRY *colors = (D3DRMPALETTEENTRY *)HeapAlloc(GetProcessHeap(), HEAP_ZERO_MEMORY, 257 * sizeof(*colors));
Is "257" there intentional? The cast is unnecessary. HeapAlloc() can fail. Do you need HEAP_ZERO_MEMORY here?
- D3DRMIMAGE *image;
- BOOL black_used = FALSE;
- LONG w = info->bmiHeader.biWidth;
- LONG h = abs(info->bmiHeader.biHeight);
- int i, j, k;
- unsigned int idx, buffer1_idx;
- unsigned char *buffer1;
- int bpp = info->bmiHeader.biBitCount == 24 ? 32 : info->bmiHeader.biBitCount;
- int bpl = palette ? w : ((w + 3) & ~3) * bpp / 8;
- int num_colors = 0;
- struct colors_24bpp
- {
BYTE red;
BYTE green;
BYTE blue;
- } *color;
- color = (struct colors_24bpp *)buffer;
- image = (D3DRMIMAGE *)HeapAlloc(GetProcessHeap(), HEAP_ZERO_MEMORY, sizeof(*image));
- image->buffer1 = HeapAlloc(GetProcessHeap(), HEAP_ZERO_MEMORY, sizeof(unsigned char) * bpl * h);
This is unsafe. Consider e.g. what happens when the header gives you 0x10000 for width and height.
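A sketch of the missing overflow guard for the non-palettized path (the pitch formula follows the code under review; the function name and the restriction to 8/24/32 bpp are illustrative):

```c
#include <assert.h>
#include <stdint.h>

/* Compute pitch * height for the image buffer, rejecting any dimensions
 * for which an intermediate product would wrap a 32-bit unsigned. */
static int image_buffer_size(uint32_t w, uint32_t h, uint32_t bit_count, uint32_t *out)
{
    uint32_t bpp = bit_count == 24 ? 32 : bit_count; /* 24 bpp is widened to 32 */
    uint32_t aligned_w, pitch, bytes_pp;

    if (bpp != 8 && bpp != 32) /* sketch: only the formats the loader handles */
        return 0;
    bytes_pp = bpp / 8;
    if (!w || !h)
        return 0;
    aligned_w = (w + 3) & ~3u;
    if (aligned_w < w) /* the round-up itself wrapped */
        return 0;
    if (aligned_w > UINT32_MAX / bytes_pp)
        return 0;
    pitch = aligned_w * bytes_pp;
    if (pitch > UINT32_MAX / h)
        return 0;
    *out = pitch * h;
    return 1;
}
```

With this in place, the 0x10000 x 0x10000 case from the review is rejected up front instead of silently wrapping to a tiny allocation.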
- buffer1 = image->buffer1;
- memset(buffer1, 255, sizeof(unsigned char) * bpl * h);
You just initialised the memory with HEAP_ZERO_MEMORY, and now you're overwriting it again. At least one of these two initialisations is unnecessary, and quite possibly both of them.
On Wed, Mar 23, 2016 at 6:50 PM, Henri Verbeet hverbeet@gmail.com wrote:
What happens when d3drm_texture_load() is called multiple times on the same texture? Should that be possible?
On windows, IDirect3DRMTexture*::Load creates a new texture every time you call it on the same texture pointer, effectively losing the old one and creating a leak. The refcount also increases by one every time a new texture is created. So if windows allows this, I guess it's fine. Do let me know if I've missed testing something here though.
HeapFree(GetProcessHeap(), 0, texture->image->palette);
HeapFree(GetProcessHeap(), 0, texture->image);
This seems a bit suspicious. Supposedly the "image" argument to d3drm3_CreateTexture() can be used to create a texture on an existing image. The documentation isn't very explicit about it, but suggests that the image data isn't copied in that case. So I'm not sure the texture is supposed to be responsible for destroying the image. In general I think it's not entirely clear how things like CreateTexture(), LoadTexture() and InitFromFile() etc. are supposed to interact.
AFAICS on the trace logs of the d3drm games run so far, I haven't encountered CreateTexture specifically. Though you can find usage of CreateTexture in the "Trans" demo from the DX7 SDK, within trans.cpp. After a quick glance I can confirm that it does indeed create a texture on an existing image in memory. The implementation for this should be fairly trivial, but some tests would be needed to check whether it does a shallow/deep copy of the image struct (as you mentioned), what validation checks it does, if any, and its refcounting behavior.
I'm not sure about the Init methods, but so far I haven't been able to see them being used anywhere.
I've initially decided to prioritize on functions that are actually used by games rather than implementing functions that are probably not used at all in the d3drm apps that ran on wine so far. But I don't mind doing them if they're required somehow, or if a bug reports it.
- size = GetFileSize(hfile, NULL);
- if (size == INVALID_FILE_SIZE)
- {
DeleteObject(hfile);
return D3DRMERR_BADVALUE;
- }
You never use "size" after this, but you probably should. I also think you meant CloseHandle() instead of DeleteObject().
Right, I could use it to check if the size matches the size specified in the bfSize header field within BITMAPFILEHEADER. Or maybe there's another use besides this?
+D3DRMIMAGE *d3drm_create_image(unsigned char *buffer, BITMAPINFO *info, BOOL palette, BOOL upside_down)
+{
- D3DRMPALETTEENTRY *colors = (D3DRMPALETTEENTRY *)HeapAlloc(GetProcessHeap(), HEAP_ZERO_MEMORY, 257 * sizeof(*colors));
Is "257" there intentional? The cast is unnecessary. HeapAlloc() can fail. Do you need HEAP_ZERO_MEMORY here?
I kept it as 257 so that accessing colors[256] shouldn't segfault. The approach I used here is to check if the total number of unique colors is greater than 256, which happens when the 257th unique color is also stored into the array.
I guess statically allocating it on the stack, i.e. declaring D3DRMPALETTEENTRY colors[257];, could possibly help avoid the HeapAlloc failures. Makes sense since the 257 is hard-coded anyway. Should I do that? Or is the HeapAlloc a better approach?
As for HEAP_ZERO_MEMORY, see below.
This is unsafe. Consider e.g. what happens when the header gives you 0x10000 for width and height.
You're right. d3drm returns D3DRMERR_BADVALUE for textures with width and height that large. I'll add a test accordingly. I'll also be adding validation checks for bitmap format from the header, as you mentioned.
- buffer1 = image->buffer1;
- memset(buffer1, 255, sizeof(unsigned char) * bpl * h);
You just initialised the memory with HEAP_ZERO_MEMORY, and now you're overwriting it again. At least one of these two initialisations is unnecessary, and quite possibly both of them.
I actually intended to initialize it to 0xFF rather than zero. But you're right about the allocations being unnecessary. I'll remove both initializations and keep the default initialization to 0xFF, as the byte padding in 32bpp bitmaps is all 0xFF.
I also agree with the rest of the changes and will submit another patch very soon.
Thanks for the detailed review! Really appreciate it.
Cheers, Aaryaman
On 23 March 2016 at 20:05, Aaryaman Vasishta jem456.vasishta@gmail.com wrote:
On windows, IDirect3DRMTexture*::Load creates a new texture every time you call it on the same texture pointer, effectively losing the old one and creating a leak. The refcount also increases by one every time a new texture is created. So if windows allows this, I guess it's fine. Do let me know if I've missed testing something here though.
So the image can't be replaced after it has been set.
AFAICS on the trace logs of the d3drm games run so far, I haven't encountered CreateTexture specifically. Though you can find usage of CreateTexture in the "Trans" demo from the DX7 SDK, within trans.cpp. After a quick glance I can confirm that it does indeed create a texture on an existing image in memory. The implementation for this should be fairly trivial, but some tests would be needed to check whether it does a shallow/deep copy of the image struct (as you mentioned), what validation checks it does, if any, and its refcounting behavior.
I'm not sure about the Init methods, but so far I haven't been able to see them being used anywhere.
I've initially decided to prioritize on functions that are actually used by games rather than implementing functions that are probably not used at all in the d3drm apps that ran on wine so far. But I don't mind doing them if they're required somehow, or if a bug reports it.
Sure, but it's useful to have some idea what those functions are supposed to do. Right now, I think it looks like you should be able to implement LoadTexture() by creating a texture and then calling InitFromFile() on it. And related to above, InitFromFile() should then probably check that the texture hasn't been initialised before.
Right, I could use it to check if the size matches the size specified in the bfSize header field within BITMAPFILEHEADER. Or maybe there's another use besides this?
Mostly to make sure you don't read past the end of the buffer.
+D3DRMIMAGE *d3drm_create_image(unsigned char *buffer, BITMAPINFO *info, BOOL palette, BOOL upside_down)
+{
- D3DRMPALETTEENTRY *colors = (D3DRMPALETTEENTRY *)HeapAlloc(GetProcessHeap(), HEAP_ZERO_MEMORY, 257 * sizeof(*colors));
Is "257" there intentional? The cast is unnecessary. HeapAlloc() can fail. Do you need HEAP_ZERO_MEMORY here?
I kept it as 257 so that accessing colors[256] shouldn't segfault. The approach I used here is to check if the total number of unique colors is greater than 256, which happens when the 257th unique color is also stored into the array.
It's probably better to just avoid writing it.
I guess statically allocating it on the stack, i.e. declaring D3DRMPALETTEENTRY colors[257];, could possibly help avoid the HeapAlloc failures. Makes sense since the 257 is hard-coded anyway. Should I do that? Or is the HeapAlloc a better approach?
HeapAlloc() is fine, you just need to handle the failure.
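One way to sketch "just avoid writing it": cap the palette at 256 entries and report failure as soon as a 257th distinct colour shows up, never touching memory past entry 255. The types and function name below are toys, not the d3drm ones:

```c
#include <assert.h>

struct rgb { unsigned char r, g, b; };

/* Collect unique colours into a fixed 256-entry palette. Returns the
 * palette size, or -1 if the image needs more than 256 colours. */
static int build_palette(const struct rgb *pixels, unsigned int count, struct rgb palette[256])
{
    unsigned int i, j, num_colors = 0;

    for (i = 0; i < count; ++i)
    {
        for (j = 0; j < num_colors; ++j)
            if (palette[j].r == pixels[i].r && palette[j].g == pixels[i].g
                    && palette[j].b == pixels[i].b)
                break;
        if (j < num_colors)
            continue; /* already in the palette */
        if (num_colors == 256)
            return -1; /* would be the 257th colour: don't store it */
        palette[num_colors++] = pixels[i];
    }
    return (int)num_colors;
}
```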
On Fri, Mar 25, 2016 at 1:12 AM, Henri Verbeet hverbeet@gmail.com wrote:
Sure, but it's useful to have some idea what those functions are supposed to do. Right now, I think it looks like you should be able to implement LoadTexture() by creating a texture and then calling InitFromFile() on it. And related to above, InitFromFile() should then probably check that the texture hasn't been initialised before.
I wrote a quick test, and it seems that InitFromFile always returns D3DRMERR_BADOBJECT, even after passing it an IDirect3DRMTexture created via CreateTexture. It even fails on version 3 (my suspicion was that maybe version 1 and 2 were deprecated and version 3 works, but that doesn't seem the case). I ran them on my windows machine, but I can send a diff to the testbot if you want.
CreateTexture will also only work when it's fed a complete and valid image struct. Passing NULL to buffer1 made it fail again. I can confirm that CreateTexture only keeps a pointer copy of the application-created image struct, instead of deep copying its contents into separate memory.
The docs have also been pretty vague about the type of file that InitFromFile wants, describing the filename parameter as "A string that specifies the file from which the texture initialization information is drawn". I suspect that it might be some proprietary format specifying the contents of an image struct, along with the buffer arrays. But I could be wrong. For now I'll include all your other suggestions and leave this for further discussion, while I submit the improved patch.
Thanks for the review and clarifications!
Cheers, Aaryaman
On 25 March 2016 at 22:20, Aaryaman Vasishta jem456.vasishta@gmail.com wrote:
I wrote a quick test, and it seems that InitFromFile always returns D3DRMERR_BADOBJECT, even after passing it an IDirect3DRMTexture created via CreateTexture.
How are you creating the texture? The interesting cases are probably CoCreateInstance() and IDirect3DRM_CreateObject().
For now I used CreateTexture (as the documentation suggested). Another way is CreateTextureFromSurface, but that didn't work either (after passing it an off-screen plain surface).
I called IDirect3DRM3_CreateObject(d3drm3, &CLSID_CDirect3DRMTexture, NULL, &IID_IDirect3DRMTexture3, &texture3), which works, then calling InitFromFile on texture3 also returns successfully. It seems that if the image struct is not set within the object, only then InitFromFile works (as InitFromFile didn't work when called on a texture created from LoadTexture or CreateTexture, both of which initialize the image struct member).
What can probably be done is: make InitFromFile call d3drm_create_image after checking that the D3DRMIMAGE struct isn't set already; LoadTexture will call InitFromFile (on a fresh new object every time LoadTexture is called), whereas CreateTexture will merely HeapAlloc the object and assign the image supplied by the application.
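This proposed control flow can be modelled in a few lines. The names and error codes below are toys; only the observed behaviour (a second Init fails with D3DRMERR_BADOBJECT, CreateTexture keeps a pointer copy) is taken from the thread:

```c
#include <assert.h>
#include <stddef.h>
#include <stdlib.h>

#define TOY_OK         0
#define TOY_BADOBJECT -1

struct toy_image   { int dummy; };
struct toy_texture { struct toy_image *image; int owns_image; };

/* Init only works on an uninitialised texture, matching what the tests
 * observed on native. */
static int toy_init_from_file(struct toy_texture *tex, const char *path)
{
    if (tex->image) /* already initialised: native refuses */
        return TOY_BADOBJECT;
    (void)path; /* a real implementation would run the BMP loader here */
    if (!(tex->image = calloc(1, sizeof(*tex->image))))
        return -2; /* stand-in for D3DRMERR_BADALLOC */
    tex->owns_image = 1;
    return TOY_OK;
}

/* CreateTexture just adopts the caller's image pointer; no deep copy. */
static int toy_create_texture(struct toy_image *image, struct toy_texture *tex)
{
    tex->image = image;
    tex->owns_image = 0;
    return TOY_OK;
}
```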
On 26 March 2016 at 10:19, Aaryaman Vasishta jem456.vasishta@gmail.com wrote:
What can probably be done is, making InitFromFile call d3drm_create_image, checking if D3DRMIMAGE struct isn't set already, and LoadTexture will call InitFromFile (on a fresh new object every time LoadTexture is called), whereas CreateTexture will merely HeapAlloc the object and assign the image supplied by the application.
CreateTexture() should probably call InitFromImage(), or some equivalent, although it's odd that that only exists since IDirect3DRMTexture2. It would also be interesting to know if CoCreateObject() works, mainly because you can't pass the IDirect3DRM interface to the texture in that case.
By CoCreateObject(), did you mean CoCreateInstance()? I couldn't find CoCreateObject in the documentation or MSDN.
CoCreateInstance(&CLSID_CDirect3DRMTexture, NULL, CLSCTX_INPROC_SERVER, &IID_IDirect3DRMTexture3, &texture3) returns REGDB_E_CLASSNOTREG. Am I missing something?
On 26 March 2016 at 12:48, Aaryaman Vasishta jem456.vasishta@gmail.com wrote:
By CoCreateObject(), did you mean CoCreateInstance()? I couldn't find CoCreateObject in the documentation or MSDN.
Right, CoCreateInstance().
CoCreateInstance(&CLSID_CDirect3DRMTexture, NULL, CLSCTX_INPROC_SERVER, &IID_IDirect3DRMTexture3, &texture3) returns REGDB_E_CLASSNOTREG. Am I missing something?
Does it work for IDirect3DRM? It may not be supposed to work, although it's also possible d3drm.dll is just not properly registered.
If you copied d3drm.dll from XP you'll most likely have to register it with regsvr32.
"regsvr32.exe d3drm.dll" returns an error saying DllRegisterServer was not found.
I compiled the test below and ran it on an XP VM (which had the dx7 SDK installed) just to be sure that my native win8.1 box wasn't the issue:
CoInitialize(NULL);
hr = CoCreateInstance(&CLSID_CDirect3DRM, NULL, CLSCTX_INPROC_HANDLER, &IID_IDirect3DRM, &d3drm);
ok(hr == D3DRM_OK, "Got %#x for IID_IDirect3DRM\n", hr);
hr = CoCreateInstance(&CLSID_CDirect3DRMTexture, NULL, CLSCTX_INPROC_HANDLER, &IID_IDirect3DRMTexture3, &texture3);
ok(hr == D3DRM_OK, "Got %#x for IID_IDirect3DRMTexture3\n", hr);
Got the following results:
d3drm.c:4309: Test failed: Got 0x80040154 for IID_IDirect3DRM
d3drm.c:4311: Test failed: Got 0x80040154 for IID_IDirect3DRMTexture3
Some notes:
I played around with IDirect3DRM::CreateObject a bit, and it also returns REGDB_E_CLASSNOTREG if the proper CLSID is not set. Another interesting note is that passing the same CLSID that worked for CreateObject to CoCreateInstance makes it fail. The SDK setup does in fact add the CLSIDs to the registry - it seems that the setup reads d3drm.inf, creates a couple of .reg files and runs them to create the registry entries. The only d3drm-related registry entry the setup adds is "Direct3DRMObject", whose value is {4516EC41-8F20-11d0-9B6D-0000C0781BC3}, which matches that of CLSID_CDirect3DRM. I installed this SDK on my XP VM, yet the tests fail, hmm.
I played around with different CLSCTX values as well - no luck so far.
Cheers, Aaryaman
Err, it's "Direct3DRM Object", instead of "Direct3DRMObject", which I wrote by mistake.
What should be the max resolution for the texture? I'm getting D3DRMERR_BADALLOC if the resolution is >= 13K x 13K @ 24bpp on my win8.1 machine with 16GB RAM. Maybe the max resolution depends on the system? I think setting a max resolution of 4K should be fine, I guess.
Cheers, Aaryaman
On 2016-03-27 at 14:17, Aaryaman Vasishta wrote:
What should be the max. resolution for the texture? I'm getting D3DRMERR_BADALLOC if the resolution is >= 13K x 13K @ 24bpp on my win8.1 machine with 16GB RAM. Maybe the max resolution might depend on the system? I think setting a max resolution of 4K should be fine, I guess.
Probably the max d3d texture size?
Someone sent tests a long time ago that showed that the max ddraw surface size is 65535 for sysmem (and presumably HW if the HW limit is beyond this in d3d8+).
At some point you'll hit the 2G address space limit. There's also a chance that even if enough address space is free it is fragmented too much to support your texture.
The max resolution I tried which passed the tests was greater than 65535, hmm. Am I missing something?
On 27 March 2016 at 20:23, Aaryaman Vasishta jem456.vasishta@gmail.com wrote:
The max resolution I tried which passed the tests was greater than 65535, hmm. Am I missing something?
What were the dimensions you tried that succeeded, exactly? 13k x 13k @ 24bpp would mean an allocation of about 640 MiB, which is a considerable part of the 2 GiB address space you get (by default) as a 32-bit process. It's perhaps not quite large enough that it's guaranteed to fail, but I don't think it's all that surprising if there's no contiguous allocation of that size available.
It's not so clear if d3drm is concerned with hardware limitations at all. The more immediate/obvious issue is that if you're storing the image size in a 32-bit unsigned integer, the maximum size a square texture can have is 0xffff x 0xffff, since 0x10000 squared is 0x100000000. That's for an 8 bpp format; for 32 bpp it goes down to 0x7fff x 0x7fff. On the other hand, a non-square 32 bpp texture could e.g. be 0x3fffffff x 0x1. I.e., your immediate concern is that you don't overflow UINT_MAX when calculating the allocation size for the image, and there's a good chance the rest will follow naturally from that.
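That arithmetic is easy to check mechanically. The guard below uses chained divisions so no intermediate product can overflow (a sketch of the check, not d3drm's actual code); the test cases reproduce the figures from the paragraph above:

```c
#include <assert.h>
#include <stdint.h>

/* Does w * h * bytes_pp fit in a 32-bit unsigned allocation size?
 * w <= MAX / h / bpp is exactly equivalent to w * h * bpp <= MAX,
 * because floor(floor(MAX / h) / bpp) == floor(MAX / (h * bpp)). */
static int fits_in_u32(uint32_t w, uint32_t h, uint32_t bytes_pp)
{
    if (!w || !h || !bytes_pp)
        return 0;
    return w <= UINT32_MAX / h / bytes_pp;
}
```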
Not being able to instantiate textures with CoCreateInstance() makes sense, but it's good to verify. Not being able to instantiate IDirect3DRM objects is more surprising, particularly since the documentation explicitly mentions it as something that's supposed to work. It may make a difference to use IID_IDirect3DRM2 or IID_IDirect3DRM3 instead of IID_IDirect3DRM, I wouldn't put it past d3drm to be picky like that. It might also be worth a try to call DllGetClassObject() directly, to get a better idea of what works and what doesn't. At this point all that is mostly just curiosity though.
The main conclusion so far is that I think we'll want to implement file loading in InitFromFile(). While I think using d3drm_texture_create() in LoadTexture() makes sense, implementing CreateObject() should help for writing tests for InitFromFile(). InitFromImage() looks like it should be fairly trivial to implement, maybe it makes sense to implement that before InitFromFile().
InitFrom* tests would require CreateObject to work, which I am thinking of implementing as a separate series of patches. For now I'm thinking of implementing InitFromFile for use by LoadTexture. Will that be okay? I could add todo_wine tests for InitFromFile, and once CreateObject is implemented, the todos can be removed.
Cheers, Aaryaman
On 28 March 2016 at 17:56, Aaryaman Vasishta jem456.vasishta@gmail.com wrote:
InitFrom* tests would require CreateObject to work, which I am thinking of implementing as a separate series of patches. For now I'm thinking of implementing InitFromFile for use by LoadTexture. Will that be okay? I could add todo_wine tests for InitFromFile, and once CreateObject is implemented, the todos can be removed.
It might be ok. I'm not sure it's all that much easier than just implementing CreateObject() first though. It would also mean the Wine implementation is more or less untested until CreateObject() is implemented. The main reason I can think of for implementing InitFromFile() first is that it would allow you to send the code you've already written a little bit sooner, but I don't think that's a very convincing argument.
I sent a patch with InitFromFile implemented first. I'll keep it aside and implement CreateObject first if that's what you want. Once you're okay with the CreateObject patches, you can sign off on the LoadTexture patches later on (after they're somehow integrated with the CreateObject patches below them).
Cheers, Aaryaman
On Mon, Mar 28, 2016 at 5:34 AM, Henri Verbeet hverbeet@gmail.com wrote:
What were the dimensions you tried that succeeded, exactly? 13k x 13k @ 24bpp would mean an allocation of about 640 MiB, which is a considerable part of the 2 GiB address space you get (by default) as a 32-bit process. It's perhaps not quite large enough that it's guaranteed to fail, but I don't think it's all that surprising if there is no contiguous allocation of that size available.
I used a trial-and-error approach to find up to which resolution I could load a texture (around 12K x 12K worked), which, as it turns out, is not the correct way. That value depends on the hardware capabilities and fragmentation state, which isn't generic enough to be considered a hard limit on the bitmap size.
Thanks for explaining about UINT_MAX for the buffer allocation! I'll include this as a check before loading the bitmap.
Cheers, Aaryaman