On Fri Mar 8 17:52:04 2024 +0000, Stefan Dösinger wrote:
The test passes on the HW device of my much newer Radeon Polaris card on Windows 11. So far I have not managed to get my Radeon 9000 from 2002 to produce any pick results. On a quick test run, Lego Island seems happy if IDirect3DDevice::GetPickRecords sets *count = 0; that stops it from essentially malloc'ing a random amount of memory. But I am not entirely certain; the menu of this game is confusing, and I was (intellectually) unable to start a game even on Windows... I am downloading FIGHTING to see whether it works on my old machine and whether it uses a HW device or just the RGB device. Judging by the screenshots, it might just use software rendering.
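For context, a caller that uses pick records typically asks for the count first and then allocates room for that many records, roughly like the sketch below (this is the usual two-call pattern, not Lego Island's actual code; the count-only query through a NULL record pointer and the helper name are my assumptions). If GetPickRecords never writes *count, the malloc() size is whatever garbage happens to be in that variable, hence the "random amount of memory" above:

```c
#define COBJMACROS
#include <stdlib.h>
#include <d3d.h>

/* Usual two-call pattern for IDirect3DDevice::GetPickRecords: query the
 * record count, allocate, then fetch the records. If the first call leaves
 * *count untouched, malloc() gets an uninitialized size. */
static D3DPICKRECORD *get_picks(IDirect3DDevice *device, DWORD *count)
{
    D3DPICKRECORD *records;

    IDirect3DDevice_GetPickRecords(device, count, NULL);
    if (!*count)
        return NULL;
    records = malloc(*count * sizeof(*records));
    IDirect3DDevice_GetPickRecords(device, count, records);
    return records;
}
```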
Ok, I got pick results out of my Radeon 9000. Let's just say, it is a bit ... picky.
What I had to do was call BeginScene(); Execute(); EndScene(); before calling Pick. But don't do it from the same stack frame, or it fails. And don't even think about having some zeroes on the stack. (Same stack frame: if I move the Execute call into get_pickrecords_(), picking fails again.)
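For reference, this is roughly the shape that worked, assuming device, execute_buffer and viewport are set up as in the existing test (helper names and the pick coordinates are illustrative): the Execute() happens in its own stack frame, and Pick() runs in a different one.

```c
#define COBJMACROS
#include <d3d.h>

/* Draw once, in a dedicated stack frame. */
static void draw_scene(IDirect3DDevice *device,
        IDirect3DExecuteBuffer *execute_buffer, IDirect3DViewport *viewport)
{
    IDirect3DDevice_BeginScene(device);
    IDirect3DDevice_Execute(device, execute_buffer, viewport, 0);
    IDirect3DDevice_EndScene(device);
}

/* Pick a 1x1 rectangle and return the number of pick records. */
static DWORD pick_at(IDirect3DDevice *device,
        IDirect3DExecuteBuffer *execute_buffer, IDirect3DViewport *viewport,
        LONG x, LONG y)
{
    D3DRECT rect;
    DWORD count = 0;

    rect.x1 = x;
    rect.y1 = y;
    rect.x2 = x + 1;
    rect.y2 = y + 1;
    IDirect3DDevice_Pick(device, execute_buffer, viewport, 0, &rect);
    IDirect3DDevice_GetPickRecords(device, &count, NULL);
    return count;
}

static DWORD draw_and_pick(IDirect3DDevice *device,
        IDirect3DExecuteBuffer *execute_buffer, IDirect3DViewport *viewport)
{
    /* Drawing here, in a frame separate from the Pick() call inside
     * pick_at(), is what produced pick records on the Radeon 9000.
     * Moving the Execute() into the picking helper broke it again. */
    draw_scene(device, execute_buffer, viewport);
    return pick_at(device, execute_buffer, viewport, 320, 240);
}
```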
Lego Island works on this GPU though, with the HW device. I can select the Lego plants at least, although clicking them is unreliable. Big thanks to itsmattkc for the video in bug 10729, comment 17. I would never have thought of dragging and dropping the photo onto the track, or listening to the Lego dude blather forever.
Long story short? I guess this is yet another weirdo bug in my 20-year-old GPU driver, in a feature that was already outdated and rarely used back then. We should probably accept zero pick results as broken in a way that doesn't pollute the test too much.
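In the test, that could look something like the line below (a sketch; the expected count and the message are whatever the existing check uses):

```c
/* Accept old drivers returning no pick records, without letting Wine get away with it. */
ok(count == 1 || broken(!count), "Got unexpected pick record count %lu.\n", count);
```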
I attached a diff that made parts of the test work: [urk.diff](/uploads/749fcb673c6a758f7534046413d5e4c7/urk.diff)