On Mon, Apr 19, 2010 at 1:44 PM, Stefan Dösinger <stefandoesinger@gmx.at> wrote:
On 18.04.2010 at 20:42, Roderick Colenbrander wrote:
Implementation details aside, it would be very useful if such a mechanism could take advantage of GPU monitoring APIs like the ones from NVIDIA PerfKit and GL_AMD_performance_monitor. This would also let us monitor what is holding up the GPU.
OS X also has good tools for this purpose: mainly the OpenGL Profiler, Driver Monitor (similar to NVIDIA PerfKit) and Shark (similar to OProfile).
I think code for those performance tools should be left out of wined3d. The Apple tools, and I believe NVIDIA PerfKit as well, are designed to be used from a separate process, not from the profiled process itself. I am not sure about GL_AMD_performance_monitor though; that one may be trickier.
In NVIDIA's case you can use tools from NVIDIA to enable certain performance counters. You can then profile using either an NVIDIA tool (or a tool like the Microsoft Performance Monitor) or gDEBugger. There is also an API you can use in your own code, which gives you some control; the API is essentially about accessing the performance counters.
The AMD GL extension is also about reading performance counters, but it really requires GL. I read somewhere that the drivers export around a hundred or more counters (and the names can differ per card). On Windows AMD has nice tools, but for use on Linux they also recommend gDEBugger.
D3D itself also has a profiling mechanism, PIX, which uses the D3DPERF_ APIs. NVIDIA PerfKit provides PIX support and it looks like it can query performance counters (I think that's what D3DPERF_BeginEvent is about), but typically these APIs are called by the game itself. I have seen Force Unleashed making these calls.
Roderick