https://bugs.winehq.org/show_bug.cgi?id=42324
--- Comment #4 from Ken Thomases <ken@codeweavers.com> ---
At the point where the Mac driver initializes OpenGL, queries its extensions, and enumerates its pixel formats, it uses the primary display. However, when it creates an OpenGL context for rendering, it doesn't tie it to any specific display[*]. That should result in a context which automatically switches to the appropriate GPU as the window is moved.
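For illustration, the difference looks roughly like this at the CGL level (a simplified sketch, not the driver's actual code; the attribute choices here are mine):

/* Illustrative only.  A CGL sketch of the difference between a pixel format
 * pinned to the GPU driving the primary display and one left free to follow
 * the window between GPUs. */
#include <OpenGL/OpenGL.h>
#include <ApplicationServices/ApplicationServices.h>

static CGLContextObj create_context(int pin_to_main_display)
{
    CGLPixelFormatAttribute attrs[8];
    int n = 0;

    if (pin_to_main_display)
    {
        /* Restrict the pixel format to the renderer driving the primary display. */
        attrs[n++] = kCGLPFADisplayMask;
        attrs[n++] = (CGLPixelFormatAttribute)
                     CGDisplayIDToOpenGLDisplayMask(CGMainDisplayID());
    }
    else
    {
        /* No display mask: the pixel format covers every capable renderer, so
         * the context gets one virtual screen per GPU and can follow the window
         * as it moves between displays.  Allowing offline renderers also covers
         * GPUs not currently driving any display. */
        attrs[n++] = kCGLPFAAllowOfflineRenderers;
    }
    attrs[n++] = kCGLPFADoubleBuffer;
    attrs[n++] = (CGLPixelFormatAttribute)0;

    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError || !pix)
        return NULL;

    CGLContextObj ctx = NULL;
    CGLCreateContext(pix, NULL, &ctx);
    CGLReleasePixelFormat(pix);
    return ctx;
}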
In Apple's terminology, the context changes which "virtual screen" is active for the context. In theory, a Mac app should note when the context's virtual screen changes, re-query the GL extensions and limits, and reconfigure its rendering pipeline accordingly. However, Windows OpenGL apps are not prepared to do that. In theory, D3D apps should cope if WineD3D told them that the device was "lost", but the Mac driver doesn't have a good way to tell WineD3D that and, to my knowledge, WineD3D doesn't currently support marking devices as lost.
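To make that concrete, here is roughly what a well-behaved native Mac app does (a sketch only, assuming the context is current when this runs, e.g. in its draw loop; nothing equivalent exists in the Windows apps we run):

/* Sketch of what a native Mac app is expected to do: notice that the
 * context's virtual screen changed and re-query the renderer's capabilities. */
#include <OpenGL/OpenGL.h>
#include <OpenGL/gl.h>
#include <stdio.h>

static GLint last_virtual_screen = -1;

static void check_for_gpu_switch(CGLContextObj ctx)
{
    GLint vs;
    if (CGLGetVirtualScreen(ctx, &vs) != kCGLNoError)
        return;

    if (vs != last_virtual_screen)
    {
        last_virtual_screen = vs;
        /* The context is now backed by a different renderer; its limits and
         * extensions may have changed, so they have to be re-queried. */
        printf("now rendering on: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL version:       %s\n", (const char *)glGetString(GL_VERSION));
        /* ...re-check GL_MAX_TEXTURE_SIZE, the extension string, etc., and
         * rebuild any renderer-dependent state. */
    }
}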
What are the actual symptoms you're seeing? Is it just that the benchmarks are reporting the name of the GPU as "Intel HD4000"? Or are they actually running *really* slowly (more slowly, even, than when the window is on the internal display)? Also, what precisely do you mean when you say this "basically breaks games and other GPU heavy windows programs"? How are they broken?
Also, what happens if you change the primary display to the external? In System Preferences > Displays > Arrangement, drag the representation of the menu bar from the internal display to the external.
[*] In theory, the caller could request that the context be tied to a specific renderer by including the WGL_RENDERER_ID_WINE attribute in the attribute list passed to wglCreateContextAttribsARB(). However, WGL_RENDERER_ID_WINE is a Wine-specific attribute that no caller other than WineD3D would know to use, and WineD3D does not actually use it.
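For completeness, a hypothetical caller would have to do something like the sketch below. This is not code that any real application (or WineD3D) contains; the WGL_RENDERER_ID_WINE value shown mirrors GLX_MESA_query_renderer and should be confirmed against Wine's headers, and the renderer ID itself would first have to be enumerated via the other WGL_WINE_query_renderer queries.

/* Hypothetical caller-side sketch: pinning a context to a particular renderer
 * through Wine's WGL_WINE_query_renderer extension.  Shown only to illustrate
 * the footnote above. */
#include <windows.h>
#include <GL/gl.h>

#define WGL_RENDERER_ID_WINE 0x818E  /* assumed value; confirm against Wine's headers */

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int *);

static HGLRC create_context_on_renderer(HDC dc, int renderer_id)
{
    /* wglGetProcAddress only works with some GL context already current. */
    PFNWGLCREATECONTEXTATTRIBSARBPROC pwglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");
    if (!pwglCreateContextAttribsARB) return NULL;

    const int attribs[] =
    {
        WGL_RENDERER_ID_WINE, renderer_id,  /* tie the new context to this renderer */
        0
    };
    return pwglCreateContextAttribsARB(dc, NULL, attribs);
}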