http://bugs.winehq.org/show_bug.cgi?id=22064
--- Comment #49 from Mikko Rasa <tdb@tdb.fi> 2010-08-29 09:12:21 CDT ---
Created an attachment (id=30464)
 --> (http://bugs.winehq.org/attachment.cgi?id=30464)
A program that emulates UGL connecting sequence
Using +secur32 output and MSDN, I made a program that emulates the connecting sequence of the Ubisoft Game Launcher (a sketch of the call sequence is below, after the outputs). Output in Wine:
InitializeSecurityContext status 590610 Attributes 0
Output buffer contains 93 bytes
Received 1282 bytes
InitializeSecurityContext status 590610 Attributes 0
Output buffer contains 310 bytes
Received 43 bytes
InitializeSecurityContext status 0
Received 26 bytes
DecryptMessage status 0
Decrypted data (5 bytes): 00 00 00 00 30
Output in Windows:
AcquireCredentialsHandle status 0
InitializeSecurityContext status 590610 Attributes c01c
Output buffer contains 77 bytes
Received 1282 bytes
InitializeSecurityContext status 590610 Attributes 801c
Output buffer contains 310 bytes
Received 43 bytes
InitializeSecurityContext status 0
Received 26 bytes
DecryptMessage status 0
Decrypted data (5 bytes): 17 03 01 00 15
There are several differences: the attributes reported by InitializeSecurityContext, the size of the initial packet and, most importantly, the actual data that is received. Obviously the launcher isn't going to be happy if it gets the wrong data from the server. Could we be using the wrong cipher or something?
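
For reference, the core of the sequence looks roughly like this. This is a minimal sketch reconstructed from the output above and the MSDN Schannel docs, not the attached program itself; sock is assumed to be an already connected TCP socket, the target name "example.com" and the credential flags are placeholders, and error handling, SECBUFFER_EXTRA handling and record reassembly are left out.

#define SECURITY_WIN32
#include <winsock2.h>
#include <windows.h>
#include <security.h>
#include <schannel.h>
#include <stdio.h>

static void connect_sequence(SOCKET sock)
{
    SCHANNEL_CRED cred = { SCHANNEL_CRED_VERSION };
    CredHandle cred_handle;
    CtxtHandle ctxt;
    SecBuffer out_buf = { 0, SECBUFFER_TOKEN, NULL };
    SecBufferDesc out_desc = { SECBUFFER_VERSION, 1, &out_buf };
    SecBuffer in_bufs[2], dec_bufs[4];
    SecBufferDesc in_desc = { SECBUFFER_VERSION, 2, in_bufs };
    SecBufferDesc dec_desc = { SECBUFFER_VERSION, 4, dec_bufs };
    ULONG attrs, flags = ISC_REQ_ALLOCATE_MEMORY | ISC_REQ_STREAM |
                         ISC_REQ_CONFIDENTIALITY | ISC_REQ_REPLAY_DETECT |
                         ISC_REQ_SEQUENCE_DETECT;
    SECURITY_STATUS status;
    char buf[8192];
    int len, i;

    cred.dwFlags = SCH_CRED_NO_DEFAULT_CREDS | SCH_CRED_MANUAL_CRED_VALIDATION;
    status = AcquireCredentialsHandleA(NULL, (SEC_CHAR *)UNISP_NAME_A,
                                       SECPKG_CRED_OUTBOUND, NULL, &cred,
                                       NULL, NULL, &cred_handle, NULL);
    printf("AcquireCredentialsHandle status %ld\n", status);

    /* First call has no input and produces the ClientHello token. */
    status = InitializeSecurityContextA(&cred_handle, NULL, (SEC_CHAR *)"example.com",
                                        flags, 0, 0, NULL, 0, &ctxt, &out_desc,
                                        &attrs, NULL);
    while (status == SEC_I_CONTINUE_NEEDED)
    {
        printf("InitializeSecurityContext status %ld Attributes %lx\n", status, attrs);
        printf("Output buffer contains %lu bytes\n", out_buf.cbBuffer);
        send(sock, out_buf.pvBuffer, (int)out_buf.cbBuffer, 0);
        FreeContextBuffer(out_buf.pvBuffer);
        out_buf.pvBuffer = NULL;
        out_buf.cbBuffer = 0;

        len = recv(sock, buf, sizeof(buf), 0);
        printf("Received %d bytes\n", len);

        /* Feed the server's reply back in; the second buffer reports leftover data. */
        in_bufs[0].BufferType = SECBUFFER_TOKEN;
        in_bufs[0].pvBuffer = buf;
        in_bufs[0].cbBuffer = len;
        in_bufs[1].BufferType = SECBUFFER_EMPTY;
        in_bufs[1].pvBuffer = NULL;
        in_bufs[1].cbBuffer = 0;
        status = InitializeSecurityContextA(&cred_handle, &ctxt, NULL, flags, 0, 0,
                                            &in_desc, 0, NULL, &out_desc, &attrs, NULL);
    }
    /* A real client must also send any final token left in out_buf here. */
    printf("InitializeSecurityContext status %ld\n", status);

    /* Handshake done; receive one encrypted record and decrypt it. Schannel
       takes one DATA buffer plus three EMPTY ones and rewrites them as
       header/data/trailer on success. */
    len = recv(sock, buf, sizeof(buf), 0);
    printf("Received %d bytes\n", len);
    dec_bufs[0].BufferType = SECBUFFER_DATA;
    dec_bufs[0].pvBuffer = buf;
    dec_bufs[0].cbBuffer = len;
    for (i = 1; i < 4; i++)
    {
        dec_bufs[i].BufferType = SECBUFFER_EMPTY;
        dec_bufs[i].pvBuffer = NULL;
        dec_bufs[i].cbBuffer = 0;
    }
    status = DecryptMessage(&ctxt, &dec_desc, 0, NULL);
    printf("DecryptMessage status %ld\n", status);
    for (i = 0; i < 4; i++)
        if (dec_bufs[i].BufferType == SECBUFFER_DATA)
            printf("Decrypted data (%lu bytes)\n", dec_bufs[i].cbBuffer);
}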
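
As for the cipher question, one way to see what was actually negotiated would be to query the context after the handshake completes and compare the values between Wine and Windows. A sketch using the documented SECPKG_ATTR_CONNECTION_INFO attribute, where ctxt is the context handle from above:

    /* Print the negotiated protocol and algorithms; differing values here
       between Wine and Windows would confirm a cipher mismatch. */
    SecPkgContext_ConnectionInfo info;
    SECURITY_STATUS st = QueryContextAttributesA(&ctxt, SECPKG_ATTR_CONNECTION_INFO, &info);
    if (st == SEC_E_OK)
        printf("protocol %lx cipher %x strength %lu hash %x exchange %x\n",
               info.dwProtocol, info.aiCipher, info.dwCipherStrength,
               info.aiHash, info.aiExch);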