  ok(NetApiBufferSize(p, &dwSize) == NERR_Success, "Got size");
- ok(dwSize >= 0, "The size");
+ ok(dwSize < 0x80000000, "The size");
[...]
  ok(NetApiBufferAllocate(0, (LPVOID *)&p) == NERR_Success, "Reserved memory");
  ok(NetApiBufferSize(p, &dwSize) == NERR_Success, "Got size");
- ok((dwSize >= 0) && (dwSize < 0xFFFFFFFF),"The size of the 0-length buffer");
+ ok(dwSize < 0xFFFFFFFF, "The size of the 0-length buffer");
I think this test does not make sense. Why would 0xfffffffe be considered valid and not 0xffffffff? The truth is we have no idea what value to expect from a call to NetApiBufferAllocate(0,...) and thus there is nothing to test.
The < 0x80000000 test just above seems equally arbitrary (it is merely a better approximation of "if this were a signed int, would it be positive?").
So I vote to simply remove both tests or, for the first case, to actually find out what value we should expect (though I doubt it is documented).
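For illustration, a minimal sketch of what the zero-length case could look like with the value check dropped entirely (assuming the same p/dwSize variables and ok() macro as in the quoted hunks; the NetApiBufferFree cleanup is assumed, not taken from the patch):

  ok(NetApiBufferAllocate(0, (LPVOID *)&p) == NERR_Success, "Reserved memory");
  ok(NetApiBufferSize(p, &dwSize) == NERR_Success, "Got size");
  /* dwSize is deliberately not checked: the size reported for a 0-byte
     allocation is undocumented, so there is no meaningful value to assert */
  NetApiBufferFree(p);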