Jacek Caban (@jacek) commented about dlls/urlmon/urlmon_main.c:
> +    if (dwReserved || !szURL) return E_INVALIDARG;
> -    return S_OK;
> +    const size_t len = wcslen(szURL);
> +    if (len >= MAX_URL_SIZE)
> +        return S_FALSE;
> +    if (StartsWithProtocol(szURL, len, L"https") ||
> +        StartsWithProtocol(szURL, len, L"http") ||
> +        StartsWithProtocol(szURL, len, L"ftp") ||
> +        StartsWithProtocol(szURL, len, L"file") ||
> +        StartsWithProtocol(szURL, len, L"mailto") ||
> +        StartsWithProtocol(szURL, len, L"mk") ||
I think we should use pluggable protocols to try to parse the URL, rather than hard-coding a scheme list. See the attached [test hack](/uploads/ef350910b23dc3455576e5324d63f35f/test.diff): it fails on Windows, and the failures show how it's supposed to work. Windows queries the IInternetProtocolInfo interface of the protocol handler and calls ParseUrl(PARSE_CANONICALIZE) on it. Maybe we could just use `CoInternetParseUrl` for that and check whether it fails; I'm not sure without more tests.
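
Roughly something like this, as an untested sketch of the `CoInternetParseUrl` variant (the helper name and buffer size are arbitrary, and the exact behaviour would need tests against native):

```c
/* Untested sketch of the CoInternetParseUrl idea, not the final patch. */
#include <windows.h>
#include <urlmon.h>

static HRESULT is_valid_url_via_parse(LPCWSTR szURL)
{
    WCHAR canonical[2048]; /* arbitrary size for the sketch */
    DWORD len;
    HRESULT hres;

    /* PARSE_CANONICALIZE should be routed to the protocol handler's
     * IInternetProtocolInfo::ParseUrl when the scheme registers one,
     * so the assumption (to be confirmed by tests) is that URLs native
     * considers invalid make this call fail. */
    hres = CoInternetParseUrl(szURL, PARSE_CANONICALIZE, 0, canonical,
                              sizeof(canonical) / sizeof(canonical[0]),
                              &len, 0);
    return SUCCEEDED(hres) ? S_OK : S_FALSE;
}
```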