Internet Explorer is not an HTTP Validator

Question:

I tried a second approach to porting client code away from WinInet: using managed C++ (.NET) instead of WinHttp.

After implementing the .NET managed client code...
HttpWebRequest^ myReq = dynamic_cast<HttpWebRequest^>(WebRequest::Create( strTargetURL ));
myReq->Method = "POST";
...

HttpWebResponse^ HttpWResp = dynamic_cast<HttpWebResponse^>(myReq->GetResponse());

Within the code, GetResponse() throws...

An unhandled exception of type 'System.Net.WebException' occurred in System.dll
Additional information: The server committed a protocol violation. Section=ResponseStatusLine

On the Windows 2003 R2 SP1 server, the ISAPI extension has been simplified to the following...

DWORD WINAPI HttpExtensionProc(EXTENSION_CONTROL_BLOCK *pECB)
{
    char  szPage[] = "We're good to go";
    DWORD dwPageSize = (DWORD) strlen(szPage);

    return pECB->WriteClient(pECB->ConnID, (LPVOID) szPage, &dwPageSize, 0)
        ? HSE_STATUS_SUCCESS : HSE_STATUS_ERROR;
}

Any help enabling more clients, whether WinHttp or .NET, to POST to the ISAPI extension would be appreciated.

Thanks,

Answer:

Actually, the problem you observe comes from your ISAPI Extension DLL. It is sending an invalid HTTP response (an entity body with no status line or headers), and the client APIs correctly complain. Let me explain.

Problem Restated

My understanding of your facts:

  1. You have an ActiveX DLL using WinInet which successfully POSTs to an ISAPI Extension
  2. You have a Windows Service which fails to POST to the exact same ISAPI Extension using either WinHttp or .NET Client

Based on that information, I hypothesize either:

  1. The ISAPI Extension or other Server Side ISAPI Filter is doing something wrong, but WinInet ignores or covers it up while WinHttp/.NET Client correctly fail
  2. The ISAPI Extension or other Server Side ISAPI Filter is doing something wrong, but IIS/ISAPI/Windows Networking stack does a special hack for WinInet but not others like WinHttp or .NET Client

Personally, I am more inclined to believe the former. The latter is simply way too convoluted, difficult, and fragile. I am not a Conspiracy Theorist - I believe in straightforward explanations for computer issues. :-)

Problem Confirmed

In this case, I am certain that #1 applies for several reasons:

  1. The simplified ISAPI Extension does not send a proper HTTP Response
  2. "WinInet accepts the response" does NOT mean "it is a proper HTTP response"

In other words, the ISAPI Extension is doing something bad, but WinInet (and IE, which uses WinInet) tries very hard to succeed and hence overlooks such errors.

<rant>

Yes, it gives a great user experience to see IE render all sorts of improper HTTP responses and HTML pages, but it also gives false user confidence in the correctness of such HTTP/HTML.

But I do not think the problem lies solely with Microsoft/IE; it is shared with the authors of such bad HTTP and HTML. Computer users expect things to magically "work", so someone has to make the broken things "less broken" and "work". This expectation results in a vicious cycle:

If IE refused to render the broken HTTP/HTML but some other browser did, then users think that IE is broken instead of blaming the incorrect web page. Since IE renders more broken HTTP/HTML, web page developers have less motivation to author correctly... and IE will be punished for refusing to render that future broken web page.

How wacky is that!?! Of course, Users have no idea that this is going on - they only see rendered pages and think everything is alright - when in fact the browser and web developers are slowly diverging from published specifications, increasing their maintenance costs, and causing headaches on the development side of things. And all because we are trying to shield the End User...

This downside is what hits you right now. In this case, you see a response from the ISAPI Extension when browsed with Internet Explorer or WinInet, so you probably think the ISAPI is perfectly simple and correct. Hence, you think that Microsoft has a bug somewhere that either causes WinHttp or .NET Client to not work with the ISAPI, or that there is some devious hack somewhere to favor WinInet. Bad Microsoft.

But in reality, it is the ISAPI Extension that is broken, and you were fooled by the tainted validations with Internet Explorer and WinInet.

</rant>

Troubleshoot with Trusted Tools

This is why I only use the following basic but trusted tools to debug HTTP Client/Server issues... because they have no alternative agenda to mislead anyone:

  • WFetch to make raw requests and observe raw responses
  • Network Monitor to tap the network and observe raw requests and responses
  • Native Code Debuggers to observe programmatic state inside a process

I simply do not trust debugging/troubleshooting with anything else. If I have to use something like IE/Firefox, I always treat its answer with a heavy grain of salt and not as Gospel.

Resolution

If you change the ISAPI code to send the following response, then I believe it should work for WinHttp and .NET Client as well as WinInet:

char szPage[] = "HTTP/1.1 200 OK\r\n"
                "Content-Type: text/html\r\n"
                "Content-Length: 16\r\n"
                "\r\n"
                "We're good to go";

Basically, your ISAPI only sent back "We're good to go" as an HTTP response, and that is improperly formatted. The fix makes the ISAPI send back a proper HTTP response, so the client APIs like WinHttp or .NET Client should just work.

Conclusion

Powerful APIs, like ISAPI Extension and ISAPI Filter, directly control the data stream to/from IIS. Thus, they can either positively augment IIS behavior or negatively manipulate IIS to misbehave.

In particular, they differ from programming environments like ASP, ASP.NET, or PHP, which remove some of that power to protect the developer from generating common HTTP mistakes. One just has to be aware of the guard-rails and training wheels.

Like many things, it is just a tradeoff that one needs to be aware of; nothing right or wrong.

//David

Comments

  • Anonymous
    August 18, 2006
    Hi David,

     What about Fiddler? I've found Fiddler a must-have tool for web developers. The "alert on protocol violations" feature helped me a lot to catch bad HTTP responses.
     I agree with you on the state of current web development, although I attribute it to other reasons. In my opinion, after the web's boom, the "development community" moved its attention to web development, but without learning how and why its protocols and standards were built.
     Today the term WEB is too tightly bound to UI, when in fact its strength resides in its simple but effective protocol for object access. WEB was read as an acronym for HTML, and web servers were seen as "front end" servers.
     With the rapid growth of web services, whose applications are not UI driven, the term WEB is again bound to the "HTTP protocol", and web servers are rapidly being perceived as "communication servers" or "service providers".
     Now, since you mention your favorite tools for debugging, I'll leave a suggestion: why not make a "top 10" tools list for HTTP developers?
     In my list, I'd swap "Network Monitor" for Wireshark and add Fiddler and NetCat.

    Cheers,

    Eric.

  • Anonymous
    August 18, 2006
    Hi David,

    Awfully nice of you to aid on this.

    It sure "felt" like the port was a move towards the less forgiving.

    I guess as a C/C++ programmer I have always been an advocate of that. I believe that using the headers correctly will permit better communication through the many routers, proxies, and firewalls.

    Thank you,

    - Mike

  • Anonymous
    August 18, 2006
    Eric - I do not like Fiddler because it is not trusted and low-level enough.

    It installs itself as a Proxy to IE to filter its request/responses, which alters the network pattern/behavior of IE. It is therefore unable to troubleshoot large classes of networking problems with IE.

    Meanwhile, Network Monitor passively monitors all network traffic so it can diagnose anything network related. It just requires more skill to use and interpret it.

    To each their own preference. I agree that Fiddler is more approachable for most Web Developers, but I think that really seasoned Web Developers should graduate to use Network Monitor... ;-)

    I prefer WFetch over NetCat because it handles Authentication and SSL at an HTTP level. For HTTP Developers, WFetch is a great complement to Fiddler... but I still prefer my three tools because they never fail to let me debug the exact networking issue.

    //David

  • Anonymous
    August 18, 2006
    mike - yeah. I believe doing the right thing saves more time in the long run. :-)

    //David

  • Anonymous
    March 25, 2010
    Let's rewrite your "rant argument" from the perspective of an HTML page author (me). If I followed HTML/CSS standards (therefore breaking my web page on IE), then users think that my web page is broken instead of blaming the incorrect browser. Since I break web standards, web browser developers (e.g., Microsoft) have less motivation to conform to web standards... and web standards and other browsers will be punished. A vicious circle, indeed.
