

What’s wrong with this code: a real-world example

I was working on a new feature earlier today and I discovered that while the code worked just fine when run as a 32-bit app, it failed miserably when run as a 64-bit app.

If I were writing code that used polymorphic types (like DWORD_PTR) or something that depended on platform-specific differences, this wouldn’t be a surprise, but I wasn’t.


Here’s the code in question:

        DWORD cchString;
        DWORD cbValue;
        HRESULT hr = CorSigUncompressData(pbBlob, cbBlob, &cchString, &cbValue);
        if (SUCCEEDED(hr))
        {
            cbBlob -= cbValue;
            pbBlob += cbValue;

            if (cbBlob >= cchString)
            {
                //  Convert to unicode
                wchar_t rgchTypeName[c_cchTypeNameMax];
                DWORD cchString = MultiByteToWideChar(CP_UTF8, 0, reinterpret_cast<LPCSTR>(pbBlob), static_cast<int>(cchString), 
                                                      rgchTypeName, ARRAYSIZE(rgchTypeName));
                if (cchString != 0 && cchString < ARRAYSIZE(rgchTypeName))
                {
                    //  Ensure that the string is null terminated.
                    rgchTypeName[cchString] = L'\0';
                }
                cbBlob -= cchString;
                pbBlob += cchString;
            }
        }

This code parses an ECMA 335 SerString. I’ve removed a bunch of error checking and other code to make the code simpler (and the bug more obvious).

When I ran the code compiled for 32 bits, the rgchTypeName buffer contained the expected string contents. However, when I ran the 64-bit compiled binary, I only got the first 6 characters of the string. Needless to say, that messed up the subsequent parsing of the blob containing the string.

Stepping into the code in the debugger, I saw that the cchString variable had the correct length at the point of the call to MultiByteToWideChar; however, when I stepped into the call, the value had changed from the expected length to 6!

After a couple of minutes staring at the code, I realized what was happening: through a cut-and-paste error, I had accidentally double-declared the cchString local variable. That meant that the cchString variable passed to the MultiByteToWideChar call was actually an uninitialized local variable, so it’s not surprising that the string length was bogus.

So why did this fail on 64-bit code but not on 32-bit? Well, it turns out that the 32-bit compiler’s code generation had reused the stack storage for the inner and outer cchString variables (this is safe to do because the outer cchString variable was not used anywhere outside this snippet), so when the processor pushed the uninitialized cchString variable, it happened to push the right value. Since the 64-bit compiler allocates local variables differently, the uninitialized variable was immediately obvious.

Comments

  • Anonymous
    November 28, 2012
DWORD cchString = MultiByteToWideChar( ... cchString ...); Since you use cchString in the call, you will have received: blah.cpp: warning C4700: uninitialized local variable 'cchString' used

  • Anonymous
    November 28, 2012
    @Jon: I expected to see a warning as well. My guess is that something (maybe the cast?) caused the uninitialized local variable error to be suppressed. Or it's possible that my default build environment suppresses the warning.

  • Anonymous
    November 28, 2012
    > So why did this fail on 32bit code but not on 64bit? Don't you mean fail on 64-bit and succeed on 32-bit?

  • Anonymous
    November 28, 2012
    Larry!  Great to see you blogging again!!

  • Anonymous
    November 28, 2012
; Larry!  Great to see you blogging again!! Same here. It's been a long time since we've seen you blog here.

  • Anonymous
    November 28, 2012
    In C#, you'd also get: error CS0136: A local variable named 'cchString' cannot be declared in this scope because it would give a different meaning to 'cchString', which is already used in a 'parent or current' scope to denote something else Of course, in C# you'd seldom need to manually calculate storage...


  • Anonymous
    November 29, 2012
    @JM: Yes, but this was built from the NT build system - our debug builds are built with full optimization enabled - this ensures that our debug and release builds are as close to identical as possible - the only difference between the two is the asserts in the debug build. And yes, I'm working with the compiler folks to figure out why a warning wasn't generated - there are other uninitialized local variable errors that are generated.

  • Anonymous
    November 29, 2012
    This is also an argument for turning on warnings for "shadow"d variables/functions. That's caught a number of bugs in some of my code before.

  • Anonymous
    November 30, 2012
    The GCC compiler has -Wshadow for catching this kind of thing. This StackOverflow article has a couple of interesting tidbits on the topic: stackoverflow.com/.../is-there-an-equivalent-of-gccs-wshadow-in-visual-c "Check out warnings C6244 and C6246 But you will need to enable automatic code analysis to get them."

  • Anonymous
    December 06, 2012
    "our debug builds are built with full optimization enabled" @Larry, Doesn't this make it harder to debug crashes?

  • Anonymous
    December 09, 2012
    Do you not compile with /analyze? (implied by Ramesh's post)


  • Anonymous
    January 03, 2013
    Prefast would find that a declaration is hidden by a nested declaration of the same name.

  • Anonymous
    January 03, 2013
    And, by the way, using reinterpret_cast is a bad habit. It's even worse than using a C-style cast indiscriminately.