How to correctly initialize a UNICODE_STRING

In the past, if you wanted to declare a UNICODE_STRING and its buffer on the stack, you had to declare the buffer manually and then initialize all of the fields in the UNICODE_STRING yourself, basically something like this:

#define SOME_SIZE 100 // buffer size in WCHARs, value here is just an example

WCHAR stringBuffer[SOME_SIZE];
UNICODE_STRING string;

string.Buffer = stringBuffer;
string.Length = 0x0;
string.MaximumLength = sizeof(stringBuffer);

but this is tedious at best.  Furthermore, I have seen many instances where Length was precomputed but MaximumLength was left as uninitialized garbage or zero!  This sometimes works and sometimes does not.  When the UNICODE_STRING is an input parameter, an uninitialized MaximumLength might be OK if the function only uses the Length field.  When the UNICODE_STRING is an output parameter, MaximumLength definitely needs to be valid; otherwise the function will either fail immediately or walk off into unknown memory, writing merrily as it goes.  As a case in point, MaximumLength must be set on every output UNICODE_STRING passed to the functions in ntstrsafe.h.
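To make the distinction concrete, here is a minimal sketch of an output UNICODE_STRING being filled in by one of the ntstrsafe.h functions (the routine name and format string are made up for illustration); the function can only stay inside the buffer because MaximumLength tells it where the buffer ends.

#include <ntddk.h>
#include <ntstrsafe.h>

// Hypothetical helper: formats a device name into a caller-supplied
// UNICODE_STRING.  RtlUnicodeStringPrintf treats the string as output,
// so it reads MaximumLength to know how much room it has to write.
NTSTATUS
BuildDeviceName(
    _Inout_ PUNICODE_STRING Name,
    _In_ ULONG Index
    )
{
    // If Name->MaximumLength is garbage, this either fails up front or
    // writes past the end of Name->Buffer.
    return RtlUnicodeStringPrintf(Name, L"\\Device\\Foo%u", Index);
}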

To alleviate the tedious repetition of declaring a buffer and a UNICODE_STRING, two new macros were added to the WDK (and shipped downlevel in KMDF's wdfstring.h for previous DDKs).  These macros declare the buffer and initialize the structure for you.  Here they are (with the pragmas removed):

#define DECLARE_CONST_UNICODE_STRING(_var, _string) \
const WCHAR _var ## _buffer[] = _string; \
const UNICODE_STRING _var = { sizeof(_string) - sizeof(WCHAR), sizeof(_string), (PWCH) _var ## _buffer }

#define DECLARE_UNICODE_STRING_SIZE(_var, _size) \
WCHAR _var ## _buffer[_size]; \
UNICODE_STRING _var = { 0, _size * sizeof(WCHAR) , _var ## _buffer }

The previous example rewritten would look like this:

DECLARE_UNICODE_STRING_SIZE(string, SOME_SIZE);
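DECLARE_CONST_UNICODE_STRING covers the other case, where the contents come from a string literal.  A minimal sketch of how it might be used (the routine and device name below are just illustrative) alongside the Rtl string routines that take constant strings:

// Both the backing buffer and the UNICODE_STRING are const and fully initialized.
DECLARE_CONST_UNICODE_STRING(fooDeviceName, L"\\Device\\Foo");

BOOLEAN
IsFooDevice(
    _In_ PCUNICODE_STRING Name
    )
{
    return RtlEqualUnicodeString(Name, &fooDeviceName, TRUE);
}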

These two macros are nice, but they don't help with one important case: declaring and initializing a const global UNICODE_STRING.  To solve that, another macro was added:

#define DECLARE_GLOBAL_CONST_UNICODE_STRING(_var, _str) \
extern const __declspec(selectany) UNICODE_STRING _var = RTL_CONSTANT_STRING(_str)

So to declare a const global UNICODE_STRING, you would add the following to your header file:

DECLARE_GLOBAL_CONST_UNICODE_STRING(DeviceNamePrefix, L"\\Device\\Foo");
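Because of the __declspec(selectany), the definition can live in a header that is included from multiple source files without causing multiply defined symbol errors; the linker keeps a single copy.  Any file that includes the header can then use the global directly.  A minimal sketch using RtlPrefixUnicodeString (declared in ntifs.h); the prefix check itself is just an illustration:

// Returns TRUE if Name starts with \Device\Foo (case-insensitive compare).
BOOLEAN
NameHasFooPrefix(
    _In_ PCUNICODE_STRING Name
    )
{
    return RtlPrefixUnicodeString(&DeviceNamePrefix, Name, TRUE);
}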

Comments

  • Anonymous
    April 15, 2011
    This is awesome. I wish I had found it sooner.

  • Anonymous
    September 11, 2013
    It's the "correct way" for you.  Not for everybody.  For me it's the stupid way because it's 482 times more verbose than it needs to be.