What's Different in the Revised Language Definition?

The Declaration of a Managed Enum Type

The original .NET enum declaration is preceded by the __value keyword. For example:

// the original syntax
public __value enum e1 : unsigned short
   { not_ok = 1024, maybe, ok = 2048 };

__value enum e2 { fail, pass };

This has been replaced with an adjective modification of the class keyword, in a manner similar to the spaced keyword declarations of the reference and value classes:

// the revised syntax
public enum class e1 : unsigned short
   { not_ok = 1024, maybe, ok = 2048 };

enum class e2 { fail, pass };

The changes in syntax are minor. The challenge in both the original and revised language design is the integration of the .NET enum such that it can be distinguished from the existing enum type, and yet seem analogous. This requires some qualification of the enum keyword. Why replace __value enum with enum class? There are two reasons, one technical and the other aesthetic. The aesthetic reason is the symmetry of declaring all managed types:

enum class ec;

value class vc;

ref class rc;

interface class ic;

The technical reason is to emphasize the fact that the .NET enum is a class type exhibiting semantics quite different from those of the ISO C++ enum. The __value keyword was also intended to suggest this, indicating a derivation from the abstract Value type of the .NET class hierarchy, but that association is less clear to those new to the .NET environment.
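
To see that class nature concretely, consider this minimal sketch in the revised syntax; the point is simply that an enum value carries inherited members such as ToString(), something no ISO C++ enum can offer:

// a .NET enum inherits members through System::Enum,
// betraying its class nature
enum class e2 { fail, pass };

int main()
{
    e2 result = e2::pass;

    // ToString() comes from the System::Enum base class;
    // a native enum has no member functions at all
    System::Console::WriteLine( result.ToString() );  // prints "pass"
}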

In an earlier entry, I made the rather fanciful analogy of the native and managed worlds as representing a Kansas and Oz pair of landscapes, and suggested that they exhibit different semantics. I also suggested that designing a language that combines these two physics requires a sort of two-faced perspective captured by the bust of Janus, the Ancient Roman figure who is shown in profile with two front-facing heads (I suppose there is a better way of saying that), one looking east and the other west, and we are free to pin whatever values we like on those two directions.

When something is said to be poetic in a computer science discussion, the term is generally pejorative, suggesting the speaker is being excessively elegant but with little substantial meaning that would sustain analysis. (Similarly, one might discount someone's argument as being academic, meaning they are counting angels dancing on the head of a pin; that is, the argument has no practical value and is wasting everyone's time, thank you very much.) Just as conventional wisdom says that for every mathematical formula in a book targeted at the lay person half the readership on average is lost, so too each poetical metaphor or analogy present in a discussion aimed at the programming community causes half that audience on average to turn away. But in both cases, sometimes it is worth the risk.

The fact is, the original language design got the semantics of the .NET enum wrong, and it had to be corrected in the revised language. For example, consider the following code fragment:

__value enum status { fail, pass };

void f( Object* ) { cout << "f(Object)\n"; }
void f( int )     { cout << "f(int)\n"; }

int main()
{
    status rslt;
    // ...
    f( rslt );   // which f is invoked?
}

Through the eyes of Kansas, this is a no-brainer. The instance of f() invoked is that of f(int). An enum is a symbolic integral constant, and it participates in the standard integral promotions, which take precedence in this case. So, in the original language design, we pulled a Kansas, and imposed Kansas physics on the enum element within our .NET Oz. This caused a number of surprises, not when we used them in a Kansas frame of mind, but when we needed them to interact with the existing Oz framework, where an Enum is a class indirectly derived from Object. In the revised language design, the instance of f() invoked is that of f(Object^).
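
By way of contrast, here is a sketch of the same fragment under the revised syntax. The selection of f(Object^) is the behavior just described; the explicit cast in the final line is my own illustration of how to reach f(int) when that is what you mean, and is not part of the original fragment:

// the same fragment, revised syntax
enum class status { fail, pass };

void f( Object^ ) { cout << "f(Object)\n"; }
void f( int )     { cout << "f(int)\n"; }

int main()
{
    status rslt = status::fail;

    // invokes f(Object^): the enum no longer participates
    // in the integral promotions
    f( rslt );

    // an explicit cast is required to select f(int)
    f( static_cast<int>( rslt ) );
}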

If you are a C++ programmer, it would not surprise me if you felt a bit frustrated at this moment. (Others have expressed equal frustration.) The fact is, within the .NET Oz, many of the fundamental premises of our Kansas physics that we have internalized and take to be second nature do not hold. We are not at the center of the .NET universe; or rather, our language is not at the center. (This is neither fair nor unfair. It is just how things are.) And this is part of the difficulty of integrating the .NET paradigm into C++. (We'll see other examples as we continue our exploration of this space. [I'm tempted to indulge in a Star Trek analogy here, but think that would be pushing things a bit right now.])

Another fundamental difference between the Kansas and Oz enums is that of scope. The Kansas enum does not maintain its own scope, which means that its enumerators spill out into the enclosing scope. This is a disaster in terms of name collisions at global scope, or even within namespace scope, and so the design idiom within native C++ is to encapsulate enums within the class declaration for which they supply symbolic meaning. (I first fielded this problem in the original beta release of the new iostream library by Jerry Schwarz at Bell Laboratories during the release of cfront 2.0. In that release, he did not encapsulate all the associated enums defined for the library, and the common enumerators such as read, write, append, and so on, made it nearly impossible for users to compile their existing code. One solution would have been to mangle the names, such as io_read, io_write, etc. A second would have been to modify the language, further breaking compatibility with the C language, by adding scope to an enum, but this was not practicable at the time. The middle solution was to encapsulate the enum within the class, or class hierarchy, where both the tag name and enumerators of the enum populate the enclosing class scope.) So the motivation for placing enums within classes, at least originally, was not philosophical, but a practical response to the name-space pollution problem.
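
Here is a small native sketch of both the spill and the idiom; the names are illustrative rather than the actual iostream enumerators:

// ISO C++: enumerators spill into the enclosing scope
enum open_mode { read, write, append };

// enum seek_mode { read, current };  // error: 'read' is already
//                                    // declared in this scope

// the encapsulation idiom: nest the enum within the class it
// serves, so the tag and enumerators populate the class scope
class file_stream {
public:
    enum open_mode { read, write, append };
};

file_stream::open_mode m = file_stream::read;  // qualified, no collision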

The Oz enum is a class object within the ValueType subtree of the .NET class hierarchy, and it maintains a scope within which its associated enumerators are encapsulated. Because of that, the original motivation for placing enum definitions within a class is absent under .NET, and the namespaces of the Base Class Library (BCL) are populated by what to a C++ programmer feels like an overabundance, nay, a very plethora of enums. But once you have an encapsulated set of enumerators, encapsulating the enum no longer makes much sense. So, in general, you wouldn't want to follow the Kansas idiom within your Oz programs. That is, if you are using a .NET enum type, it is not generally recommended that you nest its definition within a class.
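
A minimal sketch makes the point (the ref class here is made up for illustration); since the .NET enum already scopes its enumerators, nesting it buys nothing except an extra level of qualification for your users:

// revised syntax: the enum maintains its own scope
enum class status { fail, pass };

ref class Task {                 // illustrative class, not from the BCL
public:
    enum class state { idle, busy };
};

int main()
{
    status s = status::fail;            // one level of qualification
    Task::state t = Task::state::idle;  // nesting adds a second level
                                        // without adding encapsulation
}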

In the original language design, there is an attempt to support what is referred to as "weak injection" of the enumerators of .NET enum types. By weak injection, the designers meant that, all things being equal, the enumerators will be lifted into the enclosing scope within which the enum is defined. (This is exactly the semantics people want if they encapsulate enum definitions within their classes. Otherwise, they have to specify an additional nesting level.) The catch, of course, is that things cannot be guaranteed to be equal in all circumstances, and so the facility worked less than perfectly at times. In the revised language design, weak injection is not supported. Let's look at an example.

// original language design, supporting weak injection

__gc class XDCMake
{
public:
    __value enum _recognizerEnum
    {
        UNDEFINED,
        OPTION_USAGE,
        XDC0001_ERR_PATH_DOES_NOT_EXIST = 1,
        XDC0002_ERR_CANNOT_WRITE_TO = 2,
        XDC0003_ERR_INCLUDE_TAGS_NOT_SUPPORTED = 3,
        XDC0004_WRN_XML_LOAD_FAILURE = 4,
        XDC0006_WRN_NONEXISTENT_FILES = 6
    };

    ListDictionary* optionList;
    ListDictionary* itagList;

    XDCMake()
    {
        optionList = new ListDictionary;
        optionList->Add( S"?", __box( OPTION_USAGE ));
        optionList->Add( S"help", __box( OPTION_USAGE ));

        itagList = new ListDictionary;
        itagList->Add( S"returns", __box( XDC0004_WRN_XML_LOAD_FAILURE ));
    }
};

// revised language design (note that this is generated by a translation tool)

ref class XDCMake
{
public:
    enum class _recognizerEnum
    {
        UNDEFINED,
        OPTION_USAGE,
        XDC0001_ERR_PATH_DOES_NOT_EXIST = 1,
        XDC0002_ERR_CANNOT_WRITE_TO = 2,
        XDC0003_ERR_INCLUDE_TAGS_NOT_SUPPORTED = 3,
        XDC0004_WRN_XML_LOAD_FAILURE = 4,
        XDC0006_WRN_NONEXISTENT_FILES = 6
    };

    ListDictionary^ optionList;
    ListDictionary^ itagList;

    XDCMake()
    {
        optionList = gcnew ListDictionary;
        optionList->Add( S"?", _recognizerEnum::OPTION_USAGE );
        optionList->Add( S"help", _recognizerEnum::OPTION_USAGE );

        itagList = gcnew ListDictionary;
        itagList->Add( S"returns", _recognizerEnum::XDC0004_WRN_XML_LOAD_FAILURE );
    }
};

In both cases, the original language design erred in viewing the world through Kansas-colored spectacles, overcompensating in order to make the native C++ programmer more comfortable in this .NET home away from home. Does this mean that there should be no attempt at building a bit of Kansas in Oz? Hardly. I think you'll agree that the support for deterministic finalization, copy constructor semantics, and so on, is elegantly integrated and makes for an innovative language. (I can say that because I had no part in that work, and so can stand back and admire it.) It's just that we've had to learn to choose our battles, and know when to give way.

disclaimer: This posting is provided "AS IS" with no warranties, and confers no rights.
