I've written numerous posts here describing instances where C# has come so close to getting it right, and yet misses the mark by an inch. The original C# enums have a simple semantics:
enum SomeEnum
{
First, // compiler implicitly assigns 0
Second, // compiler implicitly assigns 1
Third, // compiler implicitly assigns 2
Fourth, // compiler implicitly assigns 3
}
This worked nicely as a concise expression of a need for a set of distinct values, without caring what they are. C# later introduced the [Flags] attribute, which signals to the compiler that a particular enum isn't actually a set of disjoint values, but a set of bit flags. However, the compiler doesn't actually change its behaviour given this change of semantics. For instance, the following enum is assigned exactly the same values, despite the semantically meaningful change to a set of bitwise flags:
[Flags]
enum SomeEnum
{
First, // compiler implicitly assigns 0
Second, // compiler implicitly assigns 1
Third, // compiler implicitly assigns 2
Fourth, // compiler implicitly assigns 3
}
SomeEnum.First
is now not a valid flag, since its value is zero and a bitwise test against it always succeeds, and SomeEnum.Fourth
(value 3) is now equivalent to (SomeEnum.Second | SomeEnum.Third)
. The native enum behaviour is useless for a set of enumerated flags.
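A minimal sketch making the collision concrete, assuming the [Flags] enum above with its compiler-assigned values:

```csharp
using System;

[Flags]
enum SomeEnum
{
    First,  // compiler assigns 0: matches "no flags set"
    Second, // compiler assigns 1
    Third,  // compiler assigns 2
    Fourth, // compiler assigns 3: the same bits as Second | Third
}

class Demo
{
    static void Main()
    {
        var combined = SomeEnum.Second | SomeEnum.Third;

        // Fourth is indistinguishable from Second | Third.
        Console.WriteLine(combined == SomeEnum.Fourth); // True

        // A zero-valued member appears "set" in every value,
        // because (x & 0) == 0 holds for all x.
        Console.WriteLine(combined.HasFlag(SomeEnum.First)); // True
    }
}
```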
I suspect most people would argue that if you're interested in bitwise flags, you generally need to explicitly specify what the flag values ought to be. I don't think this is true. In fact, the only cases where this is true are those where the flags are defined by an external system, for instance the read/write flags used when opening file streams.
Exactly the same argument could be levelled against the native enum behaviour, yet just as native enums provide a correct default semantics for when you don't care about the values, so flags should provide a correct default semantics for when you don't care.
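What that correct default would look like is easy to state: each member gets the next power of two. Today you have to write it by hand; a sketch of the manual version, with a hypothetical FlagEnum as the example:

```csharp
using System;

[Flags]
enum FlagEnum
{
    None   = 0,      // explicit "no flags" member, so 0 has a meaning
    First  = 1 << 0, // 1
    Second = 1 << 1, // 2
    Third  = 1 << 2, // 4
    Fourth = 1 << 3, // 8: distinct bit, no overlap with any combination
}

class Demo
{
    static void Main()
    {
        var flags = FlagEnum.Second | FlagEnum.Fourth;

        // Each member now tests independently.
        Console.WriteLine(flags.HasFlag(FlagEnum.Second)); // True
        Console.WriteLine(flags.HasFlag(FlagEnum.First));  // False

        // [Flags] also makes ToString render combinations readably.
        Console.WriteLine(flags); // Second, Fourth
    }
}
```

This is exactly the boilerplate a flags-aware compiler could generate implicitly, which is the author's point: the defaults, not the attribute, are what need to change.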