A UUID (or a GUID, one of its implementations) is essentially a number between 0 and 2^128 - 1. Ideally, it should always be treated as a number, and mostly compared as a number; convert a UUID to a string only to show it to the user in a convenient way. However, once you start handling UUIDs as text, things get complicated...
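To illustrate (a minimal sketch using Python's standard uuid module, which is just an assumed choice for the example): the same 128-bit value, written with different casing, compares equal when treated as a number, and the string form is only a presentation.

```python
import uuid

# Two textual spellings of the same underlying 128-bit value.
a = uuid.UUID("0f8fad5b-d9cb-469f-a165-70867728950e")
b = uuid.UUID("0F8FAD5B-D9CB-469F-A165-70867728950E")

print(a.int)    # the underlying integer, between 0 and 2**128 - 1
print(a == b)   # True: equality is decided by the 128-bit value, not the text
print(str(a))   # canonical lowercase form, for display only
```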
I don’t know, historically speaking, whether the GUID came first and the UUID came along to standardize it, or whether it was the other way around. But the fact is that, yes, Microsoft not only uses uppercase letters in a GUID but also uses them inconsistently. This was probably done at a time when there was no standard or, as is typical of Microsoft, simply ignoring the standard. And then it kept what it had, for compatibility. I do not know the situation with Apple.
If your system will receive a text-based UUID as input - from the user or from another system - I suggest handling all possible variations (all lowercase, all uppercase, mixed case, with or without the surrounding {}, etc.) and always taking care with the text's encoding, as sketched below. From there, work with the number itself (if feasible), and always do comparisons using numbers, not text. If you need to provide a UUID as output, use the standard format. That way there is a good chance that the consumer of this UUID will "understand" the format - whereas if you use something non-standard, it may be rejected or, even worse, duplicated (suppose the system does not check for this and saves the same UUID in two versions, one uppercase and one lowercase...).
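A minimal sketch of that input handling, again assuming Python's uuid module just for illustration: accept the common textual variations, compare by numeric value, and emit output in the canonical lowercase format.

```python
import uuid

def parse_uuid(text: str) -> uuid.UUID:
    """Accept lowercase, uppercase, mixed case, and surrounding {}."""
    cleaned = text.strip().strip("{}")
    return uuid.UUID(cleaned)   # raises ValueError if the text is not a valid UUID

received = "{0F8FAD5B-D9CB-469F-A165-70867728950E}"  # e.g. from another system
stored   = "0f8fad5b-d9cb-469f-a165-70867728950e"    # e.g. from your database

u1 = parse_uuid(received)
u2 = parse_uuid(stored)

print(u1.int == u2.int)   # compare as numbers: True, no duplicate is created
print(str(u1))            # output in the standard format: lowercase, hyphenated
```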
After all, the "importance" of following the standard is precisely to avoid interoperability errors. Without knowing whether the systems that will interact with yours "normalize" UUIDs before sending/after receiving, you cannot guarantee anything. The ideal is to check with each specific system how this is handled, but in the absence of that, following the standard is the path with the best chance of success.
Thanks for the reply... My implementation already handles both lowercase and uppercase, because, as you said, it is a hexadecimal value. My question is more about why some follow the standard for the string and others do not... which you partly answered. These companies wouldn’t do it without a purpose... one that might have nothing to do with programming or standards... you know?
– chambelix
I understand, but unless someone has information that I don't, I believe there is no particular reason - just lack of standardization. That is: it started in form X, it was later standardized as Y, but by then there was a lot of code and data using X that would be a huge hassle to adapt, so better to leave it as is - since it makes little difference... At least that’s how I see it.
– mgibsonbr