Thou shalt have no other gods before the ANSI C standard 1381
I don't totally agree with this. In my experience, if some code needs 32-bit ints and cannot work with 36-bit ints, then that code is relying on the implicit "modulo 2^32". Implicit truncations tend to be less readable than explicit ones. I often use code like this:
    /* in some common header */
    #if UINT_MAX >= 0xFFFFFFFF
    typedef unsigned int u32;
    #define C32(x)   x ## U
    #else
    typedef unsigned long u32;
    #define C32(x)   x ## UL
    #endif
    #define T32(x)   ((x) & C32(0xFFFFFFFF))
    /* ... somewhere in SHA-1 code ... */
        t = T32(rotl(a, 5) + F(b, c, d) + e + wi + K1);
        e = d; d = c; c = rotl(b, 30); b = a; a = t;
    }
    /* ... */
The idea is that I really want to make it explicit that I truncate to 32 bits. My type "u32" is actually a type with "at least 32 bits", and the code above tries to match it to a type with exactly 32 bits when available. An enhanced version would test __STDC_VERSION__ and, in C99 mode, use <stdint.h> and uint32_t (if available) or uint_least32_t.
I do think that the presence of that "T32" enhances code readability. It also makes the code compatible with 36-bit architectures, whether they exist or not. This is a side effect, a "fine addition".
Newsgroups: alt.folklore.computers