EA Durbin wrote:
Okay, what am I misunderstanding? Explain it to me, as it's imperative that I learn, and I'd love to learn.
%u is an unsigned integer which is 0 to +32,767.
%i is a signed integer –32,767 to +32,767.
If the sequence number is always going to be a positive number why should we allot it the extra 32,767 value range?
"A signed int can hold all the values between *INT_MIN* and *INT_MAX* inclusive. *INT_MIN* is required to be -32767 or less, *INT_MAX* must be at least 32767. Again, many 2's complement implementations will define *INT_MIN* to be -32768 but this is not required.
An unsigned int can hold all the values between 0 and *UINT_MAX* inclusive. *UINT_MAX* must be at least 65535. The int types must contain *at least* 16 bits to hold the required range of values.
*NOTE:* /The required ranges for signed and unsigned int are identical to those for signed and unsigned short. On compilers for 8 and 16 bit processors (including Intel x86 processors executing in 16 bit mode, such as under MS-DOS), an int is usually 16 bits and has exactly the same representation as a short. On compilers for 32 bit and larger processors (including Intel x86 processors executing in 32 bit mode, such as Win32 or Linux) an int is usually 32 bits long and has exactly the same representation as a long./"
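If you want to see the limits your own compiler actually uses, a quick sketch like the one below prints them straight from <limits.h>. The numbers it reports are implementation-defined, so yours may differ:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Print the implementation's actual limits from <limits.h>. */
    printf("INT_MIN  = %d\n", INT_MIN);
    printf("INT_MAX  = %d\n", INT_MAX);
    printf("UINT_MAX = %u\n", UINT_MAX);   /* note %u for the unsigned value */
    printf("int is %lu bits wide\n", (unsigned long)(sizeof(int) * CHAR_BIT));
    return 0;
}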
AFAIK, the signed and unsigned versions of the same type actually use the same number of bits; it's just that by using one of those bits as the 'sign' bit, the signed version appears to hold only values half as large as the unsigned version. In fact, both can store the same number of unique values (for a 16-bit int that's 65,536 of them); it's just that for the signed version roughly half of those values are below zero.
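Here's a small sketch of that idea (assuming a 16-bit short and two's complement, which is typical but not guaranteed by the standard): the exact same bit pattern reads as -1 when interpreted as signed and as 65535 when interpreted as unsigned.

#include <stdio.h>

int main(void)
{
    /* Illustration only: assumes a 16-bit short with two's complement. */
    short          s = -1;                   /* all bits set */
    unsigned short u = (unsigned short)s;    /* same bits, no sign bit */

    printf("signed:   %d\n", s);             /* prints -1 */
    printf("unsigned: %u\n", (unsigned)u);   /* prints 65535 on such a system */
    return 0;
}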
As a tip to remember this, consider for example an unsigned char. A char is just a byte, which is the smallest unit of memory most computers can address. An 'unsigned' char wouldn't yield any space savings, because the minimum you can allocate at once is a byte anyway (in fact, on most modern systems the hardware deals in much larger units, a page, but C is abstracted away from that). So unsigned and signed types must take up the same amount of memory. (We could concoct our own language where unsigned ints are actually shorts, but we'd be going out of our way.)
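And if you want to convince yourself that the signed and unsigned flavours really do take the same space, sizeof will tell you. Just a sketch; the sizes it reports are whatever your implementation uses:

#include <stdio.h>

int main(void)
{
    /* Signed and unsigned versions of a type occupy the same storage. */
    printf("sizeof(char)          = %lu\n", (unsigned long)sizeof(char));
    printf("sizeof(unsigned char) = %lu\n", (unsigned long)sizeof(unsigned char));
    printf("sizeof(int)           = %lu\n", (unsigned long)sizeof(int));
    printf("sizeof(unsigned int)  = %lu\n", (unsigned long)sizeof(unsigned int));
    return 0;
}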