In some code I am going through, one function calls these #defines while packing words. I have played around with them but can't figure out exactly how they work. Could someone help me understand how these work? Thanks.
This code assumes an unsigned short is exactly 2 bytes and that it is called on 4-byte values. Neither of those sizes is guaranteed by the standard, so the code is very much platform dependent. One thing it does not depend on, though, is byte order: because the macros use value casts and shifts rather than pointer reinterpretation, they give the same result on little and big endian systems.
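If portability matters, the usual fix is to use the fixed-width types from <stdint.h>, which pin the word sizes down. A minimal sketch (the names LOWORD and HIWORD are my guess at what the #defines are called, based on what they do):

#include <stdint.h>

/* uint16_t and uint32_t are exactly 16 and 32 bits wherever they exist,
   so the word size no longer depends on the platform's short/long sizes. */
#define LOWORD(l) ((uint16_t)(uint32_t)(l))
#define HIWORD(l) ((uint16_t)((uint32_t)(l) >> 16))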
The first macro, presumably ((unsigned short)(l)), simply converts the variable l to a 2-byte value. The conversion keeps only the least significant 2 bytes, so it gets you the low word.
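Note that this is a value conversion (the result is l modulo 65536), not a reinterpretation of memory, which is why byte order plays no role. For example, assuming a 16-bit unsigned short and a 32-bit value:

unsigned long l = 0x12345678UL;
unsigned short low = (unsigned short)l;  /* low == 0x5678 on any byte order */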
((unsigned short)(((unsigned long)(l) >> 16)))
This second expression shifts the value 2 bytes (16 bits) to the right, which brings the most significant 2 bytes down into the least significant position, and then applies the same cast as before, giving you the high word.
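Putting the two together, here is a small test program (a sketch, again assuming 16-bit shorts and 32-bit longs, with the same guessed macro names):

#include <stdio.h>

#define LOWORD(l) ((unsigned short)(l))
#define HIWORD(l) ((unsigned short)(((unsigned long)(l) >> 16)))

int main(void)
{
    unsigned long packed = 0x12345678UL;
    /* Prints HIWORD = 0x1234 and LOWORD = 0x5678 */
    printf("HIWORD = 0x%04X\n", (unsigned)HIWORD(packed));
    printf("LOWORD = 0x%04X\n", (unsigned)LOWORD(packed));
    return 0;
}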
Then remains the question: what does the & 0xFFFF do? As far as I can tell, nothing, it is entirely redundant here. 0xFFFF is 16 bits of all ones, so the AND keeps only the low 16 bits; but after shifting a 4-byte value right by 16 bits, the upper 16 bits are already zero, and the final cast to unsigned short would throw them away anyway. ANDing a number with all ones across the bits it occupies just gives back the original number (e.g. 0100 & 1111 = 0100).
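You can check the redundancy directly, since both forms produce the same value (a quick test, assuming 16-bit shorts and 32-bit longs):

#include <assert.h>

int main(void)
{
    unsigned long l = 0x12345678UL;
    unsigned short with_mask    = (unsigned short)(((unsigned long)(l) >> 16) & 0xFFFF);
    unsigned short without_mask = (unsigned short)((unsigned long)(l) >> 16);
    /* Both are 0x1234: the mask is redundant given the final cast. */
    assert(with_mask == without_mask);
    return 0;
}

The mask would only matter if the shifted value were used without the final cast to unsigned short, say on a platform where unsigned long is 8 bytes and l has bits set in its upper half.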