This article is kind of lame. #1 - #6 should be basic skills from undergrad CS courses. Also, the intro paragraph is a bait and switch:
"Instead of performing some operation (such as counting the 1 bits in an integer) by looping over individual bits, these programming nuggets do the same with one or two carefully chosen bitwise operations."
We never actually learn how to count set bits in an integer. (My initial thought was look-up table but is there a better way?)
Indeed. I go to Northwestern University, but I know for a fact that our first lab in Intro to Computer Systems is the same as the one at Carnegie Mellon. Basically, you're given a C file and you have to implement about 20 functions using only specific bitwise operations, and only a limited number of them. The fun part is that the number of operations you used is posted on a website alongside the "professor's" result, so you compete against other people in the class to find the shortest solution, and against the professor's score as well.
Basically, if you can't figure these out for yourself, I'm not sure you're a very good programmer.
There is a divide-and-conquer strategy; it will come in the advanced bit hacks article.
I agree #1 - #5 are basic. That's why at #6 I say, "Now it finally gets more interesting!!! Bit hacks #1 - #5 were kind of boring to be honest." I included them for completeness.
Here is a method for counting the one bits (it sums bit counts in parallel: pairs first, then nibbles, bytes, and halfwords):

int one_bits(unsigned int x) {
    x = x - ((x >> 1) & 0x55555555);                /* sums of 2-bit fields  */
    x = (x & 0x33333333) + ((x >> 2) & 0x33333333); /* sums of 4-bit fields  */
    x = (x + (x >> 4)) & 0x0F0F0F0F;                /* sums of 8-bit fields  */
    x = x + (x >> 8);                               /* sums of 16-bit fields */
    x = x + (x >> 16);                              /* full 32-bit sum       */
    return x & 0x0000003F;                          /* result is at most 32  */
}