For those who aren't coders, I'll give a brief explanation. When you write a program, you basically write a big long (sort-of) text file full of instructions to the computer. It is, of course, entirely possible to write instructions that don't make any sense. A lot of mistakes - the kind that result in a program that simply isn't logically meaningful, e.g. "turn the middle end topwise" - result in compiler errors. Your computer yells at you, you fix it, you move on.
Another large group of mistakes result in a perfectly reasonable program that simply doesn't do what you want. There's no real way to fix these besides lots of testing. This is why your programs crash and do other weird things.
A third, much smaller, group of mistakes are things that can be detected at compile time but aren't necessarily problems. For example, I could write code that comes out to "If x is equal to 1, go north. Otherwise, if x is equal to 1, go south. Otherwise, go east." Now, obviously it's possible to follow these instructions. If x is 1, you'll go north. Otherwise, you'll go east. *Probably* this isn't what the coder meant, because there's that "go south" in there that can never occur. On the other hand, the coder *might* have done that for some reason. (I can think of a few reasons this would be vaguely sensible, but I won't go into them right now.)
When confronted with glitches in that style, a lot of compilers will spit out a warning. "Hey, you sure you meant this? Really sure? Okay then, on your own head be it." It won't actually stop you from running the program. It'll just gripe a little bit and move on.
A lot of people don't like this behavior.
They have the philosophy that if the code is questionable, well, the coder didn't *really* mean that, did they? I mean maybe they did, but we'd better be really careful and not let the programmer do it.
Even if they meant to.
Even if they fully understand the problem involved.
This, incidentally, is, in large part, the philosophy that gave us Java.
So there's a pair of flags in GCC (a common compiler) called -Wall and -Werror. -Wall says "activate the standard set of warnings" (despite the name, not literally all of them - there are even more obscure ones beyond it). And -Werror says "oh, and treat every warning as an error." Yes, you heard me right - every single thing that used to be a warning will now stop your program from building at all. Even if you meant it that way.
I, personally, loathe this.
Sometimes I want to write something awful. Something truly horrendous. I want to have variables that are never used, because the code that uses them is just removed temporarily. I want to have functions that don't return the right datatype, because, look, if it's going down that execution path I've got it set up to crash anyway. I want to use functions without the compiler second-guessing me at every step.
It looks like you're writing a function! Would you like some help with that?
And so I end up doing work to avoid the compiler's warnings. This is a fantastic, wonderful idea, roughly akin to "well, the computer kept warning me that I had a virus, so I turned the virus scanner off." Unfortunately, the compiler forces me into it. In this case, there's actually no way for me to use a library (libpng, for the curious - and you now know nothing more than you used to; I use libpng all the time for debug output) without either:
(1) Inadequate error checking
(2) Horrible hacks to make it compile without warnings
(3) Removing -Werror and -Wall
(3) I can't do, because those flags are enforced by the Google build system. (1) doesn't sit right with me. So (2) it is - I now have all my local variables in that function declared as volatile, and if you're not cringing at the moment, you're not a C/C++ programmer.
You can't save programmers from themselves. It doesn't work. Sure, you might make a dangerously incompetent programmer apparently productive for a few days, but trust me, his code will blow, and you'll regret it.
So why can't people just get out of the way and let the coders work?
(Footnote: This is by no means limited to Google. Honestly, Google is one of the best I know of - most companies are even more berserk. Snowblind, on the other hand, had no such restrictions - we left the code at the default warning level, and dammit, if we ended up with warnings, we ended up with warnings. That's just the way it worked. And we had competent coders, and so it worked great.)
(For the technically adept and curious: the problem lies in the way libpng handles errors. Being a C library, it's perfectly entitled to use setjmp/longjmp. Unfortunately, the standard says that after a longjmp, any non-volatile local variable that was modified after the setjmp call has an indeterminate value - and since, in a completely separate part of the standard, local auto variables are allowed to live in registers that longjmp isn't required to restore, the long and the short of it is that GCC spits out a "might be clobbered" warning for the local auto variables in any function that calls setjmp. The only ways to fix this are to remove the local variables entirely - i.e. split the thing into multiple functions - or to declare them volatile and force them out of the registers. You can see how wonderful both of these ideas are. Adding insult to injury, the particular line of code concerned is "if( setjmp( png_jmpbuf( png_ptr ) ) ) assert( 0 );", so you can see how much potential for weird results there is at that point - namely, squat. Especially since I #undef NDEBUG at the beginning of every file - another thing I can't talk the build tools into, namely creating a "debug optimized build" with -O2 and no -DNDEBUG.)