There's this algorithm I'm working on. I hacked it together originally, then realized that it would eat about 5 gigs of RAM if I wrote it the easy way. Obviously this is a Bad Idea, so I came up with some optimizations and brought it down to 300mb, then some more to get it down to 150mb. At this point the bulk of the memory usage was in two equal chunks, so I picked the one I hadn't worked on much to be my next target. After getting rid of some massive redundancies I brought it from 70mb down to 20mb, then down to 10mb. Now I just implemented some basic RLE compression, which I *think* will work well - this particular data really doesn't need to be accessed *often* - and it's down to, oh, 300k in most cases.
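(For the curious: run-length encoding in its simplest form just collapses each run of repeated values into a (value, count) pair. This is only a sketch of the general idea - the names and data here are made up, not the actual code - but it shows why it pays off when the data is full of long runs.)

```python
# Minimal run-length encoding sketch. Hypothetical helpers, not the
# project's actual code; only worthwhile when the data has long runs.
def rle_encode(data):
    """Compress a sequence into [value, run_length] pairs."""
    encoded = []
    for value in data:
        if encoded and encoded[-1][0] == value:
            encoded[-1][1] += 1  # extend the current run
        else:
            encoded.append([value, 1])  # start a new run
    return encoded

def rle_decode(encoded):
    """Expand [value, run_length] pairs back into the original sequence."""
    out = []
    for value, count in encoded:
        out.extend([value] * count)
    return out

# Highly repetitive data compresses from thousands of values to a few runs.
data = [0] * 1000 + [7] * 500 + [0] * 250
packed = rle_encode(data)
print(len(data), "values ->", len(packed), "runs")  # 1750 values -> 3 runs
assert rle_decode(packed) == data
```

The decode step is linear in the original size, which is fine for data that's rarely touched - exactly the trade-off described above.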
If I did it in ten-point Arial, in hexadecimal, it would be 133 printed pages.
The computer's tearing through this data like it's nothing, and I'm so separated from the data itself that I can't read it. I can't look at it myself to make sure it's correct - it wouldn't *mean* anything to me. I have to teach the computer how to do its own sanity checks.
Most of the time this stuff doesn't surprise me, but once in a while . . . this is a strange, strange profession I'm in.