Why don't most web spiders handle gzip?

Lately the web spider Pompos has been eating up bandwidth on my site. Watching the logs, I noticed that many of the major bots seem to ignore the fact that I'm using mod_gzip to compress my pages and cut my web server's bandwidth. So far this month Pompos has eaten 20MB (I'm thinking about <a href="//boonedocks.net/mike/index.php?/archives/26MSNBOT.html">blocking it</a>) and Googlebot has downloaded 8MB. If they'd accept compressed data, it would be a lot less.
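If you're curious how much a gzip-aware bot would actually save, here's a minimal Python sketch (the URL is just a placeholder; point it at any page served by mod_gzip). It fetches the same page twice, once advertising gzip support via the Accept-Encoding request header the way a well-behaved bot would, and once without, then compares the bytes transferred:

```python
import gzip
import urllib.request

# Placeholder URL; substitute any page served with mod_gzip enabled.
URL = "http://boonedocks.net/mike/index.php"

def fetch(accept_gzip: bool) -> int:
    """Fetch URL and return the number of bytes sent over the wire."""
    req = urllib.request.Request(URL)
    if accept_gzip:
        # A gzip-aware bot advertises support with this header.
        req.add_header("Accept-Encoding", "gzip")
    with urllib.request.urlopen(req) as resp:
        body = resp.read()  # raw bytes as the server sent them
        if resp.headers.get("Content-Encoding") == "gzip":
            gzip.decompress(body)  # sanity check: payload is valid gzip
        return len(body)

plain = fetch(accept_gzip=False)
compressed = fetch(accept_gzip=True)
print(f"plain: {plain} bytes, gzip: {compressed} bytes "
      f"({100 * (1 - compressed / plain):.0f}% saved)")
```

For typical HTML the compressed transfer comes out a fraction of the original size, which is exactly the savings these bots are leaving on the table.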

Kudos to Yahoo Slurp, which does accept gzip-encoded content.