When Shannon first derived his famous formula for information, he asked von Neumann what he should call it, and von Neumann replied: “You should call it entropy for two reasons: first, because that is what the formula is in statistical mechanics; but second, and more important, as nobody knows what entropy is, whenever you use the term you will always be at an advantage!”
It may be apocryphal, of course, but it is quoted in various books by Myron Tribus, who knew Shannon, by Georgescu-Roegen, and by others.
My own view, following many others, is that entropy is perhaps the only candidate for a measure of complexity, largely because it reflects a trade-off between the shape of a probability distribution and the number of events describing it. As the number of events increases, entropy increases, other things being equal; and as the distribution becomes more and more uniform, and thus less and less ordered, entropy also increases. This links the formula to the idea that highly chaotic, randomly distributed systems have high entropy and highly ordered ones low entropy.
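To make that trade-off concrete, here is a minimal Python sketch of my own (an illustration, not drawn from any of the papers mentioned below) computing Shannon's entropy H = −Σ pᵢ log pᵢ. It shows both effects: entropy rises with the number of events, and for a fixed number of events it is highest when the distribution is uniform and lowest when the mass piles up on one event.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * log(p_i)), in natural units (nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Effect 1: entropy increases with the number of (equiprobable) events.
for n in (2, 10, 100):
    uniform = [1.0 / n] * n
    print(f"uniform over {n:3d} events: H = {shannon_entropy(uniform):.3f}  (= ln {n})")

# Effect 2: for a fixed number of events, entropy is maximal for the
# uniform (disordered) distribution and falls as the distribution
# becomes more ordered, i.e. more concentrated on a few events.
n = 10
uniform = [1.0 / n] * n             # maximally disordered
skewed  = [0.91] + [0.01] * (n - 1) # almost all mass on one event
print(f"uniform: H = {shannon_entropy(uniform):.3f}")
print(f"skewed : H = {shannon_entropy(skewed):.3f}")
```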
Some of this is in my recent paper ‘Space, Scale, and Scaling in Entropy-Maximising’, Geographical Analysis, 42 (4), 395–421, 2010, and in my 1974 paper on Spatial Entropy in the same journal – the latter is not online, but I will put it online as a scan when I get back to the ‘smoke’ from the ‘Big Apple’ where I am blogging this. An excellent and interesting recent paper is ‘The Universality of Zipf’s Law’ (2010) by Bernat Corominas Murtra and Ricard Solé, on the arXiv and also in Physical Review E, who imply not only that entropy is the right measure of complexity, but that the power law is the most appropriate distribution characterising a complex system.
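As a rough illustration of that last claim – my own sketch again, not Corominas Murtra and Solé’s analysis – the entropy of a Zipf (rank-size) distribution sits between the two extremes: well above a sharply peaked, ordered distribution, but below the maximum of ln n reached by the uniform case.

```python
import math

def shannon_entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 1000
# Zipf's law: probability proportional to 1/rank, then normalised.
weights = [1.0 / rank for rank in range(1, n + 1)]
total = sum(weights)
zipf = [w / total for w in weights]

print(f"Zipf over {n} events: H = {shannon_entropy(zipf):.3f}")
print(f"uniform maximum     : H = ln {n} = {math.log(n):.3f}")
```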
Martin Austwick has an interesting piece on defining complexity, which is what I am trying to do in my Spatial Complexity course. If you click on the link to the left or here, there is a small comment of mine about entropy and complexity that is attributable to the perhaps apocryphal exchange, in 1948 or thereabouts, between Shannon and von Neumann.