This textbook introduces a philosophy of science called "information-theoretic," based on Kullback-Leibler information theory. It focuses on a science philosophy built around "multiple working hypotheses" and the statistical models used to represent them. The text is written for people new to information-theoretic approaches to statistical inference, whether graduate students, post-docs, or professionals. Readers are, however, expected to have a background in general statistical principles and regression analysis, and some exposure to likelihood methods. This is not an elementary text, as it assumes reasonable competence in modeling and parameter estimation.
I was once told that no one should use AIC for model selection without reading this book first and I definitely can’t disagree, having finally done so myself. A great introduction to AIC and the statistical philosophies behind it and similar concepts.
My second recent 'for-fun' textbook. I thought this was extremely well written. Anderson makes a concise case for using mathematical models as tools for inferring structure in biological processes. He distills statistical theory into palatable concepts that seem tractable to apply. I picked this up to supplement my understanding of information-theoretic metrics for comparing different models, and found that this text gave me new ideas to test in my project.
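For readers curious what the model comparison discussed in these reviews looks like in practice, here is a minimal sketch of AIC and Akaike weights. The formulas (AIC = 2k − 2 ln L, weights proportional to exp(−Δ/2)) are standard; the model names, log-likelihoods, and parameter counts below are purely hypothetical, not examples from the book.

```python
import math

def aic(log_likelihood, k):
    # Akaike Information Criterion: AIC = 2k - 2 ln(L),
    # where k is the number of estimated parameters.
    return 2 * k - 2 * log_likelihood

def akaike_weights(aics):
    # delta_i = AIC_i - min(AIC); weight_i is proportional to exp(-delta_i / 2),
    # normalized so the weights sum to 1 (interpretable as model probabilities).
    best = min(aics)
    rel = [math.exp(-(a - best) / 2) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical fitted models: (maximized log-likelihood, parameter count).
models = {"linear": (-104.2, 3), "quadratic": (-101.9, 4)}
aics = {name: aic(ll, k) for name, (ll, k) in models.items()}
weights = akaike_weights(list(aics.values()))
```

Here the quadratic model earns the lower AIC (211.8 vs. 214.4) despite its extra parameter, illustrating how the criterion trades fit against complexity rather than rewarding fit alone.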