This book is an introduction to information and coding theory at the graduate or advanced undergraduate level. It assumes a basic knowledge of probability and modern algebra, but is otherwise self-contained. The intent is to describe as clearly as possible the fundamental issues involved in these subjects, rather than covering all aspects in an encyclopedic fashion.
The first quarter of the book is devoted to information theory, including a proof of Shannon's famous Noisy Coding Theorem. The remainder of the book is devoted to coding theory and is independent of the information theory portion. After a brief discussion of general families of codes, the author discusses linear codes (including the Hamming, Golay, and Reed-Muller codes), finite fields, and cyclic codes (including the BCH, Reed-Solomon, Justesen, Goppa, and Quadratic Residue codes). An appendix reviews relevant topics from modern algebra.
This book is "information theory-light" (approximately 120 pages) and "coding theory-heavy" (approximately 300 pages).
The book covers many families of codes, and this breadth is definitely its strength. As the series title, "Graduate Texts in Mathematics", and the Springer-Verlag imprint suggest, this text is not an "easy read". It likely works best as a strong reference for encoding and decoding theory within the broader context of information theory.
If you are a beginner in coding theory, whether approaching it from mathematics or computer science, I would suggest this book to get acquainted with the nuts and bolts. Obviously, this text is crucial for graduate students, professors, and post-docs in the area. If you are a beginner in information theory, however, I would suggest another book!