
Transaction Processing: Concepts and Techniques

The key to client/server computing.

Transaction processing techniques are deeply ingrained in the fields of
databases and operating systems and are used to monitor, control and update
information in modern computer systems. This book will show you how large,
distributed, heterogeneous computer systems can be made to work reliably.
Using transactions as a unifying conceptual framework, the authors show how
to build high performance distributed systems and high availability
applications with finite budgets and risk.

The authors provide detailed explanations of why various problems occur as
well as practical, usable techniques for their solution. Throughout the book,
examples and techniques are drawn from the most successful commercial and
research systems. Extensive use of compilable C code fragments demonstrates
the many transaction processing algorithms presented in the book. The book
will be valuable to anyone interested in implementing distributed systems
or client/server architectures.

1128 pages, Hardcover

First published September 1, 1992


About the author

Jim Gray

3 books · 7 followers
James Nicholas "Jim" Gray (born January 12, 1944; lost at sea January 28, 2007; declared deceased May 16, 2012) was an American computer scientist who received the Turing Award in 1998 "for seminal contributions to database and transaction processing research and technical leadership in system implementation."

Gray was born in San Francisco, California, the second child of a mother who was a teacher and a father in the U.S. Army; the family moved to Rome where Gray spent most of the first three years of his life, learning to speak Italian before English. The family then moved to Virginia, spending about four years there, until Gray's parents divorced, after which he returned to San Francisco with his mother. His father, an amateur inventor, patented a design for a ribbon cartridge for typewriters that earned him a substantial royalty stream.

After being turned down for the Air Force Academy he entered the University of California, Berkeley as a freshman in 1961, paying $67 per semester. To help pay for college he worked as a co-op for General Dynamics, where he learned to use a Monroe calculator. Discouraged by his chemistry grades, he left Berkeley for six months, returning after an experience in industry he later described as "dreadful." Gray earned his B.S. in Engineering Mathematics (Math and Statistics) in 1966.

Gray pursued his career primarily working as a researcher and software designer at a number of industrial companies, including IBM, Tandem Computers, and DEC. He joined Microsoft in 1995 and was a Technical Fellow for the company when he was lost at sea.

Gray contributed to several major database and transaction processing systems. IBM's System R was the precursor of the SQL relational databases that have become a standard throughout the world. For Microsoft, he worked on TerraServer-USA and SkyServer.

He assisted in the development of Virtual Earth. He was also one of the co-founders of the Conference on Innovative Data Systems Research.

Ratings & Reviews



Community Reviews

5 stars
24 (52%)
4 stars
16 (34%)
3 stars
6 (13%)
2 stars
0 (0%)
1 star
0 (0%)
Displaying 1 - 3 of 3 reviews
Vasil Kolev
1,131 reviews · 198 followers
July 31, 2012
It took me a while to read through, but I enjoyed every moment of it. The only thing I can compare it to is reading "TCP/IP Illustrated, Vol. 1" by W. Richard Stevens - it was extremely interesting and eye-opening.

The book describes transaction processing - not just transactions in a database, but any kind of transaction with ACID properties, covering all kinds of actions, including "real" ones (moving rods in a nuclear reactor, dispensing money from an ATM), whether local or distributed. There's a good amount of code in it that supplements the explanations (although the coding style is a bit ugly, probably to save space - writing directly after the "{", with the closing "}" on the same line, will always look bad).
(Most of the figures are useless, but some can help a bit.)

The initial chapters on the basics, models, etc. are something that should be taught at most universities and to all IT people. The chapter on isolation (locking, etc.) is probably the best description of the problem I've seen, and although there has been a lot of research on lock-less schemes, and they are preferred these days, the basics haven't changed and still apply in the same way.

The last few chapters describe something very close to the design of a transactional database (excluding the SQL parser and some other pieces). They answer a lot of questions about how the database works underneath and how it handles the different data-organization cases.

It's a book by people who have written such systems, who have seen, debugged, and solved these problems, and who have written papers about them. At some point I found 20 pages describing problems and solutions that had taken me months to work through.

I'd recommend a few of the chapters even for non-technical people, because of the insight they can give into computing.
19 reviews · 2 followers
June 9, 2021
This book is a classic for understanding how relational (and other) databases are built. If you want to build a new database, for whatever reason, start here. The principles of LLR (Locking, Logging, and Recovery) are beautifully described. It's something I feel a serious computer science student with an interest in databases should start taking a stab at.
