Convex optimization problems arise frequently in many different fields. A comprehensive introduction to the subject, this book shows in detail how such problems can be solved numerically with great efficiency. The focus is on recognizing convex optimization problems and then finding the most appropriate technique for solving them. The text contains many worked examples and homework exercises and will appeal to students, researchers and practitioners in fields such as engineering, computer science, mathematics, statistics, finance, and economics.
First, note that as of 2006 you could get a pdf of this book for free on Stephen Boyd's website. So that's worth an extra star right there.
I learned convex optimization out of this book, and I use it as a reference. In particular, I like chapter 3 on convex functions, and chapter 2 on convex sets. They contain all the basic results in a compact but easy to read form. They also cover quasi-convexity in a comprehensive way, which I don't believe any of the other standard texts do. Chapter 5 on duality theory is also pretty good. The rest of the book I am less fond of -- a couple chapters on applications (although the statistical estimation chapter is interesting in places, as is function fitting), and then discussion of the algorithms. The algorithm section is probably pretty good if you often need to know details about interior point methods, but I don't. The appendices summarize some of the less well-known nooks of linear algebra commonly used in convex optimization.
There are a lot of really nice exercises, but beware: there are some really difficult ones mixed in, so if you get stuck it might be because the problem is wicked hard and not because you are stupid. Also, some of the exercises assume a certain amount of linear algebra wizardry. There are a finite number of tricks, so once you learn them you are set, but if you go in without having someone to ask questions of, you could leave a lot of problems unfinished.
Fantastic book on convex optimization. I wouldn't say I really "read" it from cover to cover, but rather used it as an accompaniment to Boyd's lectures (https://www.youtube.com/watch?v=McLq1...), which directly follow the book. Boyd does a great job of both giving an intuitive understanding of the various parts of the subject and handling them rigorously, usually through visuals and some proofs for interested readers. His explanation and coverage of duality were especially noteworthy. The exercises were also really well done, both comprehensive and reasonably challenging.
The book is divided into three main sections: basic theory, applications, and numerical optimization (i.e. implementation) details. I worked through the first two pretty closely, but mostly just skimmed the third. The parts I worked through were similarly well laid out and explained, although for people interested specifically in the numerics, Numerical Optimization by Nocedal seems like a better book. Great book overall!
I really liked the authors' style of explaining through examples and intuition; it made the concepts easy to remember and apply to my own problems. Highly recommend the two-part lecture series as well.
Something of a coffee table book for convex optimization in particular and nonlinear programming in general. It's a good introduction to the happy cases, but doesn't give much help for understanding or reasoning about the more common non-convex cases.
Boyd's technical writing is unrivaled in its clarity, in my view, and motivated much of my writing as a professional scientist communicating mathematics. This is his seminal work (though Boyd's Introduction to Applied Linear Algebra is also a must-read), and it shows. The book is partitioned into three parts (Theory, Applications, and Algorithms), and opens with a brief but surprisingly insightful bird's-eye view of mathematical optimization per se. Boyd's views here have, I think, withstood the test of time, and are not in conflict with the onslaught of nonlinear programming methods popularized in modern machine learning (which are now often the only methods taught and/or understood by students in computer science departments). If one reads this book cover to cover, I feel confident she comes away with a perspective much like the following, which I think is true to Boyd's personal views and to much of the ethos of this book.
In many, many applications of interest, framing the problem as a convex optimization problem yields an enormous improvement in effectiveness (let's say, the first 85% of the improvement in your problem's objective, as a semi-arbitrary number). It is possible to improve even further with nonlinear programming methods, bespoke application-specific insight, etc., but very often a strong foundation in the methods contained in this book rewards the applied mathematician with significant improvements over non-systematic methods (hand-designed solutions from earlier eras, unjustified heuristics, etc.). There are surely problems that do not exhibit this character (maybe next-token prediction, although it would be interesting to see the reduction in performance using a convex method), but this is a good general rule for the practicing scientist/engineer.
This book will enrich your understanding of a vast array of techniques split across dozens of fields (with their own idiosyncratic terminology): statistical inference, information theory, signal processing, electrical engineering, machine learning, etc. In many ways this book contains much of the "kernel" that underlies related ideas across all of these fields. One leaves with a sort of liberated and unified view of much of what is going on, and strong intuitive ideas that allow one to predict when certain techniques should be expected to work (and how to justify that they do).
The first section on theory is, in a word, tight. That is not to say terse, or lacking in generosity (in fact, Boyd is one of the most generous mathematical writers), and it is entirely possible to understand with just a basic familiarity with calculus and linear algebra. Many well-made figures and examples clarify the core ideas, and the final chapter on Duality is a real tour de force that beautifully simplifies the blasted Lagrange multipliers taught in high school calculus, the KKT optimality conditions, and, importantly, the trifecta of the analytic, geometric, and practical relationships between the primal and the dual.
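For a sense of what that trifecta buys you, here are the KKT conditions for the book's standard-form problem, written out from memory (so check the notation against Chapter 5). For a convex problem with differentiable \(f_i\), \(h_i\) and strong duality, a pair \((x^\star, (\lambda^\star, \nu^\star))\) is primal-dual optimal exactly when

\[
\begin{aligned}
& f_i(x^\star) \le 0, \quad i = 1,\dots,m, \qquad h_i(x^\star) = 0, \quad i = 1,\dots,p, \\
& \lambda_i^\star \ge 0, \qquad \lambda_i^\star f_i(x^\star) = 0, \quad i = 1,\dots,m, \\
& \nabla f_0(x^\star) + \sum_{i=1}^{m} \lambda_i^\star \nabla f_i(x^\star) + \sum_{i=1}^{p} \nu_i^\star \nabla h_i(x^\star) = 0.
\end{aligned}
\]

Primal feasibility, dual feasibility, complementary slackness, stationarity: the chapter makes each of these feel geometric rather than like bookkeeping.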
The second section on applications is extremely empowering. After spending a few weeks fighting hard through section one (not because of dryness or poor writing, but because of the depth and subtlety of the ideas), this is the payoff, and it is sweet. Approximation and fitting (with a side tour into probability theory and statistical interpretations) is a better treatment of the topic than is found in most other field-specific works (e.g., on statistical inference or machine learning, though some signal processing texts do OK). Statistical estimation is similar, and really served to simplify and unify much of my view of statistical inference, hypothesis testing, and detector design. The geometric problems are both fascinating and extremely useful in practice.
The final section, on Algorithms, is, as flagged in the introduction, not a treatment of SOTA methods or serious computational considerations. Boyd writes that "we have chosen just a few good algorithms, and describe only simple, stylized versions of them (which, however, do work well in practice)", which is true to form. I was able to spin up a simple (but reasonably reliable) interior point method just from the content in this book (see the sketch below), but a serious implementation would require more support and backing from Golub's Matrix Computations, a copy of Numerical Recipes, etc. This is not a shortcoming, and it is perfectly well advertised within the first few pages of the book. The section absolutely delivers on its promise of introducing the important ideas underlying how these algorithms work.
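To give a flavour of what I mean by "simple but reasonably reliable", here is a bare-bones log-barrier method for an inequality-form LP (minimize c^T x subject to Ax <= b), using Newton centering steps and a backtracking line search. This is my own sketch of the ideas in Part III, not code from the book, and it assumes you can supply a strictly feasible starting point x0.

```python
# Bare-bones log-barrier interior point method for the LP
#     minimize    c^T x
#     subject to  A x <= b
# A sketch of the ideas in Part III, not the book's code. Assumes a strictly
# feasible x0 (A x0 < b) and a reasonably well-conditioned problem.
import numpy as np

def barrier_lp(c, A, b, x0, t=1.0, mu=10.0, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    m = A.shape[0]
    while m / t > tol:                                  # m/t bounds the duality gap
        # Centering: minimize t*c^T x - sum(log(b - A x)) by Newton's method
        for _ in range(50):
            s = b - A @ x                               # slacks, kept strictly positive
            grad = t * c + A.T @ (1.0 / s)
            hess = A.T @ ((1.0 / s**2)[:, None] * A)
            dx = np.linalg.solve(hess, -grad)           # Newton step
            lam2 = float(-grad @ dx)                    # Newton decrement squared
            if lam2 / 2.0 <= 1e-10:
                break
            # Backtracking line search (alpha=0.25, beta=0.5), staying strictly feasible
            f0 = t * (c @ x) - np.sum(np.log(s))
            step = 1.0
            while np.any(b - A @ (x + step * dx) <= 0):
                step *= 0.5
            while (t * (c @ (x + step * dx))
                   - np.sum(np.log(b - A @ (x + step * dx)))) > f0 - 0.25 * step * lam2:
                step *= 0.5
            x = x + step * dx
        t *= mu                                         # tighten the barrier
    return x
```

The parameter choices (mu = 10, backtracking with alpha = 0.25 and beta = 0.5) sit in the stylized ranges the book recommends, if I remember them right.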
If a scientist or engineer can tolerate only a single "mathematically-oriented" book, this is a great choice. The fact that it's freely available online, with a truly priceless lecture collection posted on YouTube (where one can enjoy the uniquely powerful teacher that Boyd is, and benefit from his side commentary and stand-up comedy), doesn't weigh toward the quality of the book itself, but should be noted nonetheless. Boyd is nothing short of the GOAT.
(๑•̀ㅂ•́)و✧ The standard reference on optimisation theory is Stephen Boyd's Convex Optimisation. The pedagogical clarity of this book presents the subject to the reader with unrivaled accuracy, ranging from basic considerations of convex sets, through duality theory and the KKT conditions, to interior-point methods. The convexity of economic problems stands at the heart of mechanism design, finance, and even machine learning, so it is a must-read for a broad spectrum of scholars.
The Lagrangian duality framework is especially enlightening:
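Writing it from memory (so my notation may drift slightly from Boyd's): for the standard-form problem of minimising \(f_0(x)\) subject to \(f_i(x) \le 0\) and \(h_i(x) = 0\), the Lagrangian and dual function are

\[
L(x, \lambda, \nu) = f_0(x) + \sum_{i=1}^{m} \lambda_i f_i(x) + \sum_{i=1}^{p} \nu_i h_i(x),
\qquad
g(\lambda, \nu) = \inf_{x} L(x, \lambda, \nu),
\]

and the dual problem maximises \(g(\lambda, \nu)\) subject to \(\lambda \succeq 0\). Weak duality \(g(\lambda, \nu) \le p^\star\) always holds; for convex problems satisfying a constraint qualification such as Slater's condition, strong duality holds as well, which is what makes the framework so useful.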
Boyd also provides MATLAB implementations (the CVX toolbox), making this text practical for both theorists and applied researchers. However, it is more engineering-focused, so readers in economics may prefer to pair it with Romer's Advanced Macroeconomics to see applications in economic growth.
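For readers working in Python rather than MATLAB, the same modelling style is available in CVXPY (also from Boyd's group). A toy non-negative least-squares problem of my own invention, just to show the flavour:

```python
# Toy example (mine, not from the book): non-negative least squares in CVXPY,
# the Python counterpart of the MATLAB CVX toolbox mentioned above.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))   # made-up data, purely for illustration
b = rng.standard_normal(30)

x = cp.Variable(10)
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)),  # convex objective
                     [x >= 0])                                # convex constraints
problem.solve()
print(problem.value, x.value)
```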
A true delight. Five stars. Would optimise again. ✧٩(•́⌄•́๑)
Last year I picked it up, thinking about finishing it, but I couldn't. This time I'm reading it chapter by chapter and I'm almost halfway through. It's rigorous: unless you do the end-of-chapter exercises, don't expect to fully grasp the subtleties involved in identifying convexity. If you're mathematically inclined and have a good background in linear algebra, vector calculus, and probability, you'll be able to relate to most of the material and find some of it intuitive too. ML researchers interested in developing new architectures may find this quite enlightening, so to speak.
This is a very friendly book on convex optimization, explaining almost every line of the derivations. The exercises are a bit harder (considering how easy the text is), but solutions are available online, which helps a lot. The book doesn't cover much convex analysis or modern topics like stochastic optimization, although that's understandable considering its target audience.
A good book for learning the fundamentals of CVX Optimization, but not the right book if you want to learn how to solve convex problems.
Part I is good, but I am not a fan of Parts II and III. Boyd describes barrier-based primal interior point methods well, but does not do a great job in explaining primal-dual interior point methods (which seem to be much more common in practice).
This book brings me both joy and misery. The book is what I would consider the definition of "verbose" to be :)). Sometimes it can be a bit difficult to follow because Prof. Boyd keeps re-explaining everything. But when you need this verbosity, it is a godsend! Definitely gonna reread this many times in the foreseeable future!
Good book on the basics of optimisation, but it lacks coverage of stochastic methods and of modern topics like variance reduction and non-smooth problems.
Amazing and free text on convex optimization. There is also a two-semester Stanford course on YouTube which follows this book closely and is taught by one of the authors (Boyd). The applications section was slightly less relevant to my own interests, but the first and last parts serve as excellent and very readable references. Highly recommended to anyone who wants to dive into this branch of mathematics.
Beyond the specific topic area, this book excels in the problem-solving methodology employed by the author. The applications can be useful in a broader array of subject areas.