
The Nature of Statistical Learning Theory

The aim of this book is to discuss the fundamental ideas behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on the main results of learning theory and their connections to fundamental problems in statistics. These include:

* the setting of learning problems based on the model of minimizing the risk functional from empirical data
* a comprehensive analysis of the empirical risk minimization principle, including necessary and sufficient conditions for its consistency
* non-asymptotic bounds on the risk achieved using the empirical risk minimization principle
* principles for controlling the generalization ability of learning machines with small sample sizes, based on these bounds
* the Support Vector methods that control generalization ability when estimating functions from small sample sizes.

The second edition of the book contains three new chapters devoted to further development of learning theory and SVM techniques. These include:

* the theory of direct methods of learning based on solving multidimensional integral equations for density, conditional probability, and conditional density estimation
* a new inductive principle of learning.

Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists. Vladimir N. Vapnik is Technology Leader at AT&T Labs-Research and Professor at London University. He is one of the founders of statistical learning theory and the author of seven books published in English, Russian, German, and Chinese.
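As a quick orientation to the risk-minimization setting the description refers to, here is the standard formulation of the empirical risk minimization (ERM) principle; the symbols R, R_emp, the loss L, the parameters α, and the sample size ℓ follow common usage and are assumptions of this sketch, not a quotation from the text:

    R(\alpha) = \int L(y, f(x, \alpha)) \, dP(x, y)    % expected risk under the unknown distribution P(x, y)
    R_{\mathrm{emp}}(\alpha) = \frac{1}{\ell} \sum_{i=1}^{\ell} L(y_i, f(x_i, \alpha))    % empirical risk over the observed sample

Since P(x, y) is unknown, ERM selects the function f(x, α) that minimizes R_emp(α); the non-asymptotic bounds mentioned above relate this empirical minimum to the true risk R(α).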

334 pages, Paperback

First published December 14, 1998

12 people are currently reading
254 people want to read

About the author

Vladimir N. Vapnik

3 books · 7 followers

Ratings & Reviews



Community Reviews

5 stars: 18 (52%)
4 stars: 9 (26%)
3 stars: 6 (17%)
2 stars: 0 (0%)
1 star: 1 (2%)
Displaying 1 - 4 of 4 reviews
Michiel
383 reviews · 90 followers
May 7, 2013
Well, it's a book about the mathematical foundations of statistical learning, so it is not an easy read. It gives an interesting overview of the theory behind support vector machines and how they can be applied to classification, regression, and density estimation. I liked the philosophical intermezzos.
Wojciech
3 reviews · 2 followers
April 17, 2022
An absolute must-read for anyone who wants to learn about machine learning and/or artificial intelligence. It is the very source of the notion of machines "learning": a beautiful, strict mathematical concept that allowed the whole field to form. Even if modern-day deep learning has moved away from its SLT roots, reading it should be an obligatory part of getting a PhD in the field.
1,621 reviews · 22 followers
January 18, 2020
Not light reading :D

I was just looking through this yesterday, thinking about how Vapnik's ideas on generalization might still be relevant to developing "Strong AI".
