This book introduces the reader to the basic mathematics used in neural network calculation. It assumes only a knowledge of college algebra and computer programming. The book begins by showing how to calculate the output of a neural network and then moves on to more advanced training methods such as backpropagation, resilient propagation, and Levenberg-Marquardt optimization. The mathematics needed by these techniques is introduced along the way. Mathematical topics covered include first and second derivatives, Hessian matrices, gradient descent, and partial derivatives, and all mathematical notation introduced is explained. Neural networks covered include the feedforward neural network and the self-organizing map. This book provides an ideal supplement to our other neural network books and is ideal for readers without a formal mathematical background who seek a more mathematical description of neural networks.
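(As a taste of where the book starts, here is a minimal sketch, not taken from the book, of calculating the output of a tiny feedforward network with sigmoid activations; the layer sizes and weight values below are arbitrary assumptions for illustration only.)

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Arbitrary toy network: 2 inputs -> 3 hidden neurons -> 1 output
    inputs   = np.array([0.1, 0.9])
    w_hidden = np.array([[0.2, -0.4, 0.7],
                         [0.5,  0.1, -0.3]])   # input-to-hidden weights, shape (2, 3)
    b_hidden = np.array([0.1, 0.0, -0.2])
    w_output = np.array([0.6, -0.8, 0.3])      # hidden-to-output weights, shape (3,)
    b_output = 0.05

    hidden = sigmoid(inputs @ w_hidden + b_hidden)   # hidden-layer activations
    output = sigmoid(hidden @ w_output + b_output)   # final network output
    print(output)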
Not really an introduction to the mathematical theory underlying neural networks, but rather a walkthrough, with figures, of how a simple neural network is set up, how its weights are assigned, and how those weights are updated under a few different learning algorithms.
Useful for a layman interested in the nuts and bolts of how neural networks operate, or for a programmer who might want to play around with neural networks for fun, but only a very small step forward for anyone wanting to develop a real-world use case or a firm theoretical grounding in the area. Given the discount purchase price, this was worthwhile for me.
The book does provide some useful pointers to other resources on the topic, and the author's website has some excellent articles. It may well be worth getting the other books in this series if you are interested in this topic from a hobby perspective or are just starting out.
As a basic introduction, this book eases the first steps in understanding how neural networks work. For me it met the goals I had, which were a general notion of the mathematical mechanism behind a neural network, how neurons are "fired" or not, how the neural network is trained (updating the weights), and what the basic difference (in mathematical terms) is between an unsupervised network and a supervised one.
The book falls somewhat short of Heaton's goal of drawing an unbroken line from the target audience (algebra-proficient computer programmers) to the subject matter, but it was a pretty good attempt. I don't think anyone is going to fully understand this book without separately studying derivatives and matrix operations before reading it.
But if you're into it, make sure you have Wikipedia open to help you unpack statements like "The LU decomposition takes the Hessian, which is a matrix of the second derivatives of the partial derivatives of the output of each of the weights... if you have never heard the term 'second derivative' before, the second derivative is the derivative of the first derivative." ...Oh, so that's it. Got it. ;)
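(To unpack that quoted passage a little: a minimal sketch, not from the book, that approximates the Hessian of a toy single-neuron error function by finite differences; the function names and the toy error function are assumptions for illustration only.)

    import numpy as np

    # Toy squared error for one linear neuron: E(w) = (x . w - target)^2
    x, target = np.array([1.0, 2.0]), 0.5

    def error(w):
        return (np.dot(x, w) - target) ** 2

    def hessian(f, w, eps=1e-5):
        # H[i, j] = second partial derivative of f with respect to w_i and w_j,
        # i.e. the derivative of the first derivative, estimated numerically.
        n = len(w)
        H = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                wpp = w.copy(); wpp[i] += eps; wpp[j] += eps
                wpm = w.copy(); wpm[i] += eps; wpm[j] -= eps
                wmp = w.copy(); wmp[i] -= eps; wmp[j] += eps
                wmm = w.copy(); wmm[i] -= eps; wmm[j] -= eps
                H[i, j] = (f(wpp) - f(wpm) - f(wmp) + f(wmm)) / (4 * eps ** 2)
        return H

    print(hessian(error, np.array([0.3, -0.1])))   # ~ 2 * outer(x, x) for this quadratic error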
That said, this is a good no-filler overview of the 'under-the-hood' math behind neural networks, describing quite well the functional advantages of various machine learning paradigms.
Good overview of the mathematics behind neural networks. Much better than most books I have read on the subject. For a broader overview, I suggest reading the user guide for Encog. Yes, the user guide to a framework is good enough to learn from...
This book mostly lives up to its description of being accessible to those with high-school math who are active in CS. It served as a great reintroduction to certain math equations for me and, to be honest, was a bit nostalgic. At times, however, there are spikes in the difficulty of what's covered. Don't make this the first book on neural networks you read, but make sure it's one of them.
Jeff did a great job of elaborating on the complexity of the math in artificial neural networks, especially for someone who barely knows what a neural network is. He explains why certain formulas exist and why one is better suited to certain cases than another.
Very interesting; I had to read through it twice and review much of the math. It was interesting and well put together. I just found I wanted a bit more! Worth the read if you are interested in the field!