This week’s book is The Master Algorithm by Pedro Domingos. I wanted to read a machine-learning-focused book, and this was named one of the best introductory books on the subject. And it sure was a great introductory book, because it didn’t beat around the bush and went right into the subject. It was obvious this man knew what he was talking about.
Pedro Domingos is a professor of Computer Science at the University of Washington. He won the prestigious SIGKDD Innovation Award, and he is also a fellow of the Association for the Advancement of Artificial Intelligence. This book received positive feedback from other famous IT writers, so I thought: hey, this one will be good!
All in all, the brilliance of this book is that it explains the science of machine learning in simple, humorous, and yet professional words. He even uses Frank Abagnale Jr.’s imitation skills as an example! Going from Hume’s problem of induction to the “no free lunch” theorem, passing by the famous volate/frequency curve, Darwin’s algorithm (and the importance of speed in said algorithm), Bayes’ Theorem, Markov logic networks, and much more, this book is a truly perfect introduction to the world of ML. The Master Algorithm sounds far from our reality, but this book demystifies it beautifully.
Here are what I thought were the most interesting parts of the book:
Some proposed candidates for the Master Algorithm that didn’t work out
- Microprocessor (a huge computer seen as a big algorithm)
- NOR gate
- statistical package
- U(F) = 0 (any equation can be rewritten as an expression equal to zero)
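The NOR gate entry is a fun one, because NOR really is universal for Boolean logic, just not for learning. A quick sketch (my own illustration, not from the book) of NOT, OR, and AND rebuilt from NOR alone:

```python
def nor(a, b):
    """The only primitive we allow ourselves."""
    return int(not (a or b))

def not_(a):
    # NOT x == x NOR x
    return nor(a, a)

def or_(a, b):
    # OR is just the negation of NOR
    return not_(nor(a, b))

def and_(a, b):
    # De Morgan: a AND b == (NOT a) NOR (NOT b)
    return nor(not_(a), not_(b))

# Every Boolean function can be composed from these three,
# so NOR alone suffices — universal, but it doesn't learn anything.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
```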
The Five Tribes of Machine Learning
- Symbolists: learning is manipulating symbols
  - Replacing equations with other equations
  - Need initial knowledge about the data
  - How to incorporate existing data and combine knowledge on the fly
  - Inverse deduction (induction as the inverse of deduction)
- Connectionists: reverse engineer the brain
  - Learning works like the brain
  - Adjust the strengths of the connections between neurons
  - Find the connections that may have caused the errors
  - Compare the output with the desired result and change the connections in the layers of neurons accordingly
- Evolutionaries: natural selection
  - Simulate on a computer what created us
  - Solve the problem of learning structure
  - Create the brain that those adjustments can then fine-tune
  - Genetic programming: mates and evolves computer programs like in nature
- Bayesians: uncertain/probabilistic inference
  - How to deal with noisy/incomplete/contradictory information
  - Use Bayes’ Theorem and its derivatives
  - Incorporate new evidence into beliefs in the most efficient way
- Analogizers: recognize similarities between situations and infer other similarities from them
  - Judge how similar two things are
  - Support vector machines
  - Figure out which experiences to remember and how to combine them
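To give a flavour of the Bayesians’ tribe, here is a minimal sketch of Bayes’ Theorem in action. The numbers are the classic made-up medical-test setting (my own illustration, not an example from the book): a 99% sensitive, 95% specific test for a disease with 1% prevalence.

```python
def posterior(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' Theorem.

    P(D|+) = P(+|D)·P(D) / [P(+|D)·P(D) + P(+|¬D)·P(¬D)]
    """
    true_pos = sensitivity * prior                 # P(+|D)·P(D)
    false_pos = (1 - specificity) * (1 - prior)    # P(+|¬D)·P(¬D)
    return true_pos / (true_pos + false_pos)

p = posterior(prior=0.01, sensitivity=0.99, specificity=0.95)
print(f"P(disease | positive) = {p:.3f}")  # ≈ 0.167
```

Even a very accurate test leaves the posterior at only about 17%, which is exactly the kind of belief-updating the Bayesians build their whole tribe around.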
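And for the analogizers, a toy nearest-neighbour classifier captures the “judge how similar two things are” idea in a few lines. The data and the choice of k=3 are my own assumptions, not the book’s:

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote of its k nearest training points.

    train: list of ((x, y), label) pairs.
    """
    by_distance = sorted(train, key=lambda p: math.dist(p[0], query))
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

points = [((0, 0), "cat"), ((1, 0), "cat"), ((0, 1), "cat"),
          ((5, 5), "dog"), ((6, 5), "dog"), ((5, 6), "dog")]
print(knn_classify(points, (1, 1)))   # → cat
print(knn_classify(points, (5, 4)))   # → dog
```

Everything here is “recognizing similarities”: no model is built at all — the learner just remembers its experiences and compares new situations against them.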