Jorge Brasil: Before Machine Learning Vol. 1
It often happens to me that I get lost when working through math textbooks. As a non-mathematician, I find it hard to cope with many authors' tendency to prove every theorem and to be as abstract and pure as possible. Of course, mathematics cannot be done without precision, and a superficial treatment is not enough either.
One particular book I read recently is Jorge Brasil's Before Machine Learning Vol. 1 - Linear Algebra.1 It motivated me in a special way to get back into the theory of linear algebra. The book summarized in one sentence: it is a speedrun through linear algebra aimed specifically at the use cases of SVD (singular value decomposition) and PCA (principal component analysis). No more and no less. In 164 pages, the author succeeds in conveying the most important points in an understandable and sometimes humorous way and in explaining them with examples.
The book begins with a compact introduction to vectors, vector operations, the dot product and vector spaces. The fourth chapter is then devoted to matrices, linear transformations, determinants and the "Eigen Stuff" (eigenvalues and eigenvectors). Together, these two chapters comprise around 100 pages and read like an exciting story. Well-chosen graphics make it easier to grasp the theory quickly.
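As a quick illustration of what the "Eigen Stuff" boils down to in practice, here is a minimal NumPy sketch; it is not taken from the book, just the standard eigendecomposition of a small symmetric matrix.

```python
import numpy as np

# A small symmetric matrix; its eigenvectors turn out to be orthogonal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: A @ v = lambda * v for each eigenpair.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # eigenvalues 3 and 1 (order may vary)
print(eigenvectors)   # columns are the corresponding eigenvectors

# Check the defining equation for the first eigenpair.
v = eigenvectors[:, 0]
print(np.allclose(A @ v, eigenvalues[0] * v))  # True
```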
Chapters three and four lay the foundation for the following two chapters: Singular Value Decomposition (SVD) and Principal Component Analysis (PCA). These are non-trivial topics that require a fair amount of experience with matrices. I can't say that I could reproduce everything after reading the book. But in my opinion that is not the aim of the book; otherwise it would probably need many exercises and more examples. Instead, the author manages to motivate you to engage more deeply with the theory, while at the same time creating an understanding of how linear algebra is used in machine learning.
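For readers who want a taste of where the book is heading, here is a rough sketch (again not from the book) of how the two topics typically look in NumPy: SVD factors a matrix into U, S and Vᵀ, and PCA can be obtained by applying SVD to mean-centred data.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                # 100 samples, 3 features

# Singular value decomposition: X = U @ diag(S) @ Vt
U, S, Vt = np.linalg.svd(X, full_matrices=False)
print(np.allclose(X, U @ np.diag(S) @ Vt))   # True

# PCA via SVD on the mean-centred data
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt                              # principal axes (one per row)
explained_variance = S**2 / (len(Xc) - 1)    # variance along each axis
scores = Xc @ Vt.T                           # data projected onto the axes
print(explained_variance)
```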
Apart from minor typographical errors, this book is highly recommended for anyone who wants an introduction (or re-introduction) to linear algebra. On Reddit the author writes: "I wrote these books to resemble a story rather than a traditional textbook, presenting concepts in context to avoid isolation."2 This is exactly what he has achieved: the book motivates you to engage more with the theory and to focus on a context or application.
Normally I wouldn't link to Amazon, but unfortunately the website mldepot.co.uk mentioned in the book is currently down.