A rich, narrative explanation of the mathematics that has brought us machine learning and the ongoing explosion of artificial intelligence
Machine-learning systems are making life-altering decisions for us: approving mortgage loans, determining whether a tumour is cancerous, or deciding whether someone gets bail. They now influence discoveries in chemistry, biology and physics: the study of genomes, extra-solar planets, even the intricacies of quantum systems.
We are living through a revolution in artificial intelligence that shows no sign of slowing down. This major shift is built on simple mathematics, some of which goes back centuries: linear algebra and calculus, the stuff of seventeenth- and eighteenth-century mathematics. Indeed, by the mid-1850s much of the groundwork had already been laid. It took the development of computer science and the kindling of 1990s computer chips designed for video games to ignite the explosion of AI that we see all around us today. In this enlightening book, Anil Ananthaswamy explains the fundamental maths behind AI and suggests that natural and artificial intelligence might follow the same underlying mathematical rules.
As Ananthaswamy resonantly concludes, to make the most of these wondrous technologies we need to understand their profound limitations; the clues lie in the maths that makes AI possible.