22 July 2022

DeepMind's AI for Mathematics Breakthrough Explained

taylajeffcott

DeepMind recently published a paper in Nature proposing a new paradigm for how machine learning can help mathematicians pose new conjectures and ultimately prove new theorems. This methodology was used to establish breakthrough results in mathematics: discovering a new connection in knot theory (a relationship between algebraic and geometric knot invariants) and making significant progress on a 40-year-old conjecture in representation theory. In this video, I explain the knot theory result in depth, first giving a soft introduction to knot theory and a rapid crash course in supervised machine learning, and then diving into the paper.
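For those curious what the supervised-learning setup looks like in code, here is a minimal sketch: a model is trained to predict a binary label (standing in for an algebraic invariant, such as the sign of the signature) from numeric features (standing in for geometric invariants). This is purely illustrative, on synthetic data, and is not the paper's actual model, features, or dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for geometric invariants; the linear relationship
# below is invented purely for illustration.
n = 2000
X = rng.normal(size=(n, 3))
y = (X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=n) > 0).astype(float)

# Logistic regression trained by gradient descent on the cross-entropy loss:
# a toy supervised model predicting the label from the features.
w = np.zeros(3)
b = 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid predictions
    grad_w = X.T @ (p - y) / n               # gradient of the loss w.r.t. w
    grad_b = np.mean(p - y)                  # gradient of the loss w.r.t. b
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean((p > 0.5) == (y == 1))
print(f"training accuracy: {accuracy:.2f}")
```

In the paper, high predictive accuracy of such a model is the signal that a relationship between the invariants exists, which attribution methods then help turn into a precise conjecture.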

#DeepMind #AI #math #machine #learning #knot #theory

The paper: "Advancing mathematics by guiding human intuition with AI"
https://www.nature.com/articles/s41586-021-04086-x

Acknowledgments: Alex Davies, Andras Juhasz, and Marc Lackenby generously offered their consultation for the production of this video. (Any shortcomings of the video are, of course, my own.)

Disclaimer: This is not an official DeepMind product.

Further introductory reading:
C. Adams. The Knot Book
N. Buduma. Fundamentals of Deep Learning

Corrections and Notes:
00:03:22 : S^1 denotes a circle
00:16:28 : For geometric invariants, I meant to say introduce auxiliary structures to understand knots (not to understand those geometric structures themselves)
00:22:07 : Technically, a torus does not have a "hole in it". The surface is smooth and the "hole" is just a visual description of its shape in 3D space.
00:25:41 : Lambda is called the "longitudinal translation". Mu is called the "meridional translation".
00:53:40, 00:54:52, 00:56:00 : A misaligned paste-over caused the iPad background to be shifted up.

Timestamps:
00:00:00 : Introduction
00:03:07 : Knot Theory Basics
00:08:53 : When are two knots equivalent?
00:10:44 : Knot invariants
00:13:10 : Why care about knots? (examples: math, physics, biology)
00:15:10 : Fundamental question motivating knot theory work: Is there a relationship between algebraic and geometric invariants?
00:17:40 : Example: Crossing number
00:20:27 : Signature of a knot
00:21:21 : Slope of a (hyperbolic) knot
00:23:04 : Torus is parallelogram with opposite sides identified
00:25:41 : Definition of slope in terms of maximal cusp torus parameters (lambda and mu)
00:30:58 : Machine learning crash course
00:33:32 : MNIST classification example
00:34:55 : Knot theory classification
00:36:51 : Supervised learning
00:38:04 : Neural network architecture: Fully connected network
00:50:25 : Softmax function
00:54:30 : Maximum likelihood
00:58:27 : Cross entropy loss (negative log likelihood)
00:58:56 : High level summary of neural network training
01:01:09 : Paper deep dive
01:08:44 : Formulation of the right conjecture requires deep understanding of math
01:11:53 : Symbolic regression was not successful
01:14:25 : A theorem finally
01:15:10 : Wrap up

Twitter:
@iamtimnguyen

Webpage:
http://www.timothynguyen.org
