Scaling Neural Tangent Kernels via Sketching and Random Features

Credit: Kevin Wang

Date
Feb 1, 2022 6:00 PM — 7:00 PM

Kevin presented Scaling Neural Tangent Kernels via Sketching and Random Features, which uses sketching and random feature generation to speed up the computation of neural tangent kernels (NTKs). The presentation focused largely on introducing the NTK itself, a tool for analyzing the behavior of very wide (and infinitely wide) neural networks. NTKs made a big splash in machine learning theory in 2018 by offering a novel approach to analyzing neural network training, and there is still plenty of research ground left to cover with them.
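As a rough sketch of the underlying object (not code from the paper): the empirical NTK of a model f(x; θ) is K(x, x') = ⟨∇_θ f(x), ∇_θ f(x')⟩, the inner product of parameter gradients at two inputs. The minimal example below computes this for a hypothetical one-hidden-layer ReLU network with scalar output, writing out the gradients by hand; the exact Gram matrix built this way is what sketching and random-feature methods aim to approximate cheaply.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 3, 64  # input dimension and hidden width (illustrative sizes)
W = rng.normal(size=(m, d)) / np.sqrt(d)  # first-layer weights
v = rng.normal(size=m) / np.sqrt(m)       # second-layer weights

def grad_theta(x):
    """Flattened gradient of f(x) = v . relu(W x) with respect to (W, v)."""
    pre = W @ x                        # pre-activations
    act = np.maximum(pre, 0.0)         # ReLU activations
    mask = (pre > 0).astype(float)     # ReLU derivative
    dW = np.outer(v * mask, x)         # d f / d W
    dv = act                           # d f / d v
    return np.concatenate([dW.ravel(), dv])

def ntk(x1, x2):
    """Empirical NTK entry: inner product of parameter gradients."""
    return grad_theta(x1) @ grad_theta(x2)

x, y = rng.normal(size=d), rng.normal(size=d)
K = np.array([[ntk(x, x), ntk(x, y)],
              [ntk(y, x), ntk(y, y)]])
# K is a symmetric positive semidefinite Gram matrix, as any kernel requires.
```

Forming K exactly costs one full gradient per input and a dot product per pair, which is what becomes prohibitive at scale and motivates the sketching and random-feature approximations the paper studies.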

Supplemental Resources

Kernel trick
Professor Feizi’s lecture
Rajat’s blog post
Paper #1, which introduces NTKs
Paper #2, which gives polynomial bounds on NTK computational complexity and introduces the CNTK
Paper #3, an enhanced CNTK
