A shallow grip on neural networks (What is the "universal approximation theorem"?)

Published 2024-04-23
The "universal approximation theorem" is a catch-all term for a family of theorems about the ability of neural networks to approximate arbitrary continuous functions. How exactly (or approximately) can we go about doing so? Fortunately, the proof of one of the earliest versions of this theorem comes with an "algorithm" (more or less) for approximating a given continuous function to whatever precision you want.
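If you want to poke at the claim yourself, here's a minimal sketch (mine, not from the video or the paper) of the kind of approximator the theorem is about: one hidden layer with a non-polynomial activation (tanh here), random inner weights, and the output weights fit by least squares. The function name `fit_shallow_net` and all the parameter choices are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_shallow_net(f, n_hidden, x):
    # One hidden layer: random inner weights/biases, non-polynomial
    # activation, output weights solved by least squares on a grid.
    W = rng.normal(scale=3.0, size=n_hidden)
    b = rng.normal(scale=3.0, size=n_hidden)
    H = np.tanh(np.outer(x, W) + b)          # hidden activations, shape (len(x), n_hidden)
    c, *_ = np.linalg.lstsq(H, f(x), rcond=None)
    return lambda t: np.tanh(np.outer(t, W) + b) @ c

x = np.linspace(-1, 1, 400)
net = fit_shallow_net(np.sin, n_hidden=50, x=x)
print(np.max(np.abs(net(x) - np.sin(x))))    # sup-norm error on the grid
```

The theorem's actual construction is deterministic rather than random-weights-plus-least-squares, but the shape of the object (finite sum of activations of affine maps) is the same.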

(I have never formally studied neural networks.... is it obvious? 👉👈)

The original manga:
[LLPS93] M. Leshno, V.Y. Lin, A. Pinkus, S. Schocken, 1993. Multilayer feedforward networks with a nonpolynomial activation function can approximate any function. Neural Networks, 6(6):861--867.

________________
Timestamps:

00:00 - Intro (ABCs)
01:08 - What is a neural network?
02:37 - Universal Approximation Theorem
03:37 - Polynomial approximations
04:26 - Why neural networks?
05:00 - How to approximate a continuous function
05:55 - Step 1 - Monomials
07:07 - Step 2 - Polynomials
07:33 - Step 3 - Multivariable polynomials (buckle your britches)
09:35 - Step 4 - Multivariable continuous functions
09:47 - Step 5 - Vector-valued continuous functions
10:20 - Thx 4 watching
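The trick behind Step 1 (Monomials) can be sketched in a few lines: a k-th finite difference of σ(wx) in the weight w, taken near w = 0, recovers x^k · σ^(k)(0), and non-polynomiality guarantees some bias where the needed derivative is nonzero. Below is my own demo, assuming σ = exp (so σ^(k)(0) = 1 for every k); the helper name `monomial_from_activation` is made up.

```python
import numpy as np
from math import comb

def monomial_from_activation(x, k, h=1e-2, sigma=np.exp):
    # k-th forward finite difference in the weight w, at w = 0, of sigma(w*x):
    #   (1/h^k) * sum_j (-1)^(k-j) * C(k,j) * sigma(j*h*x)
    # converges to x^k * sigma^(k)(0) as h -> 0. For sigma = exp that's x^k.
    acc = np.zeros_like(x, dtype=float)
    for j in range(k + 1):
        acc += (-1) ** (k - j) * comb(k, j) * sigma(j * h * x)
    return acc / h**k

x = np.linspace(-1, 1, 201)
approx = monomial_from_activation(x, k=3)
print(np.max(np.abs(approx - x**3)))  # small sup-norm error on [-1, 1]
```

Once you have monomials, Steps 2-5 are linear combinations, products of one-variable pieces, Stone-Weierstrass, and one output network per coordinate.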

All Comments (19)
  • @connor9024
    It’s t-22 hours until my econometrics final, I have been studying my ass off, I’m tired, I have no idea what this video is even talking about, I’m hungry and a little scared.
  • Came for the universal approximation theorem, stayed for the humor (after the first pump up I didn't understand a word). Great video!
  • @gbnam8
    as someone who is really interested in pure maths, i think that youtube should really have more videos like these, keep it up!
  • @dinoscheidt
5:07 phew, this channel is gold. Basic enough that I understand what's going on as an applied ML engineer, and smart enough that I feel like I would learn something. Subscribed.
  • @decare696
    This is by far the math channel with the best jokes. Sadly, I don't know any Chinese, so I couldn't figure out who 丩的層化 is. Best any translator would give me was "Stratification of ???"...
  • @antarctic214
    To 6:20. The secant line approximation converges at least pointwise. But for the theorem we want to construct uniform/sup-norm convergence, and I don't see why that holds for the secant approximation.
  • @Baer2
    I don't think I'm part of the target group for this video ( i have no idea what the fuck you are talking about) but it was still entertaining and allowed me to feel smart whenever I was able to make sense of anything ( I know what f(x) means) so have a like and a comment, and good luck with your future math endeavors!!
  • @98danielray
    I suppose I can show the last equality of 9:04 using induction on monomial operators?
  • @kuzuma4523
    Okay but has the manga good application? Does it train faster or something?😊 (Please help me I like mathing but world is corrupting me with its engineering)
  • your linguistic articulation is extremely specific and 🤌🤌🤌
  • @korigamik
    Can you share the pdf of the notes you show in the video?