A shallow grip on neural networks (What is the "universal approximation theorem"?)
Published 2024-04-23
(I have never formally studied neural networks.... is it obvious? 👉👈)
The original manga:
[LLPS92] M. Leshno, V. Y. Lin, A. Pinkus, and S. Schocken. Multilayer feedforward networks with a non-polynomial activation function can approximate any function. Neural Networks, 6(6):861–867, 1993.
________________
Timestamps:
00:00 - Intro (ABCs)
01:08 - What is a neural network?
02:37 - Universal Approximation Theorem
03:37 - Polynomial approximations
04:26 - Why neural networks?
05:00 - How to approximate a continuous function
05:55 - Step 1 - Monomials
07:07 - Step 2 - Polynomials
07:33 - Step 3 - Multivariable polynomials (buckle your britches)
09:35 - Step 4 - Multivariable continuous functions
09:47 - Step 5 - Vector-valued continuous functions
10:20 - Thx 4 watching
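The steps in the timestamps build up polynomial approximations; here is a minimal sketch of the same universal-approximation idea by a different constructive route (not the video's proof): a one-hidden-layer ReLU network whose weights are chosen so the network is exactly the piecewise-linear interpolant of a target function. All names (`relu_interpolant`, `net`) are my own; numpy only.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def relu_interpolant(f, knots):
    """Return a one-hidden-layer ReLU network that interpolates f at `knots`.

    The network is f(x0) + sum_i c_i * relu(x - x_i), where the c_i are the
    slope changes of the piecewise-linear interpolant of f on the knots.
    """
    knots = np.asarray(knots, dtype=float)
    y = f(knots)
    slopes = np.diff(y) / np.diff(knots)          # slope on each interval
    coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))  # slope changes
    bias = y[0]

    def net(x):
        x = np.asarray(x, dtype=float)
        # hidden layer: one ReLU unit per interior breakpoint
        hidden = relu(x[..., None] - knots[:-1])
        return bias + hidden @ coeffs

    return net

# Approximate sin on [-pi, pi]; refining the knots shrinks the sup-norm error.
knots = np.linspace(-np.pi, np.pi, 50)
net = relu_interpolant(np.sin, knots)
xs = np.linspace(-np.pi, np.pi, 1000)
err = np.max(np.abs(net(xs) - np.sin(xs)))
```

With 50 knots the sup-norm error over the interval is already well below 0.01; doubling the knots roughly quarters it, which is the uniform convergence the theorem promises for continuous targets on a compact set.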
All Comments (19)
-
It’s t-22 hours until my econometrics final, I have been studying my ass off, I’m tired, I have no idea what this video is even talking about, I’m hungry and a little scared.
-
Came for the universal approximation theorem, stayed for the humor (after the first pump up I didn't understand a word). Great video!
-
As someone who is really interested in pure maths, I think that YouTube should really have more videos like these, keep it up!
-
super underrated channel
-
5:07 phew, this channel is gold. Basic enough that I understand what's going on as an applied ML engineer, and smart enough that I feel like I would learn something. Subscribed.
-
I have no idea what I just watched
-
This is by far the math channel with the best jokes. Sadly, I don't know any Chinese, so I couldn't figure out who 丩的層化 is. Best any translator would give me was "Stratification of ???"...
-
Did not expect to see a Jim's Big Ego reference here
-
Re 6:20: The secant-line approximation converges at least pointwise. But for the theorem we need uniform (sup-norm) convergence, and I don't see why that holds for the secant approximation.
-
I don't think I'm part of the target group for this video ( i have no idea what the fuck you are talking about) but it was still entertaining and allowed me to feel smart whenever I was able to make sense of anything ( I know what f(x) means) so have a like and a comment, and good luck with your future math endeavors!!
-
@6:22 missed opportunity for the canonical recall from gradeschool joke
-
Great content
-
banger vid
-
Any videos coming about Kolmogorov Arnold networks?
-
I suppose I can show the last equality of 9:04 using induction on monomial operators?
-
Okay, but does the manga have good applications? Does it train faster or something? 😊 (Please help me, I like mathing but the world is corrupting me with its engineering)
-
your linguistic articulation is extremely specific and 🤌🤌🤌
-
Can you share the pdf of the notes you show in the video?