ThatMathThing
  • Videos: 181
  • Views: 607,898
Why the world NEEDS Kolmogorov Arnold Networks
We discuss the importance of the KAN method and the contribution it makes.
//Watch Next
The Real Analysis Survival Guide ruclips.net/video/v5rD0B-zfXw/видео.html
The Analyticity of the Laplace transform ruclips.net/video/FIMkbFQL6XM/видео.html
Introduction to Control Theory ruclips.net/video/0v4WFmOm764/видео.html
//Papers
KAN Manuscript Ziming Liu et al - arxiv.org/abs/2404.19756
Kolmogorov's Paper - cs.uwaterloo.ca/~y328yu/classics/Kolmogorov57.pdf
//Books
V. I. Arnold - Mathematical Methods of Classical Mechanics - amzn.to/3wMuQWe
Steve Brunton and J. Nathan Kutz - Data Driven Science and Engineering amzn.to/4daHtem
Holger Wendland - Scattered Data Approximation amzn.to/4daHtem
Gregory Fasshaue...
Views: 21,937

Videos

Fourier Series and their Convergence (Proof!) (Theory of Machine Learning)
1.3K views · a month ago
We give a proof of the convergence of Fourier Series and talk about orthogonal bases. //Watch Next The Real Analysis Survival Guide ruclips.net/video/v5rD0B-zfXw/видео.html The Analyticity of the Laplace transform ruclips.net/video/FIMkbFQL6XM/видео.html Introduction to Control Theory ruclips.net/video/0v4WFmOm764/видео.html //Books Steve Brunton and J. Nathan Kutz - Data Driven Science and Eng...
Reproducing Kernels and Functionals (Theory of Machine Learning)
1.9K views · 2 months ago
In this video we give the functional analysis definition of a Reproducing Kernel Hilbert space, and then we investigate approximations within this space using moments as data. We draw a comparison with polynomial best approximations over L^2, and get comparable results with a new basis function made from kernels. //Watch Next The Real Analysis Survival Guide ruclips.net/video/v5rD0B-zfXw/видео....
Putting DATA in Hilbert Spaces: Proving the Riesz Theorem (Theory of Machine Learning)
2.6K views · 2 months ago
In this video we introduce and prove the Riesz representation theorem for Hilbert spaces. This is the fundamental theorem that enables much of machine learning and data science, since it allows us to embed information and measurements from the real world into a Hilbert function space. //Watch Next The Real Analysis Survival Guide ruclips.net/video/v5rD0B-zfXw/видео.html The Analyticity of the L...
What is a BEST approximation? (Theory of Machine Learning)
3.9K views · 2 months ago
Here we start our foray into Machine Learning, where we learn how to use the Hilbert Projection Theorem to give a best approximation of a function. This way we learn a function using an optimization procedure. To do this, we use the moments of a function. //Watch Next The Real Analysis Survival Guide ruclips.net/video/v5rD0B-zfXw/видео.html The Analyticity of the Laplace transform ruclips.net/v...
The Hilbert Projection Theorem - Full Proof! (Theory behind Machine Learning)
3.5K views · 3 months ago
In this video we embark on the study of spaces of (potentially) infinite dimension, Hilbert spaces. Here we give the full proof of the Hilbert Projection Theorem, which is the most critical piece of their study. //Watch Next The Real Analysis Survival Guide ruclips.net/video/v5rD0B-zfXw/видео.html The Analyticity of the Laplace transform ruclips.net/video/FIMkbFQL6XM/видео.html Introduction to Con...
when are you too old to learn math?
3.6K views · 5 months ago
When is it too late to start learning math? And how can you start? To try everything Brilliant has to offer, free, for a full 30 days, visit brilliant.org/ThatMathThing/ . //Books Rudin - amzn.to/3KeL0IC Dummit and Foote Abstract Algebra amzn.to/3QxGvwN Folland Real Analysis amzn.to/3dCuCao Hungerford Algebra amzn.to/3pIJyGV Pedersen Analysis NOW amzn.to/3c27MZo //Watch Next Real Analysis Surviva...
How to get ahead in your math classes
3.9K views · 5 months ago
So you have the time and you want to get ahead in your math courses. Here I talk about the best resources for doing so, whether you are a beginning learner or a graduate student in mathematics. To try everything Brilliant has to offer, free, for a full 30 days, visit brilliant.org/ThatMathThing/ . //Books Rudin - amzn.to/3KeL0IC Dummit and Foote Abstract Algebra amzn.to/3QxGvwN Folland Real Analy...
Can an AI Learn Physics from Data? (Parameter Identification)
1.6K views · 6 months ago
It took millennia for us to arrive at the right ideas behind physics. From the Babylonians to the Greeks to Kepler and then Newton, they all made their models from the data that they had available to them. Today, I'll tell you how we can use data to automatically produce physical models through AI concepts; this approach is called "parameter identification." //Books Brunton and Kut...
Uniform Continuity over Compact Sets
1.3K views · 8 months ago
Here we talk about continuity and its connections with compactness. Compactness was first defined around the same time as metric spaces, by Maurice Frechet, and compact sets are more or less a generalization of finite sets. Their interactions with continuous functions have opened up the entire field of functional analysis over the past century, so let's look at the interactions between these important ...
Applying for TENURE in Florida's Political Scene
1.1K views · 8 months ago
I am applying for tenure at a Florida university while Ron Desantis is working to eliminate it. Let's talk about what tenure is, where it came from, and why it is (or isn't) important. To be clear, the statements in this video are made as a private citizen and not in an official capacity as an employee of the University of South Florida. //Books Rudin - amzn.to/3KeL0IC Dummit and Foote Abstract...
Proof and Intuition for the Weierstrass Approximation Theorem
2.8K views · 8 months ago
This is an in-depth look at the Weierstrass Approximation Theorem and the proof that can be found in Rudin's Principles of Mathematical Analysis. This proof resolves the polynomial approximation problem, a moment problem, and can even be seen as previewing Dynamic Mode Decompositions. //Articles Anton Schep - Weierstrass' Proof of the Weierstrass Approximation Theorem people.math.sc.edu/schep/we...
Recommendation Letters: The Good, the Bad, and the Career-Killing
1.5K views · 8 months ago
Your letters of recommendation are pivotal in any application, but why do they matter in 2023 and how can a single letter ruin your entire career? //Books Rudin - amzn.to/3KeL0IC Dummit and Foote Abstract Algebra amzn.to/3QxGvwN Folland Real Analysis amzn.to/3dCuCao Hungerford Algebra amzn.to/3pIJyGV Pedersen Analysis NOW amzn.to/3c27MZo //Watch Next Real Analysis Survival Guide ruclips.net/vid...
What is Continuity? According to a Mathematician.
2.4K views · 9 months ago
What is continuity and what is the popcorn function? From primitive notions to the modern day, there have been a variety of attempts at understanding continuity and functions. We talk about them here, in this video. To try everything Brilliant has to offer, free, for a full 30 days, visit brilliant.org/ThatMathThing/ . The first 200 of you will get 20% off Brilliant's annual premium subscription....
Surviving your PhD
8K views · 10 months ago
The Riemann Rearrangement Theorem // I can make this sum anything I want
1.7K views · 10 months ago
INSIDE Oppenheimer’s Secret Home
717 views · 10 months ago
Using MATH to make fractals in Photoshop
895 views · a year ago
An Introduction to Sequences in Real Analysis
1.6K views · a year ago
What did we all miss in this 2000 year old problem?
44K views · a year ago
The Art of Impromptu Presentations: How to Deliver a Memorable Talk
1.4K views · a year ago
An Introduction to Cantor and Infinity
1.9K views · a year ago
An Introduction to Compact Sets
9K views · a year ago
What are Metric Spaces and Limit Points?
1.8K views · a year ago
Nailing the Job Talk
1.2K views · a year ago
Defining Exponentials the "EASY" and HARD way
1.9K views · a year ago
Cauchy Schwarz and what to DO with it (corrected) // An Introduction to Real Analysis
3.2K views · a year ago
Mastering the Art of Reading Proofs: By Example
12K views · a year ago
Unlocking the Secrets of the Square Root of 2 in Baby Rudin
3.1K views · a year ago
Calculus is a STUPID name
1.8K views · a year ago

Comments

  • @jkjenkins7205
    @jkjenkins7205 3 days ago

    I was thinking like you once I saw the picture. You clearly have something that looks like an integral with limits.

  • @user-fj9hf4bu9f
    @user-fj9hf4bu9f 4 days ago

    mate you don't need to use up half the screen to show your face typing. you're not photogenic and no one comes here thinking you are, just focus more on the content and reducing the dead air.

    • @JoelRosenfeld
      @JoelRosenfeld 4 days ago

      @@user-fj9hf4bu9f This is a 3-year-old video. Check out my newer stuff. I was just learning how to make videos back then

  • @jblk9669
    @jblk9669 4 days ago

    In your other video, "What did we all miss in this 2000 year old problem," you explained how the Johnson-Jackson girls proved the Pythagorean theorem using trigonometry. In it you said "let us assume A and B are not equal." Well, that is a hypothesis, so you haven't proven a thing. Then, they began their proof using two right triangles, which is basic geometry, and you even mentioned algebra. They may have somewhat proved the theorem using trigonometry, but they began by using geometry. One cannot get to the moon w/o going up. You cannot get to 3 w/o going through 2. Without algebra and geometry, there is no trigonometry.

    • @JoelRosenfeld
      @JoelRosenfeld 4 days ago

      @@jblk9669 What is really important here is that they have a new proof, and they were only in high school when they did it. Whether or not you think the involvement of trigonometry was significant enough, you are free to make that judgement. Just defining a triangle is not a significant use of geometry in my opinion, and that's what you need to set the stage. It's like saying that because we use a polynomial when we write down the fundamental theorem of algebra, the proof using complex analysis must be algebra. Because we had a polynomial, IT HAS TO BE ALGEBRA. But we need to first set the stage for the proof, and then the tools that we use are typically what we talk about when we say we used a different field to prove it.

  • @jblk9669
    @jblk9669 4 days ago

    You make geometry, my favorite subject, seem unnecessary. So, the girls used only trigonometry to create this so-called new proof? What did they initially start with? A right triangle, which is geometry. I applaud their effort, but you cannot get to the moon w/o going up. One cannot get to 3 w/o going through 2. 1+1.9 does not equal 3, only 1+2=3.

  • @jblk9669
    @jblk9669 4 days ago

    In the 60 Minutes video, it was stated that they only used trigonometry to create this proof. So, they used only trigonometry to create this so-called new proof? What did they initially start with? A right triangle, which is geometry. I applaud their effort, but you cannot get to the moon w/o going up. One cannot get to 3 w/o going through 2. 1+1.9 does not equal 3, only 1+2=3.

  • @4thpdespanolo
    @4thpdespanolo 7 days ago

    You’re just going to ignore the Runge phenomenon

    • @JoelRosenfeld
      @JoelRosenfeld 7 days ago

      @@4thpdespanolo I’ve talked about it in previous videos. This is a short, so there isn’t much time to talk about anything really

  • @mohdherwansulaiman5131
    @mohdherwansulaiman5131 7 days ago

    I have tried KAN in 2 prediction problems: SOC estimation and chiller energy consumption prediction. It works.

  • @vipinx8881
    @vipinx8881 9 days ago

    Runge-Kutta just wasn't clicking for me. This video helped a lot. Thanks!

  • @user-eb6mn3dw1v
    @user-eb6mn3dw1v 10 days ago

    You are nuts! Apostol's Analysis has the hardest exercises ever because it expects you to know things that are not covered in the book or in any calculus book that you've been through! Ever encountered Vieta's formulas for sums of roots in an ordinary calculus textbook? I don't think so...

    • @JoelRosenfeld
      @JoelRosenfeld 10 days ago

      Apostol takes more time to go over the details of the proofs than Rudin. It’s a very readable book. Sure, there are hard problems, but there are also a bunch of easier problems. I think the overall number of problems in Apostol is greater than in Rudin (don’t have either next to me right now to check though), which gives space for some harder questions like that.

    • @user-eb6mn3dw1v
      @user-eb6mn3dw1v 10 days ago

      @@JoelRosenfeld I started Rudin's Principles a month ago and I'm already on chapter 3, chugging through exercises faster than any of those in Apostol's Analysis... I don't know what you are talking about. I would definitely give Apostol another go and I hope you are right, but as far as my experience goes - Rudin is easier than Apostol... MUCH easier.

    • @user-eb6mn3dw1v
      @user-eb6mn3dw1v 10 days ago

      Also thanks for the output! Appreciate it.

    • @JoelRosenfeld
      @JoelRosenfeld 10 days ago

      @@user-eb6mn3dw1v all that matters is that you find a book that works for you. I think Rudin is a masterpiece.

    • @user-eb6mn3dw1v
      @user-eb6mn3dw1v 10 days ago

      @@JoelRosenfeld flexibility is powerful.

  • @andysawyer647
    @andysawyer647 11 days ago

    A lot of Thales' beliefs come from Kemet. The understanding would go on to become hermetic teaching. This possibly came from the Kemetic schools being expressed as exonyms.

    • @JoelRosenfeld
      @JoelRosenfeld 10 days ago

      interesting! I’ll have to follow up on that

  • @pabloagogo1
    @pabloagogo1 14 days ago

    The function defined at the beginning, where we get 1 when x=y and 0 when x!=y - isn't that the definition of the Kronecker delta? Just curious.

    • @JoelRosenfeld
      @JoelRosenfeld 14 days ago

      Yep! When you take inner products between orthonormal vectors you get the Kronecker delta function
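      A quick numpy sketch of that identity (my own illustrative example, not from the video): orthonormalize the columns of a random matrix, and the pairwise inner products come out as the Kronecker delta, i.e. the identity matrix.

```python
import numpy as np

# Orthonormalize the columns of a random matrix via QR.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

# Inner products of orthonormal vectors: <q_i, q_j> = 1 if i == j else 0,
# which is exactly the Kronecker delta -- so Q^T Q is the identity matrix.
G = Q.T @ Q
print(np.allclose(G, np.eye(5)))  # True
```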

  • @Lilina3456
    @Lilina3456 15 days ago

    Hi, thank you so much for the video. Can you please do the implementation of KANs?

    • @JoelRosenfeld
      @JoelRosenfeld 14 days ago

      That’s the plan! Grant writing and travel have slowed me down this summer, but it’s coming. I have three videos in the pipeline, and I think that is the third one.

  • @cecilarthur3579
    @cecilarthur3579 16 days ago

    Unfortunately I solved it... Check the bottom of the post for certain explanations. This is to eliminate numbers that don't need to be checked: Given an arithmetic progression, x to be all numbers, x => 1,2,3,4,5,... Eliminating all odd numbers leaves 2x => 2,3,4,5,... Removing all numbers divisible by 4 [a] rewrites the equation to 4x-2 => 2,6,10,... [b] Inserting into the conjecture leaves 2x-1 => 1,3,5,7,... [c] Infinite elimination: for any function f(x)=nx-1 [e.g. 2x-1] f(x)==>3[f(x)]+1==>3[f(2x)]+1==>(3[f(2x)]+1)/2)==>f(x) e.g. continuing with 2x-1 and compared with nx-1 2x-1 OR nx-1 > 3(nx-1)-1 3nx-2 > 3n(2x)-2 6nx-2 > (6nx-2)/2 3nx-1 [d] EXPLANATION: a - checking for numbers divisible by 4 will always end you up on a previously checked number. b - the expressions are REWRITTEN to fit the arithmetic sequence. c - the entire progression are even numbers. d - since n represents any number at all, the cycle can repeat until the set of all integers is eliminated.

  • @morrning_group
    @morrning_group 17 days ago

    Thank you for this incredibly informative video on Kolmogorov Arnold Networks! 🤯💻 It's such a deep dive into machine learning concepts, and I appreciate how you break down complex ideas into understandable explanations. 🌟 I'm curious about the future direction of your channel. 🚀 Are there plans to delve deeper into specific machine learning architectures like Kolmogorov Arnold Networks, or will you explore a broader range of topics within the field? 🤔📈 Additionally, do you have any upcoming collaborations or special projects in the pipeline that your viewers can look forward to? 🌐🔍 Keep up the fantastic work!

  • @thephilosophyofhorror
    @thephilosophyofhorror 18 days ago

    Not about the proof, but Pythagoras lived approximately 2500 years ago.

  • @shoopinc
    @shoopinc 19 days ago

    Would this apply only to the Florida state schools? Or to private institutions as well? That is where the great research will concentrate if tenure is taken away from public institutions.

    • @JoelRosenfeld
      @JoelRosenfeld 19 days ago

      This legislation applies to public universities and colleges. Private institutions are not impacted by this. However, private institutions frequently follow the example of public institutions with a bit of a lag. Also, right now there are not a lot of strong PhD-granting private institutions in Florida. I think UM is the only one.

  • @octaviusp
    @octaviusp 20 days ago

    Do you have any machine learning theory courses that follow this series of videos? I would be happy to see a machine learning theory course from you.

    • @JoelRosenfeld
      @JoelRosenfeld 19 days ago

      Not a formal course yet. It’s something I plan on doing in the future. For now, I am working on a sort of complete series here on RUclips

  • @kabuda1949
    @kabuda1949 21 days ago

    Hello. What if the Gram matrix is positive semi-definite, i.e., at least one eigenvalue is zero? Then how do we find the weights? I believe the matrix is not invertible. Do we use factorization techniques like LU factorization to approximate the weights?

    • @JoelRosenfeld
      @JoelRosenfeld 21 days ago

      Usually we regularize these ill-posed problems. The most common method is ridge regression, which essentially adds lambda I to the matrix before inversion
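      A minimal numpy sketch of that regularization (my own example; the kernel, data, and lambda value are assumptions, not from the video): a Gram matrix built from data with a duplicated point is exactly singular, but adding lambda times the identity makes the system solvable.

```python
import numpy as np

# Hypothetical setup: a Gaussian-kernel Gram matrix over data containing a
# duplicated point, so two columns coincide and G is singular (PSD, rank < 6).
rng = np.random.default_rng(1)
x = rng.standard_normal(6)
x[5] = x[0]  # duplicate point -> identical rows/columns in G
G = np.exp(-(x[:, None] - x[None, :]) ** 2)
y = rng.standard_normal(6)

# Ridge regression: solve (G + lambda * I) w = y instead of G w = y.
lam = 1e-6
w = np.linalg.solve(G + lam * np.eye(6), y)
print(np.all(np.isfinite(w)))  # True: the regularized matrix is invertible
```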

  • @Fox0fNight
    @Fox0fNight 22 days ago

    I've tried my hand at the Collatz Conjecture recently as well for a bit, and I was excited to see someone else also come up with the idea of representing the original integer as a sum of powers of two divided by 3^m. Other things I did were noticing that 2^n-1 always goes to 3^n-1, and noticing that repeated division by two is similar to modular arithmetic, as you're looking at the multiplicative remainder of log2(x). After that I was looking for a way to decompose log2(x+y) and got on a tangent about the operation x⌄y=ln((e^x+e^y)/2), which distributes addition, i.e. c+(x⌄y)=(x+c)⌄(y+c), and looked at what happens when you change the base, which I found is best to do with ((nx)⌄(ny))/n, as when n -> infinity the operation becomes max(x,y), when n -> -inf it becomes min(x,y), and when n -> 0 it becomes (x+y)/2. It looks like a paper bending up and down on the 3D graph, and it is basically just a bunch of ln(cosh(nx))/n stacked next to each other, translated linearly.
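    The observation above that 2^n - 1 always reaches 3^n - 1 is easy to check numerically (my own sketch; `collatz_step` and `hits` are hypothetical helpers, not from the comment):

```python
def collatz_step(x: int) -> int:
    # One Collatz step: 3x+1 on odd inputs, halve even inputs.
    return 3 * x + 1 if x % 2 else x // 2

def hits(start: int, target: int, limit: int = 10_000) -> bool:
    # Does the Collatz trajectory of `start` pass through `target`?
    x = start
    for _ in range(limit):
        if x == target:
            return True
        if x == 1:
            return False
        x = collatz_step(x)
    return False

# Each round maps 3^k * 2^(n-k) - 1 to 3^(k+1) * 2^(n-k-1) - 1,
# so after n rounds 2^n - 1 lands on 3^n - 1 (checked here for n = 2..14).
print(all(hits(2**n - 1, 3**n - 1) for n in range(2, 15)))  # True
```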

  • @yihanwang2233
    @yihanwang2233 23 days ago

    KAN draws inspiration from the Kolmogorov-Arnold representation theorem, though it significantly diverges from and falls below the theorem's original intent and content. It confines its form to compositions of sums of single-variable smooth functions, representing only a tiny subset of all possible smooth functions. This confinement eliminates, by design, the so-called curse of dimensionality. However, there is no free lunch. It is seriously doubtful that this subset is dense within the entire set of smooth functions - though I have not come up with an example yet. If it is indeed not dense, KAN will not serve as a universal function approximator, unlike the multilayer perceptron. Nonetheless, it may prove valuable in fields such as scientific research, where many explicitly defined functions tend to be simple, even if they do not approximate all possible smooth functions.

    • @JoelRosenfeld
      @JoelRosenfeld 22 days ago

      Since I saw your message last night, I have been thinking about this. I think universality is going to be fine. The inner functions in the Kolmogorov Arnold Representation are each continuous. Splines can arbitrarily well approximate continuous functions over a compact set, so we could approximate each one to within some epsilon > 0. The triangle inequality tells us that the overall error between those approximations is bounded by n times epsilon, where n is the dimension of the space. So the approximation of the inner function is fine. The only twist comes with the outer function. The inner functions are all continuous images of compact sets, so if we look at the outer functions restricted to these sets, we are looking for an approximation on the compact image of the inner functions. We can get a spline approximation of the outer functions like that as well that is within some prescribed epsilon. To make sure everything meshes together well, you need something like Lipschitz continuity on the outer functions. That has never been included in the description of them, because the theorems are for general functions, rather than being restricted to smooth functions or other classes. Picking through the proofs, I think it'd be straightforward to get Lipschitz conditions on the outer functions, when the function you are representing is also Lipschitz. With all of those together, I think that basically takes care of what you would need for universality.
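      For concreteness, the triangle-inequality bound sketched in this reply can be written out (my notation, not from the thread): write the representation as $f(x) = \sum_{q=1}^{2n+1} \Phi_q(u_q)$ with $u_q = \sum_p \varphi_{q,p}(x_p)$, approximate each inner function by a spline to within $\varepsilon$ (giving $\hat{u}_q$ with $|u_q - \hat{u}_q| \le n\varepsilon$), assume each outer $\Phi_q$ is $L$-Lipschitz, and take spline approximants $\hat{\Phi}_q$ accurate to $\varepsilon$. Then

```latex
\left| f(x) - \sum_{q=1}^{2n+1} \hat{\Phi}_q(\hat{u}_q) \right|
  \le \sum_{q=1}^{2n+1} \Big( \big|\Phi_q(u_q) - \Phi_q(\hat{u}_q)\big|
      + \big|\Phi_q(\hat{u}_q) - \hat{\Phi}_q(\hat{u}_q)\big| \Big)
  \le (2n+1)\big(L\,n\,\varepsilon + \varepsilon\big),
```

      which can be made arbitrarily small, under the stated Lipschitz assumption on the outer functions.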

    • @yihanwang2233
      @yihanwang2233 22 days ago

      @@JoelRosenfeld Your rationale is based on multidimensional smooth function approximation. This is precisely what does not apply in this situation. Each function in KAN, no matter which layer, is one-variable and smooth. The latter property blocks the original proof of the Kolmogorov-Arnold theorem, and the former blocks the Taylor expansion proof for Sobolev space function approximation, which I think you are talking about. Moreover, there is no essential distinction in KAN between the inner and outer functions, unlike in the KA theorem. The layers are simply recursive stacking. You are trading off between the curse of dimension from universality and simplicity. There is no free lunch.

    • @JoelRosenfeld
      @JoelRosenfeld 22 days ago

      @@yihanwang2233 Ok, I'll give it some more thought. I personally think that there is a good chance of universality pulling through here. But, you never know until you have a proof or a counterexample.

    • @yihanwang2233
      @yihanwang2233 22 days ago

      @@JoelRosenfeld Examine theorem 2.1, which is the crux theorem of the KAN paper, and see what the premise is, as well as its proof steps, to see that universality is nowhere to be found. To be honest, the authors of the paper should have made this point much clearer instead of letting only the experts decipher their claim.

  • @hansisbrucker813
    @hansisbrucker813 23 days ago

    I learned so much today. Btw, can I say that I *love* that Spider-Man poster behind you 😁

    • @JoelRosenfeld
      @JoelRosenfeld 22 days ago

      Glad you like the video! Yeah, I’m a big spidey fan lol

  • @sashayakubov6924
    @sashayakubov6924 24 days ago

    Hold on, I'm reading Wikipedia, and it turns out that Karatsuba (the one who invented his multiplication algorithm) was a student of Kolmogorov!!!

  • @ChristopherWansing
    @ChristopherWansing 24 days ago

    Super interesting. This is something I have been thinking about a lot recently. I wonder how it would be possible to combine this method with further observables that have a strong influence on the actual trajectory. It seems this method is able to derive the "true"/most likely trajectory from many examples of trajectory start-/endpoints alone, but of course it would be much more powerful if it were able to make use of other correlated observations as well. What are your thoughts on this?

  • @sashayakubov6924
    @sashayakubov6924 24 days ago

    Both Russian scientists? Kolmogorov is OK, but "Arnold" does not sound like a Russian surname at all.

    • @JoelRosenfeld
      @JoelRosenfeld 24 days ago

      Vladimir Arnold was indeed a citizen of the USSR. The Soviet government actually interceded to prevent him from getting a Fields Medal because Arnold spoke out against their treatment of dissidents.

  • @momolight2468
    @momolight2468 25 days ago

    This is unbelievable!! Thanks for this very beneficial deep dive. I cannot thank you enough for the second website for data acquisition! I was genuinely struggling, since the first website seems to be temporarily closed! Keep up the fantastic work!

  • @InstaKane
    @InstaKane 25 days ago

    Right…..

  • @mrpocock
    @mrpocock 25 days ago

    I don't think there's anything in principle preventing a KAN layer or layers from being put inside a normal deep network. So there may be a space of interesting hybrids that do interesting things. For example, the time-wise sum of a time-wise convolution with a (2 layer?) KAN can learn to perform attention, without needing all those horrible Fourier features.

  • @hughsalter7769
    @hughsalter7769 26 days ago

    thanks

  • @hughsalter7769
    @hughsalter7769 26 days ago

    what is R1?

    • @JoelRosenfeld
      @JoelRosenfeld 26 days ago

      In the United States there is a collection of universities which are called Doctoral Granting Research Universities. They are essentially categorized as R1, R2, and “Doctoral Granting.” R1 universities are those that have the greatest research output of all the categories.

  • @ycyang2698
    @ycyang2698 26 days ago

    Appreciate your effort making serious videos

  • @germangonzalez3063
    @germangonzalez3063 27 days ago

    The way you teach is impressive. I have been trying to understand all this for a while and finally came across your channel.

    • @JoelRosenfeld
      @JoelRosenfeld 27 days ago

      I’m glad you like it! It takes a lot of effort to put this all together. Happy to have you here!

  • @avigailhandel8897
    @avigailhandel8897 29 days ago

    Hello! (54 years old) taking graduate-level numerical analysis in the fall. Any suggestions on how to prepare over the summer?

    • @JoelRosenfeld
      @JoelRosenfeld 28 days ago

      There are a lot of resources online that can help get you ready. MIT OpenCourseWare also has a bunch of lectures and study materials, like worksheets and exams, that you can use. At the grad level, numerical analysis can vary quite a lot, but you can take your time to familiarize yourself with the standard topics.

  • @naninano8813
    @naninano8813 29 days ago

    4:19 I have the exact same book in my library, but tbh I never made the connection that the current KAN hype in the AI world is connected to the same Arnold

    • @JoelRosenfeld
      @JoelRosenfeld 29 days ago

      It’s really a great textbook. Arnold did a lot of great work, and was nominated for a Fields Medal back in 1974. I’ll talk more about it in my next video.

  • @BN-hy1nd
    @BN-hy1nd 29 days ago

    I am a retired teacher of secondary school maths in the UK. I couldn't have done this. Well done, ladies😍

  • @AT.inbetween
    @AT.inbetween 29 days ago

    fresh eyes! congrats!! It goes even further back, to Egypt

  • @stretch8390
    @stretch8390 29 days ago

    Man, what a channel. Awesome to find videos that don't shy away from technical topics.

  • @nicolaskrause7966
    @nicolaskrause7966 a month ago

    Thanks for the video. I'd attacked the problem a while after you and had followed a similar path. Nice to see that I wasn't totally off base, even if I got stuck a bit before you put together your inverted representation!

    • @nicolaskrause7966
      @nicolaskrause7966 a month ago

      I think my favourite fact was finding the binary pattern 10101..., which corresponds to numbers n such that 3n+1 = 2^m. 5 is a good example: 3*5+1 = 16. When performing the addition in binary, it was like a zipper closing up!

  • @David-pq6wt
    @David-pq6wt a month ago

    I enjoy your videos. I have a bachelor's in mechanical engineering and have been out of school, working, for 10 years. I always loved math and wanted to take real analysis in school, but it didn't work out. For the past few months I have been going through calc again with George Simmons' book, with the hope of going through Understanding Analysis by Abbott after that. It's so satisfying when you work hard on a problem and finally get it. Looking forward to more videos

  • @vtrandal
    @vtrandal a month ago

    I take an applied math perspective, and a nonrigorous one at that. I don't use the Nyquist theorem as a theorem per se, but as a criterion or requirement for sampling a band-limited signal fast enough so that periodic images of the sampled signal do not overlap in the frequency domain. I prove Shannon's sampling theorem as a special case of the convolution theorem when one of the functions is an ideal sampling function (Dirac comb). Sampling a band-limited signal (compact support in the frequency domain) results in a function with a periodic spectrum, by application of the convolution theorem and the fact that the Fourier transform of a Dirac comb is also a Dirac comb (but in the frequency domain). Convolving a Dirac comb with a band-limited signal results in a function with a periodic spectrum. QED. The Nyquist criterion then becomes the requirement that the signal be sampled fast enough to spread the periodic images of the sampled signal apart in the frequency domain (so the images do not overlap and cause frequency folding).
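    A small numpy sketch of the aliasing this criterion prevents (my own example; the frequencies are chosen for illustration): a 6 Hz sine sampled at 8 Hz, below its Nyquist rate of 12 Hz, produces exactly the same samples as a negated 2 Hz sine, so the two are indistinguishable after sampling.

```python
import numpy as np

fs = 8.0                   # sampling rate in Hz, below 2 * 6 Hz
k = np.arange(32)          # sample indices
t = k / fs                 # sample times

x_fast = np.sin(2 * np.pi * 6.0 * t)
# 6 Hz folds to |6 - fs| = 2 Hz, with a sign flip from the folding.
x_alias = -np.sin(2 * np.pi * 2.0 * t)

print(np.allclose(x_fast, x_alias))  # True: the samples coincide
```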

  • @tamineabderrahmane248
    @tamineabderrahmane248 a month ago

    I think that KAN will be stronger than MLP in physics-informed neural networks

  • @Hyperion1722
    @Hyperion1722 a month ago

    The square within a square is a more elegant proof.

    • @JoelRosenfeld
      @JoelRosenfeld a month ago

      Certainly, but this isn’t about making the most elegant proof. Just a new one.

  • @DJWESG1
    @DJWESG1 a month ago

    I thought the point was to embed a core logic to reduce size.

  • @1NV4S10N
    @1NV4S10N a month ago

    Do you think no one else made an attempt before them, as opposed to not being able to figure it out? There was a guy in 2009 who also found the answer.

  • @tablen2896
    @tablen2896 a month ago

    Hey, great video. If you take constructive(?) criticism (and you may have already noticed when editing): you should reduce the sensitivity of the focus adjustment on your recording device, or set it to a fixed value. It tries to focus on the background (computer screen) and foreground (your hands) instead of you.

    • @JoelRosenfeld
      @JoelRosenfeld a month ago

      lol yeah I noticed. Actually put B-roll over some especially bad spots. I almost recorded again. I’ll look into it. Something I struggle with

  • @avyuktamanjunathavummintal8810
    @avyuktamanjunathavummintal8810 a month ago

    Great video! Eagerly awaiting your next one! :)

    • @JoelRosenfeld
      @JoelRosenfeld a month ago

      Me too lol! I’ve been digging through the proofs to find one that is digestible for a RUclips video. I actually had a whole recording done, only to realize it wouldn’t work too well. Trying to give the best video possible :)

    • @avyuktamanjunathavummintal8810
      @avyuktamanjunathavummintal8810 5 days ago

      @@JoelRosenfeld , (I understand your desire for perfection, but) you know, I'd rather you upload that recorded video. 😅:)

    • @JoelRosenfeld
      @JoelRosenfeld 5 days ago

      @@avyuktamanjunathavummintal8810 I appreciate that. I should have a new video up this weekend. Still need time to make that video breaking down the theory, but I have something that’ll hopefully bridge the gap a little.

  • @Adventure1844
    @Adventure1844 a month ago

    It's an old method, from 2021: ruclips.net/video/eS_k6L638k0/видео.html

    • @JoelRosenfeld
      @JoelRosenfeld a month ago

      The use of Kolmogorov Arnold Representations as a foundation for neural networks goes back at least as far as the 1980s and 1990s. In fact, I think the Kolmogorov Arnold Representation as a two-layer neural network appeared in the first volume of the journal Neural Networks. You can find more if you look into the work of David Sprecher, who has worked on this problem for 60 years. The innovation in this work comes in the form of layering, which positions it as an alternative to deep neural networks, but with learnable activation functions.

  • @peterhall6656
    @peterhall6656 a month ago

    This sort of high-level discussion is always seductive. I look forward to the nuts and bolts to see whether the first-date performance can be sustained.....

    • @JoelRosenfeld
      @JoelRosenfeld a month ago

      Absolutely, we are still seeing the beginnings of this method. I’m optimistic, but we will see!

  • @mikebibler6556
    @mikebibler6556 a month ago

    I'm here after the 60 Minutes story finally arrived in my content algorithm... an entire year after-the-fact. Apparently I've been too focused on old-white-male-politics for the past few years to be rewarded with content I actually like.

  • @petersuvara
    @petersuvara a month ago

    Who needs maths when AI does it all for us? Oh... yeah... it's not AI. :) It's MATHS! :)

  • @reversicle212
    @reversicle212 a month ago

    awesome vid man