Start your deep learning journey with Andrew Ng here:

All the credit goes to the Lex Fridman Podcast. Make sure to check out the full episodes and make it a habit to listen to this insightful podcast.

Best SQL Course Online (Medium article):

In this video, I’ve compiled some of the best machine learning advice from the Lex Fridman Podcast. I tried to feature the brightest minds of our time currently working on AI, including Andrew Ng, George Hotz, Yann LeCun, Ilya Sutskever, Sam Altman, and Demis Hassabis, among many others.

Hope you enjoy it and get enough motivation out of it to catapult you on your AI learning journey.

00:00:00 Andrew Ng
00:00:33 George Hotz
00:00:59 Andrew Ng
00:01:21 Andrej Karpathy
00:01:54 Andrew Ng
00:02:12 George Hotz
00:02:50 Joscha Bach
00:03:04 Ilya Sutskever
00:03:26 Yann LeCun
00:04:08 Demis Hassabis
00:05:00 Ilya Sutskever
00:05:40 Wojciech Zaremba
00:06:02 Sergey Levine
00:06:33 Yann LeCun
00:07:01 Ilya Sutskever
00:07:31 Yann LeCun
00:08:53 Sam Altman

Links to original videos:
Andrew Ng:
George Hotz:

Ishan Misra:

Andrej Karpathy:

Joscha Bach:

Ilya Sutskever:

Yann LeCun:

Demis Hassabis:

Wojciech Zaremba:

Sergey Levine:

Sam Altman:


47 thoughts on “Genius Machine Learning Advice for 10 Minutes Straight”
  1. Here's some advice: start now! Just do it. Stop wasting time trying to do things perfectly, because over time small and imperfect steps can really add up to a big improvement in your skills.

  2. Business teams ask these kinds of questions because they don’t know what ML can do. They don’t understand it. It’s the job of a PM who understands ML/AI to work with customers and figure out which of the customer’s problems ML can solve.

  3. 00:04 Don't waste time collecting unnecessary data
    01:27 Dive deep into problem-solving for effective learning
    02:42 Becoming an expert through 10,000 hours of deliberate work
    04:00 Choose creation over consumption for satisfaction and impact
    05:26 Key advice for the next generation
    06:52 Reimplementing at different levels of abstraction is a powerful way to understand machine learning (see the sketch after these comments)
    08:09 Machine learning embraces sloppiness through cost functions
    09:34 Intelligence is inseparable from learning

  4. I like that the video ends with Sam Altman speaking on the need to be cautious about taking advice from people, and on knowing what one actually wants, after such a long series of advice

  5. I was a hacker, then an engineer, then a scientist. ML only works if you "do the science": test different things and KEEP RECORDS. Constantly test hypotheses and look at the rate of change. Everything else is just guesswork. You can get lucky, or you can get good and consistent.

  6. Deep learning is constrained by the absurdly inefficient backpropagation algorithm.

    As an alternative, try the Neural Network Builder (NNB), which is lightning-fast with zero output error while controlling overfitting and underfitting at will. There is free DEMO software with a graphical interface for Windows.

    Since YouTube erases the links I could place in this comment (it has happened in the past), you will have to look for the NNB REFERENCES article on the HAL Open Science platform and other open research and academic sites.

    The crucial fact is that classical neural networks (with any number of perceptron layers, and any number of perceptron units in each layer) are equivalent to geometric polyhedra. Instead of using the data vectors to backpropagate the error of an initial network chosen blindly, by chance or "intuition", use the data to constructively and economically define a polyhedron K that fits the data, then trivially translate K into the three-layer network N that solves your recognition problem. Yes, three layers suffice; forget the hundred-layer nightmares. This approach also provides rule extraction, although that feature is not available in the DEMO.

    Around backpropagation a large culture has grown that creates all sorts of barriers including intellectual inertia, personal success, financial interests, dedicated processors, silly investments like NVIDIA, and other flows of money.

    It is a natural law that the backpropagation culture will resist renovation by looking away from the facts.

    But facts will eventually prevail.

    Best regards to all
    Daniel Crespin

  7. In a world of loss functions, your video is a series of local minima, each capturing moments of greatness inspired by unique experiences. These points of progress, shaped by data and lessons, guided by attention, feed the journey forward to the global minimum, propagating toward a true general representation in latent spaces.
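
To make the advice in comments 3 and 5 concrete, here is a minimal sketch of reimplementing a learner at the lowest level of abstraction: plain-Python gradient descent on a mean-squared-error cost function, with a printed record of every step. The names and numbers (predict, mse_cost, the toy data, the learning rate) are illustrative choices, not anything shown in the video.

    # Plain-Python gradient descent on a mean-squared-error cost function.
    # No framework: the point is to see every moving part of the learner.

    def predict(w, b, x):
        return w * x + b

    def mse_cost(w, b, data):
        # The "sloppiness" a cost function embraces: we only ask to be
        # close to the targets on average, never exactly right everywhere.
        return sum((predict(w, b, x) - y) ** 2 for x, y in data) / len(data)

    def gradient_step(w, b, data, lr=0.01):
        # Hand-derived gradients of the MSE cost with respect to w and b.
        n = len(data)
        dw = sum(2 * (predict(w, b, x) - y) * x for x, y in data) / n
        db = sum(2 * (predict(w, b, x) - y) for x, y in data) / n
        return w - lr * dw, b - lr * db

    data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1)]  # roughly y = 2x
    w, b = 0.0, 0.0
    for step in range(201):
        w, b = gradient_step(w, b, data)
        if step % 50 == 0:  # keep records of every run, as comment 5 urges
            print(f"step={step:3d}  cost={mse_cost(w, b, data):.4f}  w={w:.3f}  b={b:.3f}")

Rewriting the same loop one level up (say, with a framework's autograd and optimizer) and checking that the two runs agree is exactly the change of abstraction level the comment describes.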
