All the materials are available in the link below.
Visit for data science blogs
Timestamps:
00:00:00 Introduction
00:01:25 AI vs ML vs DL vs Data Science
00:07:56 Machine Learning And Deep Learning
00:09:05 Regression And Classification
00:18:14 Linear Regression Algorithm
01:07:14 Ridge And Lasso Regression Algorithms
01:33:08 Logistic Regression Algorithm
02:13:52 Linear Regression Practical Implementation
02:28:30 Ridge And Lasso Regression Practical Implementation
02:54:21 Naive Bayes Algorithms
03:16:02 KNN Algorithm Intuition
03:23:47 Decision Tree Classification Algorithms
03:57:05 Decision Tree Regression Algorithms
04:02:57 Practical Implementation Of Decision Tree Classifier
04:09:14 Ensemble Bagging And Boosting Techniques
04:21:29 Random Forest Classifier And Regressor
04:29:58 Boosting, Adaboost Machine Learning Algorithms
04:47:30 K Means Clustering Algorithm
05:01:54 Hierarchical Clustering Algorithms
05:11:28 Silhouette Clustering - Validating Clusters
05:17:46 DBSCAN Clustering Algorithms
05:25:57 Clustering Practical Examples
05:35:51 Bias And Variance
05:43:44 XGBoost Classifier Algorithms
06:00:00 XGBoost Regressor Algorithms
06:19:04 SVM Machine Learning Algorithm
———————————————————————————————————————
►Data Science Projects:
►Learn In One Tutorial
Statistics In 6 Hours:
Machine Learning In 6 Hours:
Deep Learning In 5 Hours:
►Learn In a Week Playlist
Statistics:
Machine Learning:
Deep Learning:
NLP:
►Detailed Playlist:
Stats For Data Science In Hindi:
Machine Learning In English:
Machine Learning In Hindi:
Complete Deep Learning:
All the materials are given below: https://github.com/krishnaik06/The-Grand-Complete-Data-Science-Materials/tree/main
Visit https://www.krishnaik.in/liveclasses for more live classes
Thank you sir! Because of you I got a job as a data analyst
In the Boston house pricing dataset, have you done the data preprocessing?
Super!!!!!
Sir, you are next to god for helping me learn machine learning!
Day 1: 00:00:00
Day 2: 01:07:00
Day 3: 02:13:00
Day 4: 03:20:00
Day 5: 04:09:00
Day 6: 04:47:00
Sir, I can't find the notes, so please send a link to download them. Thank you sir for this great video.
Wonderful explanation for machine learning, thank you Krish
Best explanation!!!
5:38:40 Little correction: if the model performed well on the training data, it has low bias; and if the model performed poorly on the training data, it has high bias.
I don't know if my comment will get attention or not, but I just wanted to say that every single one of Krish Naik sir's videos is extremely good, and I will recommend these videos to everyone who wants an internship or a job in data science, data analysis, or machine learning. Krish sir… thank you so much… because of your videos I got an internship abroad… your videos are so, so good… thank you so much… and these are so far the best videos in terms of everything when it comes to learning ❤❤❤❤
Sir, is it necessary to memorize the mathematical formulas? Can they be asked in interviews? Please reply sir… I really want to know because I am preparing for interviews.
Can you suggest which dataset you are using in this video?
41:42
1:33:12
But what about Python or R? 😢
Does this also contain PCA?
Very basic concepts; good only for semester exams.
If anyone wants notes for this lecture, reply to me…
Watching for the GATE DA 2025 exam.
Your conditional probability of "yes" given "sunny" is wrong.
Is this useful for GATE DA?
Understanding R-Squared and Adjusted R-Squared
Scenario
Suppose we are working on a problem where we aim to predict the price of a house. Initially, we use one feature, the number of bedrooms, and obtain an R-squared (R²) value of 85%. This means that 85% of the variation in house prices is explained by the number of bedrooms.
Next, we add another feature—the location of the house—which is strongly correlated with house price. As expected, the R-squared value increases to 90%, indicating an improved model.
Now, we introduce an irrelevant feature, such as the gender of the person living in the house. Gender has no logical relationship with house price, yet the R-squared value increases to 91%.
This happens because R-squared always increases (or remains the same) when new features are added, even if those features are not actually useful. This can lead to a misleading interpretation, as a model with irrelevant variables might appear to perform better than a simpler, more meaningful model.
The Problem with R-Squared
R-squared increases when more features are added, even if they are not useful.
It does not penalize for adding irrelevant variables.
A model with a higher R-squared may not necessarily be the best model.
Solution: Adjusted R-Squared
To counteract this issue, we use Adjusted R-Squared, which modifies the R-squared value by penalizing unnecessary features.
Formula for Adjusted R-Squared
Adjusted R² = 1 − (1 − R²)(n − 1) / (n − p − 1)
where:
R² = R-squared value
n = Number of observations (data points)
p = Number of independent variables (features)
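For example (illustrative numbers, not from the video): with n = 100 observations, p = 2 features, and R² = 0.90, Adjusted R² = 1 − (0.10 × 99) / 97 ≈ 0.898. If a third, useless feature is added and R² stays at 0.90, Adjusted R² falls to 1 − (0.10 × 99) / 96 ≈ 0.897, even though R² itself did not drop.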
Why Adjusted R-Squared?
Prevents misleading improvement: Unlike R-squared, adjusted R-squared does not always increase when new features are added.
Penalizes unnecessary variables: If an added feature does not improve the model significantly, adjusted R-squared will decrease instead of increasing.
Helps in feature selection: It ensures that only meaningful features contribute to the model’s predictive power.
Effect of Increasing Predictors (p)
As we increase the number of predictors (p), the denominator (n – p – 1) decreases. If the newly added feature is not correlated with the target variable, the numerator (1 – R²)(n – 1) remains large. When dividing a larger numerator by a smaller denominator, the fraction increases, making 1 – (larger fraction) smaller. This leads to a decrease in adjusted R-squared, even though R-squared itself may have increased.
This explains why adjusted R-squared is always less than or equal to R-squared.
Conclusion
While R-squared is a good indicator of model performance, it can be misleading when adding unnecessary features. Adjusted R-squared provides a more reliable evaluation by penalizing irrelevant variables, ensuring the model remains both accurate and interpretable. Additionally, since Adjusted R-Squared accounts for the number of predictors, it will always be less than or equal to R-Squared—an important concept often tested in interviews.
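To see this numerically, here is a minimal sketch (my own synthetic example, not from the video; the feature values and sample size are made up) in which R-squared rises when a useless feature is added while adjusted R-squared falls:

# Minimal sketch (synthetic data): R² never falls on training data when a
# feature is added, but adjusted R² penalizes the useless one.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
bedrooms = rng.integers(1, 6, size=n).astype(float)
price = 50_000 * bedrooms + rng.normal(0, 20_000, size=n)
irrelevant = rng.normal(size=n)  # plays the role of the "gender" feature

def adjusted_r2(r2, n, p):
    # Adjusted R² = 1 − (1 − R²)(n − 1) / (n − p − 1)
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

for X, label in [(bedrooms[:, None], "bedrooms only"),
                 (np.column_stack([bedrooms, irrelevant]), "plus irrelevant")]:
    r2 = LinearRegression().fit(X, price).score(X, price)
    print(label, "R2:", round(r2, 5),
          "Adjusted R2:", round(adjusted_r2(r2, n, X.shape[1]), 5))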
1:15:13 Underfitting should be High Bias, Low Variance
Before coming to this video, learn the basics of high school graphs.
I am a beginner, so I am watching the linear regression part, but I don't get it. What should I do?
thanks sir
Sir, if the value of the weight is already close to zero, then squaring it would make it even closer to 0 instead of increasing it. So wouldn't Ridge Regression be a better algorithm for eliminating unimportant features in this case?
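A rough sketch that may help here (not from the video; the synthetic data and alpha values are made up): Lasso's L1 penalty can drive the weight of an unimportant feature exactly to zero, whereas Ridge's squared L2 penalty only shrinks it toward zero, which is why Lasso is the one usually used for feature elimination.

# Rough sketch (synthetic data, illustrative alphas): compare how Ridge and
# Lasso treat a feature that carries no signal.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))                      # feature 1 is pure noise
y = 3.0 * X[:, 0] + rng.normal(0, 0.1, size=200)   # only feature 0 matters

print("Ridge:", Ridge(alpha=1.0).fit(X, y).coef_)  # both weights stay nonzero
print("Lasso:", Lasso(alpha=0.1).fit(X, y).coef_)  # noise weight shrinks to (typically exactly) 0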
I am currently watching this. I am at 12:00 and I am already loving it.
I searched a lot of channels about machine learning, but the teaching style was not helping me. Here, I really liked his way of explaining everything.
17:55 KNN comes under supervised learning
I appreciate the content. It provides in-depth clarity as well as a connection of each new topic to the previous one, covering why it is needed and what it is. This makes the flow of the content easy to grasp and remember. Amazing!
Progress: 2:02
Will non-technical students fit into this video? How far can B.Com grads catch up with it? Any leads?
Hi
The dataset used for the practical implementation of the regression techniques has been removed from its source (scikit-learn datasets):
File ~\AppData\Local\Programs\Python\Python313\Lib\site-packages\sklearn\datasets\__init__.py:161, in __getattr__(name)
    110     if name == "load_boston":
    111         msg = textwrap.dedent(
    112             """
    113             `load_boston` has been removed from scikit-learn since version 1.2.
    (…)
    159             """
    160         )
--> 161         raise ImportError(msg)
    162     try:
    163         return globals()[name]
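For anyone hitting this error: `load_boston` was removed in scikit-learn 1.2, so the notebook needs a different dataset. One possible workaround (my suggestion, not from the video) is to run the same regression steps on the California housing dataset that ships with scikit-learn:

# Possible substitute dataset (a suggestion, not from the video): Boston was
# removed in scikit-learn 1.2; California housing works for the same workflow.
from sklearn.datasets import fetch_california_housing

housing = fetch_california_housing()   # downloads the data on first call
X, y = housing.data, housing.target    # y = median house value
print(X.shape, housing.feature_names)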