This weekend I'm back studying the Decision Trees chapter of Hands-On Machine Learning, after taking about a month off.
On the Subject
Decision Trees
Decision Trees are satisfying to study. The logic is clean. You can visualize exactly how predictions are made. It feels structured and practical.
The Core Loop
- Split the data — find the feature and threshold that best separate the classes
- Measure impurity — Gini or entropy tells you how mixed a node is
- Control depth — max_depth and min_samples_split / min_samples_leaf are your primary levers
- Avoid overfitting — a tree that memorizes the training data is useless on new data
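The first two steps of that loop fit in a few lines of plain Python. This is a minimal sketch, not how scikit-learn actually implements it: it computes Gini impurity and scans every candidate threshold on a toy one-dimensional dataset, picking the split with the lowest weighted child impurity.

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

def best_split(xs, ys):
    """Scan every candidate threshold; return (threshold, weighted impurity)."""
    best_t, best_score = None, float("inf")
    n = len(ys)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue  # degenerate split, skip
        score = len(left) / n * gini(left) + len(right) / n * gini(right)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# Toy data: two well-separated classes along one feature
xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = ["a", "a", "a", "b", "b", "b"]
print(best_split(xs, ys))  # (3.0, 0.0): splitting at 3.0 yields two pure children
```

A real tree repeats this greedily on each child node until a stopping condition (max_depth, min_samples_leaf) kicks in, which is exactly where the overfitting levers come from.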
The Plan
Steady Progress
My goal is simple: finish this book by the end of the month and keep moving forward in my ML studies. No overthinking it. Just steady progress.
Fueled by cold brew. The tree will be built.
