Random Forest
- A deep tree with many leaves might overfit, while a shallow tree with few leaves might underfit.
→ A Random Forest is one model that can lead to better performance than a single decision tree.
- A Random Forest uses many trees, and it makes a prediction by averaging the predictions of each component tree.
- One of its best features is that it generally works reasonably well even without tuning, such as parameter tuning.
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

# train_X, val_X, train_y, val_y come from the train/validation split in the earlier lessons
forest_model = RandomForestRegressor(random_state=1)
forest_model.fit(train_X, train_y)
melb_preds = forest_model.predict(val_X)
print(mean_absolute_error(val_y, melb_preds))
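The snippet above depends on `train_X`, `val_X`, `train_y`, and `val_y` from the tutorial's Melbourne housing data, which is not loaded here. As a self-contained sketch of the averaging idea, the following compares a single decision tree to a random forest with default parameters on synthetic data (the synthetic dataset is an assumption for illustration, not part of the tutorial):

```python
# Sketch: single decision tree vs. random forest on synthetic regression data.
# The forest averages many trees, which typically reduces variance and lowers MAE.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Synthetic stand-in for the Melbourne housing data used in the tutorial
X, y = make_regression(n_samples=1000, n_features=10, noise=10.0, random_state=0)
train_X, val_X, train_y, val_y = train_test_split(X, y, random_state=1)

tree = DecisionTreeRegressor(random_state=1).fit(train_X, train_y)
forest = RandomForestRegressor(random_state=1).fit(train_X, train_y)

tree_mae = mean_absolute_error(val_y, tree.predict(val_X))
forest_mae = mean_absolute_error(val_y, forest.predict(val_X))
print(f"single tree MAE:   {tree_mae:.2f}")
print(f"random forest MAE: {forest_mae:.2f}")
```

With default settings the forest usually beats the single tree on held-out data, which matches the point above that random forests work reasonably well without tuning.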