Grid Search and Random Search

created:: 2023-09-23T22:05:50
up:: Hyperparameter Optimization
tags:: #🌱 #machine-learning

Grid search and random search are both ways to do Hyperparameter Optimization. For example, suppose you have multiple hyperparameter values to try:

```python
# xgb hyperparameters
parameters = {
    "n_estimators": [10, 50, 100],
    "subsample": [0.6, 0.8, 1],
    "learning_rate": [0.01, 0.1, 0.5, 1],
    "gamma": [0.01, 0.1, 1, 5],
    "colsample_bytree": [0.5, 0.7, 0.9, 1],
    "alpha": [0, 0.1, 0.5]
}
```
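To get a sense of scale, a quick sketch (reusing the same dict) shows how many distinct configs this grid already defines:

```python
from math import prod

# Same hyperparameter grid as above.
parameters = {
    "n_estimators": [10, 50, 100],
    "subsample": [0.6, 0.8, 1],
    "learning_rate": [0.01, 0.1, 0.5, 1],
    "gamma": [0.01, 0.1, 1, 5],
    "colsample_bytree": [0.5, 0.7, 0.9, 1],
    "alpha": [0, 0.1, 0.5],
}

# Total configs = product of the number of candidate values per knob.
n_combinations = prod(len(values) for values in parameters.values())
print(n_combinations)  # 3 * 3 * 4 * 4 * 4 * 3 = 1728
```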

With grid search, you exhaustively try every combination of values and keep the config that gives the best result. This usually takes a long time because we literally evaluate each combination one by one, and the number of combinations grows multiplicatively with every knob you add.
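A minimal sketch of that loop, assuming a smaller grid and a hypothetical `toy_loss` function standing in for a real model evaluation (lower is better):

```python
from itertools import product

# Hypothetical objective standing in for a real training/validation run.
def toy_loss(learning_rate, n_estimators):
    return abs(learning_rate - 0.1) + abs(n_estimators - 100) / 100

grid = {
    "learning_rate": [0.01, 0.1, 0.5, 1],
    "n_estimators": [10, 50, 100],
}

# Grid search: evaluate every combination, keep the best config seen.
best_config, best_loss = None, float("inf")
for values in product(*grid.values()):
    config = dict(zip(grid.keys(), values))
    loss = toy_loss(**config)
    if loss < best_loss:
        best_config, best_loss = config, loss

print(best_config)  # the exhaustive optimum over this grid
```

In practice you would let a library such as scikit-learn's `GridSearchCV` drive this loop with cross-validation instead of a toy objective.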

With random search, you instead sample a random value for each knob, repeat for n trials, and keep the best outcome across all those trials. This takes much less time than grid search because n is fixed by you rather than by the size of the grid.
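The same setup with random sampling instead of exhaustive enumeration; `toy_loss`, the grid, and the number of trials are again illustrative assumptions:

```python
import random

random.seed(0)  # for reproducibility of this sketch

# Hypothetical objective standing in for a real training/validation run.
def toy_loss(learning_rate, n_estimators):
    return abs(learning_rate - 0.1) + abs(n_estimators - 100) / 100

grid = {
    "learning_rate": [0.01, 0.1, 0.5, 1],
    "n_estimators": [10, 50, 100],
}

n_trials = 5  # far fewer evaluations than the full 12-combination grid
best_config, best_loss = None, float("inf")
for _ in range(n_trials):
    # Random search: sample one value per knob, independently each trial.
    config = {name: random.choice(values) for name, values in grid.items()}
    loss = toy_loss(**config)
    if loss < best_loss:
        best_config, best_loss = config, loss
```

Unlike grid search, there is no guarantee the sampled configs include the grid's optimum; you trade completeness for speed.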

These two methods are limited and biased because we don't know whether values outside the predefined grid would perform better than the current best config; both can only ever pick from the candidates we wrote down.


In conclusion, neither method is that good compared to more advanced Hyperparameter Optimization techniques.


Resources