Grid Search — Simple Hyper-parameter Search for Machine Learning
Exhaustive and Randomized Grid Search
A Machine Learning (ML) project starts with laying out a project outline and ends with deployment. In between come a multitude of intermediate steps such as data collection, data preprocessing, choosing an appropriate ML method, hyper-parameter selection, optimization, and interpretation. Because hyper-parameters are not learnt directly within the model, hyper-parameter selection and optimization consumes a considerable amount of computational time as well as human time. Even so, practitioners often rely on manually fiddling with hyper-parameters to fine-tune their models. As models become bigger and more complex, manual fiddling becomes infeasible. Grid Search helps by providing a simple automated solution to this problem.
Grid Search optimizes the model for the best accuracy by automating the process of constructing every combination of hyper-parameter values from individual lists of candidates for each hyper-parameter, training and evaluating the model on each combination.
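To make this concrete, here is a minimal sketch using scikit-learn's GridSearchCV. The estimator (an SVC), the iris dataset, and the candidate values are illustrative assumptions, not something prescribed by the text.

```python
# Minimal Grid Search sketch with scikit-learn (illustrative assumptions:
# SVC estimator, iris dataset, and the candidate values below).
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Individual lists of candidate values for each hyper-parameter.
param_grid = {
    "C": [0.1, 1, 10],
    "kernel": ["linear", "rbf"],
}

# GridSearchCV forms every combination (3 x 2 = 6 here), cross-validates
# each one, and keeps the combination with the best score.
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # e.g. {'C': 10, 'kernel': 'rbf'}
print(search.best_score_)
```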
Example:
Say I am training a neural network and want to decide whether to go with a deeper model (more hidden layers), a wider one (more hidden nodes per layer), or something intermediate.
Standard Way: Try out each of them and see which works best.
Issue: Normally, when training a model, size (deeper or wider) is not the only hyper-parameter…
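As a rough sketch of how Grid Search handles this, the snippet below searches over network size together with a second hyper-parameter (the initial learning rate). The specific architectures, dataset, and values are assumptions made for illustration, not taken from the example above.

```python
# Sketch: searching over network "size" and the learning rate at the same
# time, so every combination is tried automatically instead of by hand.
# (The dataset, architectures, and values are illustrative assumptions.)
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

param_grid = {
    # wider vs. deeper vs. intermediate architectures
    "hidden_layer_sizes": [(256,), (64, 64, 64), (128, 64)],
    # a second hyper-parameter that interacts with the choice of size
    "learning_rate_init": [1e-3, 1e-2],
}

search = GridSearchCV(MLPClassifier(max_iter=200), param_grid, cv=3)
search.fit(X, y)

print(search.best_params_)
```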