Grid Search — Simple Hyper-parameter Search for Machine Learning

Paras Koundal
4 min read · Jul 23, 2021

Exhaustive and Randomized Grid Search

[Header image: Manuscript fragment in Prakrit, on 4 leaves of oblong paper, with colored marginal miniatures of Jaina saints and deities; text written in gold on alternating red and blue backgrounds. From The New York Public Library — https://on.nypl.org/2TwHUeU]

A Machine Learning (ML) project starts with laying out a project outline and ends with deployment. In between lie a multitude of intermediate steps: data collection, data preprocessing, choosing an appropriate ML method, hyper-parameter selection, optimization and interpretation. Because hyper-parameters are not learnt directly from the data during training, selecting and optimizing them consumes a considerable amount of both computational time and human time. Practitioners often rely on manually fiddling with hyper-parameters to fine-tune their models, but as models become bigger and more complex, manual fiddling becomes infeasible. Grid Search assists us by providing a simple automated solution to this problem.

Grid Search tunes a model for the best score by automating the construction of candidate hyper-parameter settings: given a list of values for each hyper-parameter, it forms every possible combination of those values (the Cartesian product of the lists) and evaluates the model on each one.
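To make the "Cartesian product of lists" idea concrete, here is a minimal sketch of how a grid of candidate settings is built; the hyper-parameter names and values are illustrative, not from the article:

```python
from itertools import product

# Hypothetical hyper-parameter value lists (illustrative names and values).
param_grid = {
    "learning_rate": [0.01, 0.1],
    "n_hidden_layers": [1, 2, 3],
}

# Grid search enumerates the Cartesian product of all value lists:
# every learning rate is paired with every layer count.
names = list(param_grid)
combos = [dict(zip(names, values)) for values in product(*param_grid.values())]

print(len(combos))  # 2 values x 3 values = 6 candidate settings
for combo in combos:
    print(combo)
```

Each dictionary in `combos` is one candidate configuration; an exhaustive grid search would train and score a model with every one of them.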

Example:
Say I am training a neural network and want to decide whether to go with a deeper model (more hidden layers), a wider one (more hidden nodes per layer) or something in between.
Standard Way: Try each of them and see which works best.
Issue: Normally when training a model, size (deeper or wider) is not the only hyperparameter…
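The deeper-vs-wider comparison above can be automated with scikit-learn's `GridSearchCV`, which cross-validates every combination in the grid. The specific architectures and learning rates below are illustrative choices, not values from the article:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# Candidate settings: deeper (two layers), wider (more nodes), and in between,
# crossed with two initial learning rates -> 3 x 2 = 6 combinations.
param_grid = {
    "hidden_layer_sizes": [(16,), (64,), (16, 16)],
    "learning_rate_init": [0.001, 0.01],
}

# GridSearchCV trains and scores a model for every combination,
# using 3-fold cross-validation, and keeps the best one.
search = GridSearchCV(
    MLPClassifier(max_iter=500, random_state=0),
    param_grid,
    cv=3,
)
search.fit(X, y)

print(search.best_params_)
print(search.best_score_)
```

When the grid is too large to search exhaustively, `RandomizedSearchCV` takes the same estimator plus `param_distributions` and samples only `n_iter` combinations, which is the randomized variant mentioned in the subtitle.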


Paras Koundal

Hi there. 👋🏾 I am a Particle Physicist working in the field of Cosmic-Ray Physics at the IceCube Neutrino Observatory in Antarctica. Visit paraskoundal.com