Manually scouring the endless combinations of hyperparameters can be exhausting, often for only a modest (but still meaningful) improvement in your model.
Several techniques have been established through trials that you can implement to get the best set that suits your needs and requirements. Let us look at some strategies:
Grid Search: One of the brute-force methods, grid search is the most basic algorithm for hyperparameter tuning. Essentially, we divide the domain of the hyperparameters into a discrete grid. Then, using cross-validation, we try every possible combination of grid values. The optimal combination of hyperparameter values is the grid point that maximizes the average cross-validation score.
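As a minimal sketch, this is how grid search looks with scikit-learn's `GridSearchCV`; the iris dataset, the SVM model, and the grid values themselves are illustrative choices, not part of the method:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Discrete grid: every combination of C and gamma will be tried (3 x 3 = 9 points).
param_grid = {
    "C": [0.1, 1, 10],
    "gamma": [0.01, 0.1, 1],
}

# 5-fold cross-validation scores each grid point; the best point is the one
# with the highest mean CV score.
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # the winning grid point
print(search.best_score_)   # its mean cross-validation accuracy
```

Note that the cost grows multiplicatively with each added hyperparameter, which is why grid search quickly becomes impractical for large search spaces.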
Random Search: Random search is like grid search, but instead of testing every point in the grid, it evaluates only a random subset of them. The smaller this subset, the faster (but less accurate) the optimization; the larger it is, the more accurate the result, but the more the search resembles a grid search.
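A minimal sketch using scikit-learn's `RandomizedSearchCV`; again, the dataset, model, and sampling distributions are illustrative assumptions. `n_iter` is the size of the random subset, the knob that trades speed for accuracy as described above:

```python
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Instead of a fixed grid, we give distributions to sample candidate values from.
param_distributions = {
    "C": loguniform(1e-2, 1e2),
    "gamma": loguniform(1e-3, 1e1),
}

# n_iter=10 means only 10 random combinations are scored with 5-fold CV,
# however large the underlying search space is.
search = RandomizedSearchCV(
    SVC(), param_distributions, n_iter=10, cv=5, random_state=0
)
search.fit(X, y)

print(search.best_params_)  # best of the 10 sampled combinations
print(search.best_score_)
```

Increasing `n_iter` makes the result more reliable at the cost of more cross-validation runs, mirroring the trade-off between random search and an exhaustive grid.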