
Tuning the model hyperparameters

Now that we've trained an MLP and a six-layer deep neural network on the problem, we're ready to tune and optimize model hyperparameters.

We will discuss model tuning in depth in Chapter 6, Hyperparameter Optimization. There are a variety of strategies that you can use to choose the best parameters for your model. As you've probably noticed, there are many possible parameters and hyperparameters that we could still optimize.

If you wanted to fully tune this model, you would do the following (a code sketch combining these knobs follows the list):

  • Experiment with the number of hidden layers. It appears that five might be too many, and one might not be enough.
  • Experiment with the number of neurons in each hidden layer, relative to the number of layers.
  • Experiment with adding dropout or regularization.
  • Attempt to further reduce model error by trying SGD or RMSProp instead of Adam, or by using a different learning rate for Adam.
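
To make these experiments concrete, here is a minimal sketch of a parameterized model-building function, assuming a Keras regression setup; the `build_model` name and the `n_features` value are hypothetical stand-ins for this chapter's actual code and data:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense, Dropout
from tensorflow.keras.optimizers import Adam

def build_model(hidden_layers=3, units=32, dropout_rate=0.2,
                learning_rate=0.001, n_features=10):
    """Build an MLP whose depth, width, dropout, and learning rate are tunable."""
    model = Sequential()
    model.add(Input(shape=(n_features,)))
    for _ in range(hidden_layers):
        model.add(Dense(units, activation="relu"))
        model.add(Dropout(dropout_rate))  # dropout after each hidden layer
    model.add(Dense(1))  # single linear output for regression
    model.compile(optimizer=Adam(learning_rate=learning_rate),
                  loss="mean_squared_error")
    return model

# For example, compare a shallower, wider network against the default:
# model = build_model(hidden_layers=2, units=64, learning_rate=0.0005)
```

Wrapping the architecture in a function like this makes each hyperparameter an argument, so you can sweep configurations by hand or hand the function to a search tool later.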

Deep neural networks have so many moving parts that getting to optimal can feel exhausting. You'll have to decide whether your model is good enough.
