
  • Deep Learning with Keras
  • Antonio Gulli Sujit Pal

Hyperparameters tuning

The preceding experiments gave a sense of the opportunities for fine-tuning a net. However, what works for this example does not necessarily work for other examples. For a given net, there are indeed multiple parameters that can be optimized (such as the number of hidden neurons, BATCH_SIZE, the number of epochs, and many more, depending on the complexity of the net itself).

Hyperparameter tuning is the process of finding the optimal combination of those parameters that minimizes the cost function. The key idea is that if we have n parameters, then we can imagine that they define a space with n dimensions, and the goal is to find the point in this space that corresponds to an optimal value of the cost function. One way to achieve this goal is to create a grid in this space and systematically check the value assumed by the cost function at each grid vertex. In other words, the parameters are divided into buckets, and different combinations of values are checked via a brute-force approach.
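A minimal sketch of this brute-force grid search might look as follows. The parameter names and candidate values are illustrative assumptions, and the evaluate function is a stand-in for the real step of training the net and returning its validation loss:

```python
from itertools import product

# Hypothetical buckets of candidate values for each hyperparameter.
param_grid = {
    "n_hidden": [64, 128, 256],
    "batch_size": [64, 128],
    "epochs": [20, 50],
}

def evaluate(params):
    # Stand-in for "build the net, train it, return the validation loss".
    # In a real Keras experiment this would call model.fit() and model.evaluate().
    return ((params["n_hidden"] - 128) ** 2 / 1e4
            + params["batch_size"] / 1e3
            + 1.0 / params["epochs"])

def grid_search(param_grid, evaluate):
    """Check every vertex of the hyperparameter grid by brute force."""
    names = list(param_grid)
    best_params, best_loss = None, float("inf")
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        loss = evaluate(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best, loss = grid_search(param_grid, evaluate)
print(best, loss)
```

Note that the number of grid vertices is the product of the bucket sizes (here 3 × 2 × 2 = 12 trainings), which is why this approach quickly becomes expensive as the number of hyperparameters grows.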
