
Algorithm

Now, we will better understand the Gaussian prototypical network by going through it step by step:

  1. Let's say we have a dataset, D = {(x1, y1), (x2, y2), ..., (xi, yi)}, where x is the feature and y is the label. Let's say we have a binary label, which means we have only two classes, 0 and 1. We will sample data points at random, without replacement, from each of the classes in our dataset, D, and create our support set, S.
  2. Similarly, we sample data points at random per class and create the query set, Q.
  3. We will pass the support set to our embedding function, f(). The embedding function will generate the embeddings for our support set, along with the covariance matrix.
  4. We calculate the inverse of the covariance matrix.
  5. We compute the prototype of each class in the support set as follows:

$$p^{c} = \frac{\sum_{i} s_{i}^{c} \circ x_{i}^{c}}{\sum_{i} s_{i}^{c}}$$

In this equation, $s_{i}^{c} = \operatorname{diag}(S_{i}^{c})$ is the diagonal of the inverse covariance matrix, $x_{i}^{c}$ denotes the embeddings of the support set, and the superscript $c$ denotes the class. A minimal code sketch of steps 1 to 5 is given below.
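The following is a minimal NumPy sketch of steps 1 to 5, assuming a toy two-class dataset and a stand-in embedding function. The random projection in embed(), the feature and embedding dimensions, and the diagonal covariance estimate are illustrative assumptions, not the embedding network used in this chapter; they only show how the covariance-weighted prototype is formed.

```python
import numpy as np

np.random.seed(0)

# Toy dataset D: two classes (0 and 1), 10 samples each, 4 features each.
X = np.random.randn(20, 4)
y = np.array([0] * 10 + [1] * 10)

def sample_episode(X, y, n_support=5, n_query=3):
    """Steps 1-2: sample support and query points per class without replacement."""
    support, query = {}, {}
    for c in np.unique(y):
        idx = np.random.permutation(np.where(y == c)[0])
        support[c] = X[idx[:n_support]]
        query[c] = X[idx[n_support:n_support + n_query]]
    return support, query

def embed(x):
    """Step 3: stand-in for the embedding function f().

    Returns an embedding and a positive diagonal covariance estimate per point;
    in the real network both come from learned layers.
    """
    W = np.random.RandomState(1).randn(4, 8)                      # fixed random projection
    embedding = np.tanh(x @ W)                                    # (n, 8) embeddings
    cov_diag = np.exp(np.random.RandomState(2).randn(len(x), 8))  # positive diagonal covariance
    return embedding, cov_diag

support, query = sample_episode(X, y)

prototypes, class_inv_cov = {}, {}
for c, x_c in support.items():
    emb, cov_diag = embed(x_c)        # step 3: embeddings and covariance diagonals
    s = 1.0 / cov_diag                # step 4: diagonal of the inverse covariance, s_i^c
    # Step 5: covariance-weighted prototype, p^c = sum_i(s_i^c * x_i^c) / sum_i(s_i^c)
    prototypes[c] = (s * emb).sum(axis=0) / s.sum(axis=0)
    # Assumption for later use: aggregate the class inverse covariance as the sum of the s_i^c
    class_inv_cov[c] = s.sum(axis=0)

print({c: p.shape for c, p in prototypes.items()})   # one 8-dimensional prototype per class
```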

  6. After computing the prototype of each class in the support set, we learn the embeddings for the query set, Q. Let's say x' is the embedding of a query point.
  7. We calculate the distance of the query point embedding to the class prototypes as follows:

$$d_{c}(x') = (x' - p^{c})^{\top} S^{c} (x' - p^{c})$$

Here, $S^{c}$ is the inverse covariance matrix of class $c$.

  8. After calculating the distance between the class prototypes and the query point embedding, we predict the class of the query point as the class that has the minimum distance, as follows (steps 6 to 8 are also shown in the sketch after this list):

$$\hat{y} = \underset{c}{\operatorname{argmin}} \, d_{c}(x')$$
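Continuing the sketch above (it reuses embed(), query, prototypes, and class_inv_cov from it), here is a sketch of steps 6 to 8. Using the summed per-point inverse covariances as the class inverse covariance $S^{c}$ is an assumption made only for illustration.

```python
# Continues the sketch above: steps 6 to 8 for the query points of class 0.
x_query = query[0]                       # query points sampled from class 0
q_emb, _ = embed(x_query)                # step 6: embeddings of the query points

predictions = []
for x_prime in q_emb:
    distances = {}
    for c, p_c in prototypes.items():
        diff = x_prime - p_c
        # Step 7: distance weighted by the (diagonal) class inverse covariance,
        # d_c(x') = (x' - p^c)^T S^c (x' - p^c)
        distances[c] = np.sum(class_inv_cov[c] * diff * diff)
    # Step 8: predict the class with the minimum distance
    predictions.append(min(distances, key=distances.get))

print(predictions)   # ideally [0, 0, 0]; with the random toy embedding this is not guaranteed
```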