
Algorithm

Now, we will better understand the Gaussian prototypical network by going through it step by step:

  1. Let's say we have a dataset, D = {(x1, y1), (x2, y2), ... (xi, yi)}, where x is the feature and y is the label. Let's say we have binary labels, which means we have only two classes, 0 and 1. We will sample data points at random, without replacement, from each of the classes in our dataset, D, and create our support set, S.
  2. Similarly, we sample data points at random per class and create the query set, Q.
  3. We will pass the support set to our embedding function, f(). The embedding function will generate the embeddings for our support set, along with the covariance matrix.
  4. We calculate the inverse of the covariance matrix.
  5. We compute the prototype of each class in the support set as follows (steps 3 to 5 are also sketched in code after this equation):

$$p^c = \frac{\sum_{i} s_i^c \circ x_i^c}{\sum_{i} s_i^c}$$

In this equation, $s_i^c$ is the diagonal of the inverse covariance matrix, $x_i^c$ denotes the embeddings of the support set, and the superscript c denotes the class.
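The following is a minimal NumPy sketch of steps 3 to 5, assuming a diagonal covariance. The `embed` function is a hypothetical stand-in for the trained embedding network f(), and both the softplus-plus-one mapping of the raw covariance output and the summed class-level inverse covariance are assumptions made for illustration, not necessarily the exact choices used later in this chapter:

```python
import numpy as np

def embed(x, embedding_dim=4):
    # Hypothetical stand-in for the trained embedding network f():
    # a fixed random projection that returns an embedding vector and a
    # raw (pre-activation) output used to build the diagonal covariance.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(embedding_dim, x.size))
    V = rng.normal(size=(embedding_dim, x.size))
    return W @ x.ravel(), V @ x.ravel()

def class_prototype(support_x):
    # Steps 3-5 for a single class: embed every support point, build the
    # diagonal of the inverse covariance, and take the weighted mean.
    embeddings, inv_cov_diags = [], []
    for x in support_x:
        emb, raw = embed(x)
        # Map the raw output to a positive diagonal inverse covariance
        # (softplus + 1 is an assumption; other positive maps also work).
        s = 1.0 + np.logaddexp(0.0, raw)
        embeddings.append(emb)
        inv_cov_diags.append(s)
    embeddings = np.stack(embeddings)        # shape (k, d)
    inv_cov_diags = np.stack(inv_cov_diags)  # shape (k, d)
    # Step 5: prototype p^c = sum_i(s_i^c * x_i^c) / sum_i(s_i^c).
    prototype = (inv_cov_diags * embeddings).sum(axis=0) / inv_cov_diags.sum(axis=0)
    # Class-level inverse covariance S^c, here aggregated by summing over
    # the support set (an assumption about the aggregation rule).
    class_inv_cov = inv_cov_diags.sum(axis=0)
    return prototype, class_inv_cov
```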

  6. After computing the prototype of each class in the support set, we learn the embeddings for the query set, Q. Let's say x' is the embedding of a query point.
  7. We calculate the distance of the query point embedding to the class prototypes as follows:

$$d_c(x') = (x' - p^c)^{\top} S^c (x' - p^c)$$

Here, $S^c$ is the class-level diagonal inverse covariance obtained from the support set of class c.

  8. After calculating the distance between the class prototypes and the query embedding, we predict the class of the query point as the class that has the minimum distance (see the code sketch after this list), as follows:

$$\hat{y} = \arg\min_{c} d_c(x')$$
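Continuing the sketch above, the query-side steps 6 to 8 might look as follows. It reuses the hypothetical `embed` and `class_prototype` helpers from the previous block and, because $S^c$ is diagonal there, the quadratic form reduces to an elementwise weighted sum of squared differences:

```python
def predict(query_x, prototypes, class_inv_covs):
    # Steps 6-8: embed each query point and assign it to the nearest prototype.
    predictions = []
    for x in query_x:
        x_emb, _ = embed(x)  # step 6: query embedding x'
        distances = []
        for p, s in zip(prototypes, class_inv_covs):
            diff = x_emb - p
            # Step 7: covariance-weighted squared distance
            # (x' - p^c)^T S^c (x' - p^c), with S^c diagonal it is sum(s * diff^2).
            distances.append(np.sum(s * diff * diff))
        # Step 8: the predicted class is the one with the minimum distance.
        predictions.append(int(np.argmin(distances)))
    return predictions

# Toy usage with the two classes (0 and 1) from the walkthrough.
rng = np.random.default_rng(1)
support = {c: [rng.normal(loc=c, size=8) for _ in range(5)] for c in (0, 1)}
query = [rng.normal(loc=1.0, size=8) for _ in range(3)]
protos, inv_covs = zip(*(class_prototype(support[c]) for c in (0, 1)))
print(predict(query, list(protos), list(inv_covs)))
```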