- Hands-On Meta Learning with Python
- Sudharsan Ravichandiran
Algorithm
Now, we will better understand the Gaussian prototypical network by going through it step by step:
- Let's say we have a dataset, D = {(x1, y1), (x2, y2), ..., (xi, yi)}, where x is the feature and y is the label. Let's say we have binary labels, which means we have only two classes, 0 and 1. We will sample data points at random, without replacement, from each of the classes in our dataset, D, and create our support set, S.
- Similarly, we sample data points at random from each class and create the query set, Q.
- We will pass the support set to our embedding function, f(). The embedding function will generate the embeddings for our support set, along with a covariance matrix for each data point.
- We calculate the inverse of the covariance matrix.
- We compute the prototype of each class in the support set as follows:

$$p^c = \frac{\sum_i s_i^c \circ x_i^c}{\sum_i s_i^c}$$

In this equation, $s_i^c$ is the diagonal of the inverse covariance matrix, $x_i^c$ denotes the embeddings of the support set, and the superscript $c$ denotes the class.
- After computing the prototype of each class in the support set, we compute the embeddings for the query set, Q. Let's say x' is the embedding of a query point.
- We calculate the distance of the query point embeddings to the class prototypes as follows:

$$d_c(x') = (x' - p^c)^\top S^c (x' - p^c)$$

Here, $S^c$ is the class-level inverse covariance, obtained by summing the per-point inverse covariances of class $c$.
- After calculating the distance between the class prototypes and the query embeddings, we predict the class of a query point as the class with the minimum distance, as follows:

$$\hat{y} = \arg\min_c d_c(x')$$
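The steps above can be sketched in NumPy. This is a minimal illustration, not the book's implementation: the embedding function f() is stood in for by a hypothetical random linear map, and the per-point diagonal inverse covariances are simulated with random positive values rather than predicted by a trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(x, W):
    """Toy stand-in for the embedding function f(): returns embeddings and
    per-point diagonal inverse covariances (here just random positive values)."""
    z = x @ W                                # embeddings
    s = np.exp(rng.normal(size=z.shape))     # diagonal inverse covariance, > 0
    return z, s

def prototypes(z, s, y, n_classes):
    """Inverse-covariance-weighted class prototypes:
    p^c = sum_i s_i^c * x_i^c / sum_i s_i^c (componentwise)."""
    protos = np.zeros((n_classes, z.shape[1]))
    inv_cov = np.zeros((n_classes, z.shape[1]))
    for c in range(n_classes):
        mask = (y == c)
        protos[c] = (s[mask] * z[mask]).sum(0) / s[mask].sum(0)
        inv_cov[c] = s[mask].sum(0)          # class-level inverse covariance
    return protos, inv_cov

def predict(zq, protos, inv_cov):
    """Distance (x' - p^c)^T S^c (x' - p^c) with diagonal S^c,
    then pick the class with the minimum distance."""
    diff = zq[:, None, :] - protos[None, :, :]        # shape (Q, C, D)
    d = (diff ** 2 * inv_cov[None, :, :]).sum(-1)     # weighted squared distance
    return d.argmin(1)

# Toy binary episode: 10 support points and 4 query points in 5-D.
x_support = rng.normal(size=(10, 5))
y_support = np.array([0] * 5 + [1] * 5)
x_query = rng.normal(size=(4, 5))

W = rng.normal(size=(5, 3))                  # stand-in for learned weights
z_s, s_s = embed(x_support, W)
z_q, _ = embed(x_query, W)

protos, inv_cov = prototypes(z_s, s_s, y_support, n_classes=2)
print(predict(z_q, protos, inv_cov))         # class index per query point
```

In a real Gaussian prototypical network, f() is a neural network trained episodically, and the raw covariance output is passed through a positivity-enforcing transform (for example, an exponential) before being inverted.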