GraphSAGE: an inductive embedding-training method proposed by William Hamilton et al.
The GCN discussed in the previous article is transductive: the embedding of every node is trained directly on a fixed graph. But in many real-world graph scenarios the nodes are updated in real time, so this paper proposes inductive learning. Instead of training an embedding for each node on a static graph, training learns a mapping (an aggregator) from a node's neighbors to its embedding, so that the embedding of a new node can be obtained from its neighbor relations alone.
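A minimal sketch of one such aggregator, in PyTorch with a mean aggregator (the class and variable names are illustrative, not the paper's code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SAGEMeanLayer(nn.Module):
    """One GraphSAGE-style layer with a mean aggregator (illustrative sketch)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        # applied to the concatenation of self features and aggregated neighbors
        self.lin = nn.Linear(2 * in_dim, out_dim)

    def forward(self, h, neighbor_idx):
        # h: (num_nodes, in_dim) node features
        # neighbor_idx: (num_nodes, k) indices of k sampled neighbors per node
        h_neigh = h[neighbor_idx].mean(dim=1)             # aggregate sampled neighbors
        h_new = self.lin(torch.cat([h, h_neigh], dim=1))  # combine self + neighborhood
        return F.normalize(F.relu(h_new), p=2, dim=1)     # L2-normalize, as in the paper
```

Because the layer only needs a node's features and its sampled neighbors, it applies unchanged to nodes never seen during training.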
The training loss for unsupervised learning is pairwise: the embeddings of two connected nodes should be close, and the embeddings of two unconnected nodes should be far apart (similarity is measured by the inner product of the embeddings).
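Concretely, the unsupervised loss in the GraphSAGE paper, for a node $u$, a node $v$ that co-occurs with it on a random walk, and $Q$ negative samples drawn from a noise distribution $P_n$, is

$$J_{\mathcal{G}}(\mathbf{z}_u) = -\log\big(\sigma(\mathbf{z}_u^{\top}\mathbf{z}_v)\big) - Q\cdot\mathbb{E}_{v_n\sim P_n(v)}\log\big(\sigma(-\mathbf{z}_u^{\top}\mathbf{z}_{v_n})\big)$$

The first term pulls co-occurring nodes together; the second pushes random negatives apart.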
The training dataset contains feature information for each node, and the nodes are trained from these features. What if there is no feature information? Is a node then represented only by an index (ID)?
The overall framework of this paper still follows GraphSAGE: sample a certain number of neighbors and fuse them. The difference from GraphSAGE is that neighbors are not sampled uniformly at random; an importance-based sampling is used instead. The main innovations of this paper are as follows:
The overall framework of this paper is actually very classic:
This article is an improvement on NGCF above. It finds that the feature transformations (W1, W2) and the nonlinear activation σ(·) in NGCF not only increase training difficulty but also reduce accuracy. The main reasons are as follows: GCN was originally proposed for node classification on attributed graphs, where every node has rich attributes as input features; in CF's user-item interaction graph, each node (a user or an item) is described only by a one-hot ID, which carries no concrete semantics beyond being an identifier. In this case, with ID embeddings as input, performing multiple layers of nonlinear feature transformation (the key to the success of modern neural networks) brings no benefit and instead makes the model harder to train.
Simplifying the forward propagation into a light graph convolution (LGC), shown below;
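The LGC propagation rule from the LightGCN paper keeps only the normalized neighbor aggregation, and the final embedding is a weighted sum over the layer outputs:

$$\mathbf{e}_u^{(k+1)} = \sum_{i\in\mathcal{N}_u}\frac{1}{\sqrt{|\mathcal{N}_u|}\sqrt{|\mathcal{N}_i|}}\,\mathbf{e}_i^{(k)},\qquad \mathbf{e}_i^{(k+1)} = \sum_{u\in\mathcal{N}_i}\frac{1}{\sqrt{|\mathcal{N}_i|}\sqrt{|\mathcal{N}_u|}}\,\mathbf{e}_u^{(k)}$$

$$\mathbf{e}_u = \sum_{k=0}^{K}\alpha_k\,\mathbf{e}_u^{(k)},\qquad \alpha_k = \frac{1}{K+1}$$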
Note: no self-connection (self-loop) is added during forward propagation, because the weighted sum over layer embeddings already subsumes the effect of self-connections. The proof is as follows:
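Ignoring the degree-normalization terms for simplicity (as the LightGCN paper does in this argument), propagating $K$ layers on the self-loop-augmented adjacency matrix expands by the binomial theorem:

$$\mathbf{E}^{(K)} = (\mathbf{A}+\mathbf{I})^{K}\,\mathbf{E}^{(0)} = \sum_{k=0}^{K}\binom{K}{k}\,\mathbf{A}^{k}\,\mathbf{E}^{(0)}$$

Each term $\mathbf{A}^{k}\mathbf{E}^{(0)}$ is exactly the embedding after $k$ LGC propagation steps, so the result is a weighted sum of the per-layer LGC embeddings.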
Therefore, adding self-loops to A and propagating embeddings on it is essentially equivalent to a weighted sum of the embeddings propagated at each LGC layer.
This paper aims to solve two problems:
Therefore, MGNN-SPred jointly considers the target-behavior and auxiliary-behavior sequences and explores global item-to-item relations to achieve accurate prediction.
The algorithm framework of this paper:
Overall algorithm:
Item representation learning:
Each node starts from a one-hot representation;
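A minimal sketch of multi-relational neighbor aggregation for the item graph (the structure is illustrative; the paper's exact aggregation may differ):

```python
import torch
import torch.nn as nn

class RelationalAgg(nn.Module):
    """Aggregate neighbor embeddings per relation type (e.g., click/buy,
    forward/backward edges) and combine with the node's own embedding.
    Illustrative sketch, not the paper's exact formulation."""
    def __init__(self, dim, num_relations):
        super().__init__()
        self.combine = nn.Linear((num_relations + 1) * dim, dim)

    def forward(self, h_self, neigh_by_relation):
        # h_self: (num_nodes, dim)
        # neigh_by_relation: one (num_nodes, k, dim) tensor per relation type
        pooled = [n.mean(dim=1) for n in neigh_by_relation]  # mean per relation
        return self.combine(torch.cat([h_self] + pooled, dim=-1))
```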
Sequence representation learning:
The authors find that simple mean pooling achieves performance comparable to an attention mechanism while keeping complexity low.
Naturally, the contribution of the auxiliary-behavior sequence to the next-item prediction differs across cases, so a gating mechanism is designed to compute a relative importance weight:
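A minimal sketch of such a gate (the parameterization is an assumption for illustration, not the paper's exact equations): mean-pool each sequence, compute a scalar gate from the concatenation, and mix the two representations.

```python
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Fuse target- and auxiliary-behavior sequence representations with a
    learned scalar gate (illustrative sketch)."""
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, 1)

    def forward(self, target_seq, aux_seq):
        # target_seq: (batch, len_t, dim); aux_seq: (batch, len_a, dim)
        s_t = target_seq.mean(dim=1)   # mean pooling over target-behavior items
        s_a = aux_seq.mean(dim=1)      # mean pooling over auxiliary-behavior items
        alpha = torch.sigmoid(self.gate(torch.cat([s_t, s_a], dim=-1)))
        return alpha * s_t + (1 - alpha) * s_a  # relative-importance mix
```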
The ground truth is represented as a one-hot vector.
This paper aims to solve the problem of sequential recommendation. The main contributions are as follows:
First, a sliding-window strategy extracts subsequences from each sequence, and edges are then added within each subsequence, as shown in the figure below.
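A minimal sketch of this construction (the window size and the rule of connecting every ordered pair inside a window are assumptions for illustration):

```python
from collections import defaultdict

def build_item_graph(sequence, window=3):
    """Slide a window over an item sequence and add a directed edge from
    every earlier item to every later item within the window; repeated
    pairs accumulate as edge weights. Illustrative, not the paper's code."""
    edges = defaultdict(int)
    for start in range(len(sequence) - window + 1):
        sub = sequence[start:start + window]
        for i in range(len(sub)):
            for j in range(i + 1, len(sub)):
                edges[(sub[i], sub[j])] += 1
    return edges

# Example: one user's item sequence
print(build_item_graph([5, 9, 5, 2, 7], window=3))
```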
An external memory unit stores users' long-term interests, which change over time; the items a user has interacted with are ordered by interaction time.
Generating the query embedding with a multi-dimensional attention model:
where the sinusoidal positional encoding function maps each item's position to a position embedding.
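Assuming the standard Transformer form of the sinusoidal encoding, position $pos$ and dimension index $i$ (of $d$ total dimensions) map to

$$PE_{(pos,\,2i)}=\sin\!\left(\frac{pos}{10000^{2i/d}}\right),\qquad PE_{(pos,\,2i+1)}=\cos\!\left(\frac{pos}{10000^{2i/d}}\right)$$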
Then the memory unit is read and updated:
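A generic sketch of an attention-based memory read (it only illustrates the mechanism; the paper's exact read/write equations are not reproduced here):

```python
import torch
import torch.nn.functional as F

def memory_read(query, memory):
    # query: (batch, dim) query embedding; memory: (slots, dim) long-term slots
    scores = query @ memory.t()        # similarity between query and each slot
    attn = F.softmax(scores, dim=-1)   # attention weights over memory slots
    return attn @ memory               # weighted read of long-term interest
```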
The superscript c denotes the representation that integrates long-term and short-term interests.
This models the relationship between items the user interacted with in the short term and the items to be predicted.